WorldWideScience

Sample records for ideographic measures based

  1. Holocaust Cartoons as Ideographs

    Directory of Open Access Journals (Sweden)

    Mahdiyeh Meidani

    2015-07-01

    The Holocaust cartoon competition of 2006 in Iran, as an instance of social controversy, has the potential to raise social and political arguments over various international and global issues. Through using McGee’s theory of ideograph and Edwards and Winkler’s theory of representative form, I identify the ideographs used in these cartoons and argue that the Holocaust cartoons function ideographically to portray Jews, Judaism, Palestine, Israel, Zionism, and the Holocaust. I explain how these controversial images function as representative characters and representative anecdotes and create different ideological interpretations of the Holocaust and associated issues, such as Israel–Palestine conflicts and Western freedom of speech. I argue that the cartoons suggest a connection between Nazism and Zionism, or the Nazi and Israeli regimes, by juxtaposing various elements and situations. I explain that the cartoons anecdotally refer to the Holocaust and represent it as a myth or hoax used by Jews/Zionists to justify the creation of the nation of Israel.

  2. Ibie'ka (Ideographs): Developing Visual Signs for Expressing ...

    African Journals Online (AJOL)

    Visual signs (ideographs) are artistic codified expressions that promote social and cultural integration. They are normally based on popular conventions which over a period of time become generally accepted. In pre-western literate Africa, apart from oral communication, visual codes were employed within social groups.

  3. Queering marriage: an ideographic interrogation of heteronormative subjectivity.

    Science.gov (United States)

    Grindstaff, Davin

    2003-01-01

    Recent debates on same-sex marriage mark the institution, practice, and concept of marriage as a significant site of power and resistance within American culture. Adopting Michel Foucault's conception of "discipline," this essay examines how marriage discourse reinforces heteronormative power relations through its rhetorical constitution of gay male identity. Supplementing "ideographic" critique with Judith Butler's theory of performative speech acts enables us to better interrogate and resist these operations of power. This essay maps the contemporary scene of heteronormative power and resistance through two rhetorical performances of gay male identity. The marriage debates, in the first instance, demonstrate how a conventional desire for masculine agency influences the heteronormative production of gay male identity. In the second instance, gay male SM [sadomasochism] performs a concept of "relational agency," which potentially resists heteronormativity.

  4. Stroop phenomena in the Japanese language: the case of ideographic characters (kanji) and syllabic characters (kana).

    Science.gov (United States)

    Morikawa, Y

    1981-08-01

    Utilizing a unique feature of the Japanese language--that besides two syllabic orthographies, which have identical pronunciations, words with the same pronunciation may also be written in an orthography composed of ideographic characters--we have conducted an investigation of Stroop phenomena. The fact that pronunciations of the three Japanese orthographies are identical means that, if there are any differences between them in the Stroop phenomena observed, we can place the locus of this interference effect in the perceptual process. Five color names were written in the ideographic characters (kanji) and the two syllabic orthographies (hiragana and katakana). Color-congruent cards and incongruent cards were utilized in a color-naming task and a word-reading task. Mean required times for the color-naming condition and the word-reading conditions were compared with those for control conditions. Stroop phenomena were observed in both ideographic and syllabic orthographies. Significant differences in mean required times were observed between the ideographic and syllabic orthographies but not between the two syllabic orthographies. Interferences in comparisons of Japanese orthographies and color patch control conditions were much smaller than in the case of Stroop's (1935) experiment. A "Reverse Stroop Phenomenon" was observed only in the case of kanji on incongruent cards in the word-reading condition. The results support the hypothesis that both ideographic characters (in this case, kanji) and colors are processed in a parallel fashion in the non-dominant right cerebral hemisphere, while syllabic or phonetic characters are processed in the dominant left cerebral hemisphere.

  5. Julie Taymor’s Ideographs in Her Adaptations of Shakespeare’s Titus Andronicus and The Tempest

    Directory of Open Access Journals (Sweden)

    Kristijan Stakor

    2017-12-01

    The aim of this paper is to show the unique visual style in director Julie Taymor’s vividly filmed adaptations of Shakespeare’s plays, The Tempest and Titus Andronicus, by concentrating on the visual elements called ideographs or ideograms. By definition, these ideographs are usually symbols that represent a particular idea or a thing rather than a word. I will argue that ideographs are also present in her films, Titus (1999) and The Tempest (2010), and that Taymor’s vast theatrical knowledge adds layers of meanings into filmed sequences. Shakespeare’s plays, burdened with foul deeds of war, revenge, struggle, and witchcraft, almost invite the director not to settle with the ordinary, but to use contrasting colors and costumes from opposing eras, letting her show us his world through her own prism. Therefore, these adaptations are exceptional not only because of Taymor’s untypical use of familiar historical elements in production design but also because of her use of nonlinguistic devices in order to both express admiration for and criticize the situations presented in the original text. The paper also argues that Taymor’s films should be viewed as cross-cultural and intercultural adaptations, rather than American adaptations, because she uses Eastern theatrical elements and European heritage in order to underline the complexity and extravagant nature of events depicted in the plays.

  6. Dissociation of writing processes: functional magnetic resonance imaging during writing of Japanese ideographic characters.

    Science.gov (United States)

    Matsuo, K; Nakai, T; Kato, C; Moriya, T; Isoda, H; Takehara, Y; Sakahara, H

    2000-06-01

    Dissociation between copying letters and writing to dictation has been reported in the clinical neuropsychological literature. Functional magnetic resonance imaging (fMRI) was conducted in normal volunteers to detect the neurofunctional differences between 'copying Kanji', the Japanese ideographic characters, and 'writing Kanji corresponding to phonological information'. Four tasks were conducted: the copying-Kanji task, the writing-Kanji-corresponding-to-phonogram task, the Kanji-grapheme-puzzle task, and the control task. The right superior parietal lobule was extensively activated during the copying-Kanji task (a model of the copying letters process) and the Kanji-grapheme-puzzle task. These observations suggested that this area was involved in referring to the visual stimuli closely related to the ongoing handwriting movements. On the other hand, Broca's area, which is crucial for language production, was extensively activated during the writing-Kanji-corresponding-to-phonogram task (a model of the writing-to-dictation process). The Kanji-grapheme-puzzle task activated the bilateral border portions between the inferior parietal lobule and the occipital lobe, the left premotor area, and the bilateral supplementary motor area (SMA). Since the Kanji-grapheme-puzzle task involved manipulospatial characteristics, these results suggested cooperation between visuospatial and motor executive functions, which may be extensively utilized in demanding visual language processing. The neurofunctional difference between 'copying Kanji' and 'writing Kanji corresponding to phonogram' was efficiently demonstrated by this fMRI experiment.

  7. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature by observing an apparent angular shift in an interference fringe pattern produced by back or forward scattering interferometry, ambiguities in the measurement caused by the apparent shift being consistent with one of a number of numerical possibilities for the real shift which differ by 2n are resolved by combining measurements performed on the same sample using light paths therethrough of differing lengths.
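
    As an illustration of the ambiguity-resolution idea in this abstract, the following Python sketch assumes that the apparent shift equals the real shift modulo 2π and that the real shift scales with the light-path length through the sample; the function name, search range and example values are illustrative and not taken from the patent.

    import numpy as np

    # Hedged sketch: pick the pair of wrap counts that makes the two path
    # lengths imply the same underlying refractive-index change.
    def resolve_shift(phi1, phi2, L1, L2, max_order=20):
        best, best_err = None, np.inf
        for n1 in range(-max_order, max_order + 1):
            for n2 in range(-max_order, max_order + 1):
                s1 = phi1 + 2 * np.pi * n1    # candidate real shift, path L1
                s2 = phi2 + 2 * np.pi * n2    # candidate real shift, path L2
                err = abs(s1 / L1 - s2 / L2)  # both must give the same shift per unit length
                if err < best_err:
                    best, best_err = (s1, s2), err
        return best

    print(resolve_shift(phi1=0.4, phi2=2.9, L1=1.0, L2=2.0))  # example apparent shifts (rad)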

  8. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature, a chirp in the local spatial frequency of interference fringes of an interference pattern is reduced by mathematical manipulation of the recorded light intensity...

  9. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    A refractive index based measurement of a property of a fluid is made in an apparatus comprising a variable wavelength coherent light source (16), a sample chamber (12), a wavelength controller (24), a light sensor (20), a data recorder (26) and a computation apparatus (28), by directing coherent light having a wavelength along an input light path; producing scattering of said light from each of a plurality of interfaces within said apparatus, including interfaces between said fluid and a surface bounding said fluid, said scattering producing an interference pattern formed by said scattered light; cyclically varying the wavelength of said light in said input light path over a 1 nm to 20 nm wide range of wavelengths at a rate of from 10 Hz to 50 kHz; and recording variation of intensity of the interfering light with change in wavelength of the light at an angle of observation...

  10. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  11. Strain measurement based battery testing

    Science.gov (United States)

    Xu, Jeff Qiang; Steiber, Joe; Wall, Craig M.; Smith, Robert; Ng, Cheuk

    2017-05-23

    A method and system for strain-based estimation of the state of health of a battery, from an initial state to an aged state, is provided. A strain gauge is applied to the battery. A first strain measurement is performed on the battery, using the strain gauge, at a selected charge capacity of the battery and at the initial state of the battery. A second strain measurement is performed on the battery, using the strain gauge, at the selected charge capacity of the battery and at the aged state of the battery. The capacity degradation of the battery is estimated as the difference between the first and second strain measurements divided by the first strain measurement.
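
    The capacity-degradation estimate stated above is a simple ratio of strain readings; the following Python sketch restates it (the function name and example strain values are illustrative, not from the patent).

    # Hedged sketch: fractional capacity degradation estimated from two strain
    # readings taken at the same charge capacity, when new and when aged.
    def capacity_degradation(strain_initial: float, strain_aged: float) -> float:
        return (strain_aged - strain_initial) / strain_initial

    # Example: gauge reads 1200 microstrain when new and 1350 microstrain when aged.
    print(f"estimated degradation: {capacity_degradation(1200e-6, 1350e-6):.1%}")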

  12. Status of radiation-based measurement technology

    International Nuclear Information System (INIS)

    Moon, B. S.; Lee, J. W.; Chung, C. E.; Hong, S. B.; Kim, J. T.; Park, W. M.; Kim, J. Y.

    1999-03-01

    This report describes the status of measurement equipment using radiation sources and new technologies in this field. It covers the development status in Korea together with a brief description of the technology development and application status in ten countries including France, America, and Japan. The report also describes technical factors related to radiation-based measurement and trends of new technologies. Measurement principles are also described for the equipment that is widely used among radiation-based measurements, such as level measurement, density measurement, basis weight measurement, moisture measurement, and thickness measurement. (author). 7 refs., 2 tabs., 21 figs

  13. SQUID-based measuring systems

    Indian Academy of Sciences (India)

    The field produced by a given two-dimensional current density distribution is inverted using the Fourier transform technique. Superconducting quantum interference devices (SQUIDs) are the most sensitive detectors for measurement of ...omagnetic prospecting, detection of gravity waves etc. Judging the importance ...

  14. European wet deposition maps based on measurements

    NARCIS (Netherlands)

    Leeuwen EP van; Erisman JW; Draaijers GPJ; Potma CJM; Pul WAJ van; LLO

    1995-01-01

    To date, wet deposition maps on a European scale have been based on long-range transport model results. For most components wet deposition maps based on measurements are only available on national scales. Wet deposition maps of acidifying components and base cations based on measurements are needed

  15. Spectrophotometer-Based Color Measurements

    Science.gov (United States)

    2017-10-24

    ... equipment. There are several American Society for Testing and Materials (ASTM) chapters covering the use of spectrometers for color measurements (refs. 3 ... Perkin Elmer software and procedures described in ASTM chapter E308 (ref. 3). All spectral data was stored on the computer. A summary of the color ... similarity, or lack thereof, between two colors (ref. 5). In this report, the Euclidean distance metric, E, is used and recommended in ASTM D2244

  16. Bluetooth-based distributed measurement system

    International Nuclear Information System (INIS)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng

    2007-01-01

    A novel distributed wireless measurement system, which consists of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has successfully been used in the Daqing oilfield, China, for measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  17. Bluetooth-based distributed measurement system

    Science.gov (United States)

    Tang, Baoping; Chen, Zhuo; Wei, Yuguo; Qin, Xiaofeng

    2007-07-01

    A novel distributed wireless measurement system, which consists of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has successfully been used in the Daqing oilfield, China, for measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  18. Bluetooth-based distributed measurement system

    Energy Technology Data Exchange (ETDEWEB)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng [Department of Mechatronics, College of Mechanical Engineering, Chongqing University, Chongqing, 400030 (China)

    2007-07-15

    A novel distributed wireless measurement system, which consists of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has successfully been used in the Daqing oilfield, China, for measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  19. Lexical Base as a Compressed Language Model of the World (on the material of the Ukrainian language)

    OpenAIRE

    Buk, Solomiya

    2004-01-01

    In the article, it is verified that the list of words selected by formal statistical methods (frequency and functional genre unrestrictedness) is not a conglomerate of unrelated words. It creates a system of interrelated items and can be named the "lexical base of language". This selected list of words covers all the spheres of human activities. To verify this statement, the invariant synoptical scheme common to ideographic dictionaries of different languages was determined.

  20. Korean Clinic Based Outcome Measure Studies

    OpenAIRE

    Jongbae Park

    2003-01-01

    Background: Evidence based medicine has become a main tool for medical practice. However, conducting a study ranked highly in the evidence hierarchy pyramid is not easy or feasible at all times and places. There remains room for descriptive clinical outcome measure studies, while admitting the limits of their interpretation. Aims: Presents three Korean clinic based outcome measure studies with a view to encouraging Korean clinicians to conduct similar studies. Methods: Three studies are presented...

  1. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.

  2. An USB-based time measurement system

    International Nuclear Information System (INIS)

    Qin Xi; Liu Shubin; An Qi

    2010-01-01

    In this paper, we report the electronics of a timing measurement system, the PTB (portable TDC board), which is a handy tool based on a USB interface, customized for high precision time measurements without any crates. The time digitization is based on the High Performance TDC Chip (HPTDC). The real-time compensation for HPTDC outputs and the USB master logic are implemented in an Altera Cyclone FPGA. The architecture design and logic design are described in detail. Tests of the system showed a time resolution of 13.3 ps. (authors)

  3. Toward Measuring Network Aesthetics Based on Symmetry

    Directory of Open Access Journals (Sweden)

    Zengqiang Chen

    2017-05-01

    In this exploratory paper, we discuss quantitative graph-theoretical measures of network aesthetics. Related work in this area has typically focused on geometrical features (e.g., line crossings or edge bendiness) of drawings or visual representations of graphs which purportedly affect an observer’s perception. Here we take a very different approach, abandoning reliance on geometrical properties, and apply information-theoretic measures to abstract graphs and networks directly (rather than to their visual representations) as a means of capturing classical appreciation of structural symmetry. Examples are used solely to motivate the approach to measurement, and to elucidate our symmetry-based mathematical theory of network aesthetics.

  4. Accuracy of magnetic resonance based susceptibility measurements

    Science.gov (United States)

    Erdevig, Hannah E.; Russek, Stephen E.; Carnicka, Slavka; Stupic, Karl F.; Keenan, Kathryn E.

    2017-05-01

    Magnetic Resonance Imaging (MRI) is increasingly used to map the magnetic susceptibility of tissue to identify cerebral microbleeds associated with traumatic brain injury and pathological iron deposits associated with neurodegenerative diseases such as Parkinson's and Alzheimer's disease. Accurate measurements of susceptibility are important for determining oxygen and iron content in blood vessels and brain tissue for use in noninvasive clinical diagnosis and treatment assessments. Induced magnetic fields with amplitudes on the order of 100 nT can be detected using MRI phase images. The induced field distributions can then be inverted to obtain quantitative susceptibility maps. The focus of this research was to determine the accuracy of MRI-based susceptibility measurements using simple phantom geometries and to compare the susceptibility measurements with magnetometry measurements where SI-traceable standards are available. The susceptibilities of paramagnetic salt solutions in cylindrical containers were measured as a function of orientation relative to the static MRI field. The observed induced fields as a function of orientation of the cylinder were in good agreement with simple models. The MRI susceptibility measurements were compared with SQUID magnetometry using NIST-traceable standards. MRI can accurately measure relative magnetic susceptibilities, while SQUID magnetometry measures absolute magnetic susceptibility. Given the accuracy of moment measurements of tissue-mimicking samples, and the need to look at small differences in tissue properties, the use of existing NIST standard reference materials to calibrate MRI reference structures is problematic and better reference materials are required.

  5. A SVD Based Image Complexity Measure

    DEFF Research Database (Denmark)

    Gustafsson, David Karl John; Pedersen, Kim Steenstrup; Nielsen, Mads

    2009-01-01

    Images are composed of geometric structures and texture, and different image processing tools - such as denoising, segmentation and registration - are suitable for different types of image contents. Characterization of the image content in terms of geometric structure and texture is an important problem that one is often faced with. We propose a patch based complexity measure, based on how well the patch can be approximated using singular value decomposition. As such the image complexity is determined by the complexity of the patches. The concept is demonstrated on sequences from the newly collected DIKU Multi-Scale image database.
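
    A minimal Python sketch of the general idea, assuming complexity is taken as the fraction of patch energy not captured by a low-rank SVD approximation; the rank and normalization are illustrative choices rather than the paper's exact definition.

    import numpy as np

    def patch_complexity(patch: np.ndarray, k: int = 2) -> float:
        """Near 0 for a patch well approximated at rank k (geometric structure),
        approaching 1 for a patch that is not (texture)."""
        s = np.linalg.svd(patch, compute_uv=False)    # singular values
        total = np.sum(s ** 2)
        if total == 0.0:
            return 0.0
        return float(np.sum(s[k:] ** 2) / total)      # residual energy beyond rank k

    patch = np.random.rand(16, 16)                     # example 16x16 grey-level patch
    print(patch_complexity(patch))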

  6. Ordinal-Measure Based Shape Correspondence

    Directory of Open Access Journals (Sweden)

    Faouzi Alaya Cheikh

    2002-04-01

    We present a novel approach to shape similarity estimation based on distance transformation and ordinal correlation. The proposed method operates in three steps: object alignment, contour to multilevel image transformation, and similarity evaluation. This approach is suitable for use in shape classification, content-based image retrieval and performance evaluation of segmentation algorithms. The two latter applications are addressed in this paper. Simulation results show that in both applications our proposed measure performs quite well in quantifying shape similarity. The scores obtained using this technique reflect well the correspondence between object contours as humans perceive it.

  7. Green maritime transportation: Market based measures

    DEFF Research Database (Denmark)

    Psaraftis, Harilaos N.

    2016-01-01

    The purpose of this chapter is to introduce the concept of Market Based Measures (MBMs) to reduce Green House Gas (GHG) emissions from ships, and review several distinct MBM proposals that have been under consideration by the International Maritime Organization (IMO). The chapter discusses the mechanisms used by MBMs, and explores how the concept of the Marginal Abatement Cost (MAC) can be linked to MBMs. It also attempts to discuss the pros and cons of the submitted proposals.

  8. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...

  9. Bioimpedance measurement based evaluation of wound healing.

    Science.gov (United States)

    Kekonen, Atte; Bergelin, Mikael; Eriksson, Jan-Erik; Vaalasti, Annikki; Ylänen, Heimo; Viik, Jari

    2017-06-22

    Our group has developed a bipolar bioimpedance measurement-based method for determining the state of wound healing. The objective of this study was to assess the capability of the method. To assess the performance of the method, we arranged a follow-up study of four acute wounds. The wounds were measured using the method and photographed throughout the healing process. Initially the bioimpedance of the wounds was significantly lower than the impedance of the undamaged skin, used as a baseline. Gradually, as healing progressed, the wound impedance increased and finally reached the impedance of the undamaged skin. The clinical appearance of the wounds examined in this study corresponded well with the parameters derived from the bioimpedance data. Hard-to-heal wounds are a significant and growing socioeconomic burden, especially in the developed countries, due to aging populations and to the increasing prevalence of various lifestyle related diseases. The assessment and the monitoring of chronic wounds are mainly based on visual inspection by medical professionals. The dressings covering the wound must be removed before assessment; this may disturb the wound healing process and significantly increases the work effort of the medical staff. There is a need for an objective and quantitative method for determining the status of a wound without removing the wound dressings. This study provided evidence of the capability of the bioimpedance based method for assessing the wound status. In the future measurements with the method should be extended to concern hard-to-heal wounds.
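
    A minimal Python sketch of the monitoring idea, assuming a wound-status index defined as the ratio of wound impedance to the intact-skin baseline; this index and the example readings are illustrative, not the authors' exact parameter.

    # Hedged sketch: the abstract reports that wound impedance starts well below
    # the undamaged-skin impedance and rises toward it as healing progresses.
    def wound_status(z_wound_ohm: float, z_skin_ohm: float) -> float:
        return z_wound_ohm / z_skin_ohm      # near 0 for a fresh wound, -> 1 when healed

    readings = [(2.1e3, 11.8e3), (5.3e3, 11.9e3), (11.5e3, 11.7e3)]  # (wound, skin) in ohm
    for z_wound, z_skin in readings:
        print(f"status index: {wound_status(z_wound, z_skin):.2f}")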

  10. Animal-based measures for welfare assessment

    Directory of Open Access Journals (Sweden)

    Agostino Sevi

    2010-01-01

    Animal welfare assessment cannot be carried out irrespective of measures taken on animals. Indeed, housing parameters related to structures, design and micro-environment, even if reliable and easier to take, can only identify conditions which could be detrimental to animal welfare, but can't predict poor welfare in animals per se. Welfare assessment through animal-based measures is rather complex, given that animals' responses to stressful conditions largely depend on the nature, length and intensity of challenges and on the physiological status, age, genetic susceptibility and previous experience of animals. Welfare assessment requires a multi-disciplinary approach and the monitoring of productive, ethological, endocrine, immunological and pathological parameters to be exhaustive and reliable. So many measures are needed because stresses can act only on some of the mentioned parameters, or on all of them but at different times and degrees. Under this point of view, the main aim of research is to find feasible and most responsive indicators of poor animal welfare. In recent decades, studies focused on the following parameters for animal welfare assessment: indexes of biological efficiency, responses to behavioral tests, cortisol secretion, neutrophil to lymphocyte ratio, lymphocyte proliferation, production of antigen specific IgG and cytokine release, somatic cell count and acute phase proteins. Recently, many studies have been addressed to reduce handling and constraint of animals when taking measures to be used in welfare assessment, since such procedures can induce stress in animals and undermine the reliability of measures taken for welfare assessment. The range of animal-based measures for welfare assessment is much wider under experimental conditions than at the on-farm level. In on-farm welfare monitoring the main aim is to find feasible measures of proved validity and reliability

  11. Measuring globalization-based acculturation in Ladakh

    DEFF Research Database (Denmark)

    Ozer, Simon; Schwartz, Seth

    2016-01-01

    Theories and methodologies within acculturation psychology have been advanced in order to capture the complex process of intercultural contact in various contexts. Differentiating globalization-based acculturation from immigrant-based acculturation has broadened the field of acculturation psychology to include groups who are exposed to global cultural streams without international migration. The globalization-based acculturation process in the North Indian region of Ladakh appears to be a tricultural encounter, suggesting an addendum to the bidimensional acculturation model for this group (and perhaps for others as well). This study explores the development, usability, and validity of a tridimensional acculturation measure aiming to capture the multicultural orientations initiated by the process of globalization in Ladakh. The tridimensional acculturation scale was found to fit the data significantly better...

  12. Property-Based Software Engineering Measurement

    Science.gov (United States)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It does not intend to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.

  13. Korean Clinic Based Outcome Measure Studies

    Directory of Open Access Journals (Sweden)

    Jongbae Park

    2003-02-01

    Background: Evidence based medicine has become a main tool for medical practice. However, conducting a study ranked highly in the evidence hierarchy pyramid is not easy or feasible at all times and places. There remains room for descriptive clinical outcome measure studies, while admitting the limits of their interpretation. Aims: Presents three Korean clinic based outcome measure studies with a view to encouraging Korean clinicians to conduct similar studies. Methods: Three studies are presented briefly here, including 1) quality of life of liver cancer patients after 8 Constitutional acupuncture; 2) developing a Korean version of the Measure Yourself Medical Outcome Profile (MYMOP); and 3) a survey on 5 Shu points: a pilot. In the first study, we included 4 primary or secondary liver cancer patients, collecting their diagnostic X-ray films and clinical data from their hospital, and asked them to fill in the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire before the commencement of the treatment. The acupuncture treatment follows a set format but has not been disclosed yet. Translation and development of a Korean version of an outcome measure that is Korean-clinician friendly has been sought, and MYMOP is one of the most appropriate ones. Permission was granted, the translation into Korean was done, and it was then back-translated into English, based only on the Korean translation, by a researcher who is bilingual in both languages. The back-translation was compared by the original developer of MYMOP and confirmed usable. In order to test the existence of acupoints and meridians through popular forms of Korean acupuncture regimes, we aim at collecting opinions from 101 Korean clinicians who have used those forms. The questions asked include the most effective symptoms, 5 Shu points, points that are least likely to be used due to either adverse events or lack of effectiveness, theoretical reasons for the above proposals, proposing outcome measures

  14. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.

  15. Linear systems a measurement based approach

    CERN Document Server

    Bhattacharyya, S P; Mohsenizadeh, D N

    2014-01-01

    This brief presents recent results obtained on the analysis, synthesis and design of systems described by linear equations. It is well known that linear equations arise in most branches of science and engineering as well as social, biological and economic systems. The novelty of this approach is that no models of the system are assumed to be available, nor are they required. Instead, a few measurements made on the system can be processed strategically to directly extract design values that meet specifications without constructing a model of the system, implicitly or explicitly. These new concepts are illustrated by applying them to linear DC and AC circuits, mechanical, civil and hydraulic systems, signal flow block diagrams and control systems. These applications are preliminary and suggest many open problems. The results presented in this brief are the latest effort in this direction and the authors hope these will lead to attractive alternatives to model-based design of engineering and other systems.

  16. Heterogeneity Measurement Based on Distance Measure for Polarimetric SAR Data

    Science.gov (United States)

    Xing, Xiaoli; Chen, Qihao; Liu, Xiuguo

    2018-04-01

    To effectively test the scene heterogeneity of polarimetric synthetic aperture radar (PolSAR) data, in this paper the distance measure is introduced by utilizing the similarity between the sample and the pixels. Moreover, given the influence of the distribution and of modeling texture, the K distance measure is deduced from the Wishart distance measure. Specifically, the average of the pixels in the local window replaces the class center coherency or covariance matrix. The Wishart and K distance measures are calculated between the average matrix and the pixels. Then, the ratio of the standard deviation to the mean is established for the Wishart and K distance measures, and the two features are defined and applied to reflect the complexity of the scene. The proposed heterogeneity measure is obtained by integrating the two features using the Pauli basis. The experiments conducted on single-look and multilook PolSAR data demonstrate the effectiveness of the proposed method for detecting scene heterogeneity.
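
    A minimal Python sketch of the coefficient-of-variation feature described above, assuming a commonly used Wishart-style distance between each pixel covariance matrix and the local window average; the distance formula and matrix size are assumptions, not taken from the paper.

    import numpy as np

    def wishart_distance(C, sigma):
        """ln|sigma| + tr(sigma^-1 C), a standard Wishart-based dissimilarity."""
        return (np.log(np.linalg.det(sigma)) + np.trace(np.linalg.inv(sigma) @ C)).real

    def heterogeneity(window_covs):
        """window_covs: iterable of per-pixel polarimetric covariance matrices in a window."""
        sigma = np.mean(window_covs, axis=0)           # local average replaces class center
        d = np.array([wishart_distance(C, sigma) for C in window_covs])
        return d.std() / d.mean()                      # ratio of standard deviation to mean

    covs = [np.eye(3) * (1.0 + 0.2 * i) for i in range(9)]   # toy 3x3 covariance matrices
    print(heterogeneity(covs))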

  17. Using satellite-based measurements to explore ...

    Science.gov (United States)

    New particle formation (NPF) can potentially alter regional climate by increasing aerosol particle (hereafter particle) number concentrations and ultimately cloud condensation nuclei. The large scales on which NPF is manifest indicate potential to use satellite-based (inherently spatially averaged) measurements of atmospheric conditions to diagnose the occurrence of NPF and NPF characteristics. We demonstrate the potential for using satellite-measurements of insolation (UV), trace gas concentrations (sulfur dioxide (SO2), nitrogen dioxide (NO2), ammonia (NH3), formaldehyde (HCHO), ozone (O3)), aerosol optical properties (aerosol optical depth (AOD), Ångström exponent (AE)), and a proxy of biogenic volatile organic compound emissions (leaf area index (LAI), temperature (T)) as predictors for NPF characteristics: formation rates, growth rates, survival probabilities, and ultrafine particle (UFP) concentrations at five locations across North America. NPF at all sites is most frequent in spring, exhibits a one-day autocorrelation, and is associated with low condensational sink (AOD×AE) and HCHO concentrations, and high UV. However, there are important site-to-site variations in NPF frequency and characteristics, and in which of the predictor variables (particularly gas concentrations) significantly contribute to the explanatory power of regression models built to predict those characteristics. This finding may provide a partial explanation for the reported spatia

  18. Multiparty correlation measure based on the cumulant

    International Nuclear Information System (INIS)

    Zhou, D. L.; Zeng, B.; Xu, Z.; You, L.

    2006-01-01

    We propose a genuine multiparty correlation measure for a multiparty quantum system as the trace norm of the cumulant of the state. The legitimacy of our multiparty correlation measure is explicitly demonstrated by proving it satisfies the five basic conditions required for a correlation measure. As an application we construct an efficient algorithm for the calculation of our measures for all stabilizer states

  19. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

    The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  20. Laser-based measuring equipment controlled by microcomputer

    International Nuclear Information System (INIS)

    Miron, N.; Sporea, D.; Velculescu, V.G.; Petre, M.

    1988-03-01

    Some laser-based measuring equipment controlled by microcomputer, developed for industrial and scientific purposes, is described. This equipment is intended for dial indicator verification, graduated rule measurement, and very accurate measurement of the gravitational constant. (authors)

  1. Parkinson's disease detection based on dysphonia measurements

    Science.gov (United States)

    Lahmiri, Salim

    2017-04-01

    Assessing dysphonic symptoms is a noninvasive and effective approach to detect Parkinson's disease (PD) in patients. The main purpose of this study is to investigate the effect of different dysphonia measurements on PD detection by a support vector machine (SVM). Seven categories of dysphonia measurements are considered. Experimental results from the ten-fold cross-validation technique demonstrate that vocal fundamental frequency statistics yield the highest accuracy of 88 % ± 0.04. When all dysphonia measurements are employed, the SVM classifier achieves 94 % ± 0.03 accuracy. A refinement of the original pattern space by removing dysphonia measurements with similar variation across healthy and PD subjects allows achieving 97.03 % ± 0.03 accuracy. The latter performance is higher than what is reported in the literature on the same dataset with the ten-fold cross-validation technique. Finally, it was found that measures of the ratio of noise to tonal components in the voice are the most suitable dysphonic symptoms to detect PD subjects, as they achieve 99.64 % ± 0.01 specificity. This finding is highly promising for understanding PD symptoms.
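
    A minimal Python sketch of the evaluation protocol (an SVM with ten-fold cross-validation), assuming scikit-learn; the placeholder feature matrix and labels stand in for the dysphonia measurements and the PD/healthy diagnoses.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import cross_val_score

    X = np.random.rand(195, 22)            # placeholder: 195 recordings x 22 dysphonia features
    y = np.random.randint(0, 2, 195)       # placeholder labels (1 = PD, 0 = healthy)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, y, cv=10)         # ten-fold cross-validation
    print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")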

  2. Ground-based measurements of ionospheric dynamics

    Science.gov (United States)

    Kouba, Daniel; Chum, Jaroslav

    2018-05-01

    Different methods are used to research and monitor the ionospheric dynamics using ground measurements: Digisonde Drift Measurements (DDM) and Continuous Doppler Sounding (CDS). For the first time, we present comparison between both methods on specific examples. Both methods provide information about the vertical drift velocity component. The DDM provides more information about the drift velocity vector and detected reflection points. However, the method is limited by the relatively low time resolution. In contrast, the strength of CDS is its high time resolution. The discussed methods can be used for real-time monitoring of medium scale travelling ionospheric disturbances. We conclude that it is advantageous to use both methods simultaneously if possible. The CDS is then applied for the disturbance detection and analysis, and the DDM is applied for the reflection height control.

  3. Statistical Measures for Usage-Based Linguistics

    Science.gov (United States)

    Gries, Stefan Th.; Ellis, Nick C.

    2015-01-01

    The advent of usage-/exemplar-based approaches has resulted in a major change in the theoretical landscape of linguistics, but also in the range of methodologies that are brought to bear on the study of language acquisition/learning, structure, and use. In particular, methods from corpus linguistics are now frequently used to study distributional…

  4. Novel measurement-based indoor cellular radio system design

    OpenAIRE

    Aragón-Zavala, A

    2008-01-01

    A scaleable, measurement-based radio methodology has been created for use in the design, planning and optimisation of indoor cellular radio systems. The development of this measurement-based methodology was performed having in mind that measurements are often required to validate radio coverage in a building. Therefore, the concept of using carefully calibrated measurements to design and optimise a system is feasible, since these measurements can easily be obtained prior to system deployment ...

  5. Miniaturized diffraction based interferometric distance measurement sensor

    Science.gov (United States)

    Kim, Byungki

    In this thesis, new metrology hardware is designed, fabricated, and tested to provide improvements over current MEMS metrology. The metrology system is a micromachined scanning interferometer (muSI) having sub-nm resolution in a compact design. The proposed microinterferometer forms a phase sensitive diffraction grating with interferometric sensitivity, while adding the capability of better lateral resolution by focusing the laser to a smaller spot size. A detailed diffraction model of the microinterferometer was developed to simulate the device performance and to suggest the location of photo detectors for integrated optoelectronics. A particular device is fabricated on a fused silica substrate using aluminum to form the deformable diffraction grating fingers and AZ P4620 photoresist (PR) for the microlens. The details of the fabrication processes are presented. The structure also enables optoelectronics to be integrated so that the interferometer with photo detectors can fit in an area that is 1 mm x 1 mm. The scanning results using a fixed-grating muSI demonstrated that it could measure vibration profiles as well as static vertical (less than a half wavelength) and lateral dimensions of MEMS. The muSI, which is integrated with photo diodes, demonstrated its operation by scanning a cMUT. The PID control has been tested and resulted in improvements in scanned images. The integrated muSI demonstrated that the deformable grating could be used to tune the measurement and keep the interferometer in quadrature for highest sensitivity.

  6. Development of microcontroller based water flow measurement

    Science.gov (United States)

    Munir, Muhammad Miftahul; Surachman, Arif; Fathonah, Indra Wahyudin; Billah, Muhammad Aziz; Khairurrijal, Mahfudz, Hernawan; Rimawan, Ririn; Lestari, Slamet

    2015-04-01

    A digital instrument for measuring water flow was developed using an AT89S52 microcontroller, a DS1302 real time clock (RTC), and an EEPROM for external memory. The sensor used for probing the current was a propeller that rotates when immersed in a water flow. After each full rotation, the sensor sends one pulse, and the pulses are counted over a set counting time. The measurement data, i.e. the number of pulses per unit time, are converted into water flow velocity (m/s) through a mathematical formula. The microcontroller counts the pulses sent by the sensor, and the number of counted pulses is stored in the EEPROM memory. The time interval for counting is provided by the RTC and can be set by the operator. The instrument was tested under various time intervals ranging from 10 to 40 seconds and with several standard propellers owned by the Experimental Station for Hydraulic Structure and Geotechnics (BHGK), Research Institute for Water Resources (Pusair). Using the same propellers and water flows, it was shown that the water flow velocities obtained from the developed digital instrument and those found with the provided analog one are very similar.
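
    A minimal Python sketch of the pulse-count-to-velocity conversion described above, assuming a linear propeller rating curve; the calibration constants are placeholders, not values from the instrument.

    # Hedged sketch: one pulse per propeller rotation, counted over a set interval,
    # converted to velocity with a linear rating v = a*n + b (n = rotations per second).
    def flow_velocity(pulse_count: int, interval_s: float,
                      a: float = 0.25, b: float = 0.01) -> float:
        n = pulse_count / interval_s       # rotations per second
        return a * n + b                   # velocity in m/s

    print(flow_velocity(pulse_count=84, interval_s=20.0))   # example reading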

  7. Bridge continuous deformation measurement technology based on fiber optic gyro

    Science.gov (United States)

    Gan, Weibing; Hu, Wenbin; Liu, Fang; Tang, Jianguang; Li, Sheng; Yang, Yan

    2016-03-01

    Bridges are an important part of modern transportation systems, and deformation is a key index for a bridge's safety evaluation. To measure the curve of a long-span bridge rapidly and to accurately and promptly locate the maximum deformation of the bridge, a continuous deformation measurement system (CDMS) based on an inertial platform is presented and validated in this paper. Firstly, building on various bridge deformation measurement methods, the method of deformation measurement based on the fiber optic gyro (FOG) is introduced. Secondly, the basic measurement principle based on the FOG is presented and the continuous curve trajectory is derived analytically. Then the measurement accuracy is analyzed in theory and the factors relevant to ensuring measurement accuracy are presented. Finally, deformation measurement experiments were conducted on a bridge across the Yangtze River. Experimental results show that the presented deformation measurement method is feasible, practical, and reliable; the system can accurately and quickly locate the maximum deformation and has broad application prospects.

  8. A Transdermal Measurement Platform Based on Microfluidics

    Directory of Open Access Journals (Sweden)

    Wen-Ying Huang

    2017-01-01

    Full Text Available The Franz diffusion cell is one of the most widely used devices to evaluate transdermal drug delivery. However, this static and nonflowing system has some limitations, such as a relatively large solution volume and skin area and the development of gas bubbles during sampling. To overcome these disadvantages, this study provides a proof of concept for miniaturizing models of transdermal delivery by using a microfluidic chip combined with a diffusion cell. The proposed diffusion microchip system requires only 80 μL of sample solution and provides flow circulation. Two model compounds, Coomassie Brilliant Blue G-250 and potassium ferricyanide, were successfully tested for transdermal delivery experiments. The diffusion rate is high for a high sample concentration or a large membrane pore size. The developed diffusion microchip system, which is feasible, can be applied for transdermal measurement in the future.

  9. Competency-Based Education: A Framework for Measuring Quality Courses

    Science.gov (United States)

    Krause, Jackie; Dias, Laura Portolese; Schedler, Chris

    2015-01-01

    The growth of competency-based education in an online environment requires the development and measurement of quality competency-based courses. While quality measures for online courses have been developed and standardized, they do not directly align with emerging best practices and principles in the design of quality competency-based online…

  10. Calibration Base Lines for Electronic Distance Measuring Instruments (EDMI)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A calibration base line (CBL) is a precisely measured, straight-line course of approximately 1,400 m used to calibrate Electronic Distance Measuring Instruments...

  11. Fourier transform based scalable image quality measure.

    Science.gov (United States)

    Narwaria, Manish; Lin, Weisi; McLoughlin, Ian; Emmanuel, Sabu; Chia, Liang-Tien

    2012-08-01

    We present a new image quality assessment (IQA) algorithm based on the phase and magnitude of the 2D (two-dimensional) Discrete Fourier Transform (DFT). The basic idea is to compare the phase and magnitude of the reference and distorted images to compute the quality score. However, it is well known that the Human Visual System's (HVS) sensitivity to different frequency components is not the same. We accommodate this fact via a simple yet effective strategy of nonuniform binning of the frequency components. This process also leads to a reduced-space representation of the image, thereby enabling the reduced-reference (RR) prospects of the proposed scheme. We employ linear regression to integrate the effects of the changes in phase and magnitude. In this way, the required weights are determined via proper training and hence are more convincing and effective. Lastly, using the fact that phase usually conveys more information than magnitude, we use only the phase for RR quality assessment. This provides the crucial advantage of further reduction in the required amount of reference image information. The proposed method is therefore further scalable for RR scenarios. We report extensive experimental results using a total of 9 publicly available databases: 7 image databases (with a total of 3832 distorted images with diverse distortions) and 2 video databases (228 distorted videos in total). These show that the proposed method is overall better than several of the existing full-reference (FR) algorithms and two RR algorithms. Additionally, there is a graceful degradation in prediction performance as the amount of reference image information is reduced, thereby confirming its scalability prospects. To enable comparisons and future study, a Matlab implementation of the proposed algorithm is available at http://www.ntu.edu.sg/home/wslin/reduced_phase.rar.
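
    A minimal Python sketch of the core comparison (2D DFT phase and magnitude differences between a reference and a distorted image); the paper's nonuniform frequency binning and trained regression weights are omitted here, so this illustrates only the raw features.

    import numpy as np

    def dft_phase_magnitude_features(ref: np.ndarray, dist: np.ndarray):
        F_ref, F_dist = np.fft.fft2(ref), np.fft.fft2(dist)
        dphi = np.angle(F_ref * np.conj(F_dist))           # wrapped phase difference
        phase_err = np.abs(dphi).mean()
        mag_err = np.abs(np.abs(F_ref) - np.abs(F_dist)).mean()
        return phase_err, mag_err

    ref = np.random.rand(64, 64)                           # placeholder reference image
    dist = ref + 0.05 * np.random.randn(64, 64)            # placeholder distorted image
    print(dft_phase_magnitude_features(ref, dist))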

  12. Triangulation-based edge measurement using polyview optics

    Science.gov (United States)

    Li, Yinan; Kästner, Markus; Reithmeier, Eduard

    2018-04-01

    Laser triangulation sensors, as non-contact measurement devices, are widely used in industry and research for profile measurements and quantitative inspections. Some technical applications, e.g. edge measurements, usually require a configuration of a single sensor and a translation stage or a configuration of multiple sensors, so that they can cover a large measurement range that is out of the scope of a single sensor. However, the cost of both configurations is high, due to the additional rotational axis or the additional sensor. This paper presents a special measurement system for the measurement of large curved surfaces based on a single-sensor configuration. Utilizing self-designed polyview optics and a calibration process, the proposed measurement system allows an over 180° FOV (field of view) with precise measurement accuracy as well as the advantage of low cost. The detailed capability of this measurement system is discussed in this paper based on experimental data.

  13. History and measurement of the base and derived units

    CERN Document Server

    Treese, Steven A

    2018-01-01

    This book discusses how and why historical measurement units developed, and reviews useful methods for making conversions as well as situations in which dimensional analysis can be used. It starts from the history of length measurement, which is one of the oldest measures used by humans. It highlights the importance of area measurement, briefly discussing the methods for determining areas mathematically and by measurement. The book continues on to detail the development of measures for volume, mass, weight, time, temperature, angle, electrical units, amounts of substances, and light intensity. The seven SI/metric base units are highlighted, as well as a number of other units that have historically been used as base units. Providing a comprehensive reference for interconversion among the commonly measured quantities in the different measurement systems with engineering accuracy, it also examines the relationships among base units in fields such as mechanical/thermal, electromagnetic and physical flow rates and...

  14. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    Science.gov (United States)

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  15. Performance-Based Measurement: Action for Organizations and HPT Accountability

    Science.gov (United States)

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  16. Subcopula-based measure of asymmetric association for contingency tables.

    Science.gov (United States)

    Wei, Zheng; Kim, Daeyoung

    2017-10-30

    For the analysis of a two-way contingency table, a new asymmetric association measure is developed. The proposed method uses the subcopula-based regression between the discrete variables to measure the asymmetric predictive powers of the variables of interest. Unlike the existing measures of asymmetric association, the subcopula-based measure is insensitive to the number of categories in a variable, and thus, the magnitude of the proposed measure can be interpreted as the degree of asymmetric association in the contingency table. The theoretical properties of the proposed subcopula-based asymmetric association measure are investigated. We illustrate the performance and advantages of the proposed measure using simulation studies and real data examples. Copyright © 2017 John Wiley & Sons, Ltd.

  17. SEM based overlay measurement between resist and buried patterns

    Science.gov (United States)

    Inoue, Osamu; Okagawa, Yutaka; Hasumi, Kazuhisa; Shao, Chuanyu; Leray, Philippe; Lorusso, Gian; Baudemprez, Bart

    2016-03-01

    With the continuous shrink in pattern size and increased density, overlay control has become one of the most critical issues in semiconductor manufacturing. Recently, SEM based overlay of AEI (After Etch Inspection) wafers has been used as a reference for, and for optimization of, optical overlay (both Image Based Overlay (IBO) and Diffraction Based Overlay (DBO)). Overlay measurement at the AEI stage helps to monitor and forecast the yield after formation by etch and to calibrate optical measurement tools; however, those overlay values are difficult to feed back directly to a scanner. Therefore, there is a clear need for SEM based overlay measurements of ADI (After Develop Inspection) wafers in order to serve as a reference for optical overlay and make the necessary corrections before wafers go to etch. Furthermore, to make the corrections as accurate as possible, actual device-like feature dimensions need to be measured post ADI. This device size measurement, which can be performed over a smaller area, is a very unique feature of the CD-SEM and is currently possible only with the CD-SEM. In this study, we assess SEM based overlay measurement of ADI and AEI wafers using a sample from an N10 process flow. First, we demonstrate SEM based overlay performance at AEI using a dual damascene process for the Via 0 (V0) and metal 1 (M1) layers. We also discuss the overlay measurements between litho-etch-litho stages of a triple-patterned M1 layer and a double-patterned V0. Second, to illustrate the complexities in image acquisition and measurement, we measure overlay between the M1B resist and the buried M1A hard-mask trench. Finally, we show how a high accelerating voltage can detect buried pattern information by BSE (Back Scattering Electron). In this paper we discuss the merits of this method versus standard optical metrology based corrections.

  18. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  19. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lo, P., E-mail: pechinlo@mednet.edu.ucla; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G. [Center for Computer Vision and Imaging Biomarkers, Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, California 90024 (United States); Argula, R.; Strange, C. [Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, South Carolina 29425 (United States)

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed, which quantify the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density-based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressing against the spirometry measures, with p < 0.05. For previously used density-based CT measurements in the literature, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or having p > 0.05. Conclusions: The proposed family of CT-based cyst measurements has better correlation with spirometric measures than previously used density-based CT measurements. They show potential as a sensitive tool for quantitatively assessing the severity of LAM.
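
    As a rough illustration of the cyst-extraction step (thresholding followed by watershed splitting), the Python sketch below uses scikit-image and SciPy on a single CT slice. The -950 HU threshold, the marker spacing and the per-cyst summary features are assumptions made for illustration, not the measurements or feature selection used in the study.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def extract_cysts(ct_slice_hu, cyst_threshold=-950.0):
    """Split the thresholded cystic region of a CT slice into individual cysts."""
    cystic = ct_slice_hu < cyst_threshold                 # low-attenuation (air-like) pixels
    distance = ndi.distance_transform_edt(cystic)
    peaks = peak_local_max(distance, labels=cystic.astype(int), min_distance=5)
    markers = np.zeros(cystic.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-distance, markers, mask=cystic)      # labelled cyst regions

def cyst_summary(labels, pixel_area_mm2=0.6):
    """Simple per-slice summary features: cyst count, mean size and size spread."""
    sizes = np.bincount(labels.ravel())[1:] * pixel_area_mm2
    sizes = sizes[sizes > 0]
    return {"n_cysts": int(sizes.size),
            "mean_area_mm2": float(sizes.mean()) if sizes.size else 0.0,
            "area_std_mm2": float(sizes.std()) if sizes.size else 0.0}

# toy usage with a synthetic slice (values in Hounsfield units)
slice_hu = np.full((128, 128), -700.0)
slice_hu[30:50, 40:60] = -980.0                            # two fake cystic regions
slice_hu[70:90, 70:85] = -990.0
print(cyst_summary(extract_cysts(slice_hu)))
```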

  20. Fracture toughness measurements of WC-based hard metals

    International Nuclear Information System (INIS)

    Prakash, L.; Albert, B.

    1983-01-01

    The fracture toughness of WC-based cemented carbides was determined by different methods. The values obtained depend on the procedure of measurement; each method nevertheless allows the toughness of different hard metals to be compared with one another. (orig.) [de]

  1. Multivariate Methods Based Soft Measurement for Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Shen Yin

    2014-01-01

    a decision. However, since the physicochemical indexes of wine can to some extent reflect the quality of wine, the multivariate statistical methods based soft measure can help the oenologist in wine evaluation.

  2. Calibration of Smartphone-Based Weather Measurements Using Pairwise Gossip

    Directory of Open Access Journals (Sweden)

    Jane Louie Fresco Zamora

    2015-01-01

    Full Text Available Accurate and reliable daily global weather reports are necessary for weather forecasting and climate analysis. However, the availability of these reports continues to decline due to the lack of economic support and policies in maintaining ground weather measurement systems from where these reports are obtained. Thus, to mitigate data scarcity, it is required to utilize weather information from existing sensors and built-in smartphone sensors. However, as smartphone usage often varies according to human activity, it is difficult to obtain accurate measurement data. In this paper, we present a heuristic-based pairwise gossip algorithm that will calibrate smartphone-based pressure sensors with respect to fixed weather stations as our referential ground truth. Based on actual measurements, we have verified that smartphone-based readings are unstable when observed during movement. Using our calibration algorithm on actual smartphone-based pressure readings, the updated values were significantly closer to the ground truth values.
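
    A toy Python sketch of the pairwise-gossip idea: phones repeatedly average their calibrated readings in random pairs and are occasionally anchored to a fixed weather station taken as ground truth. The step size, the anchoring rule and the numbers are illustrative assumptions, not the heuristic described in the paper.

```python
import random

def gossip_calibrate(raw, station_hpa, rounds=300, step=0.5, seed=1):
    """Learn an additive pressure offset per phone via pairwise gossip averaging."""
    random.seed(seed)
    offsets = {p: 0.0 for p in raw}
    ids = list(raw)
    for _ in range(rounds):
        a, b = random.sample(ids, 2)                 # a random pair of phones "gossips"
        ca, cb = raw[a] + offsets[a], raw[b] + offsets[b]
        avg = 0.5 * (ca + cb)                        # pairwise averaging step
        offsets[a] += step * (avg - ca)
        offsets[b] += step * (avg - cb)
        anchor = random.choice(ids)                  # occasionally pull one phone toward ground truth
        offsets[anchor] += step * (station_hpa - (raw[anchor] + offsets[anchor]))
    return offsets

raw_readings = {"phone1": 1009.8, "phone2": 1012.4, "phone3": 1010.9}   # raw barometer values (hPa)
offsets = gossip_calibrate(raw_readings, station_hpa=1011.2)
print({p: round(raw_readings[p] + offsets[p], 2) for p in raw_readings})
```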

  3. Calibration of Smartphone-Based Weather Measurements Using Pairwise Gossip.

    Science.gov (United States)

    Zamora, Jane Louie Fresco; Kashihara, Shigeru; Yamaguchi, Suguru

    2015-01-01

    Accurate and reliable daily global weather reports are necessary for weather forecasting and climate analysis. However, the availability of these reports continues to decline due to the lack of economic support and policies in maintaining ground weather measurement systems from where these reports are obtained. Thus, to mitigate data scarcity, it is required to utilize weather information from existing sensors and built-in smartphone sensors. However, as smartphone usage often varies according to human activity, it is difficult to obtain accurate measurement data. In this paper, we present a heuristic-based pairwise gossip algorithm that will calibrate smartphone-based pressure sensors with respect to fixed weather stations as our referential ground truth. Based on actual measurements, we have verified that smartphone-based readings are unstable when observed during movement. Using our calibration algorithm on actual smartphone-based pressure readings, the updated values were significantly closer to the ground truth values.

  4. Measuring Disorientation Based on the Needleman-Wunsch Algorithm

    Science.gov (United States)

    Güyer, Tolga; Atasoy, Bilal; Somyürek, Sibel

    2015-01-01

    This study offers a new method to measure navigation disorientation in web-based systems, which are a powerful learning medium for distance and open education. The Needleman-Wunsch algorithm is used to measure disorientation in a more precise manner. The process combines theoretical and applied knowledge from two previously distinct research areas,…
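
    A short Python sketch of the underlying idea: globally align the learner's observed navigation path with an ideal path using the Needleman-Wunsch algorithm, then turn the alignment score into a disorientation index. The scoring scheme and the normalization are illustrative assumptions rather than the exact formulation of the study.

```python
def needleman_wunsch(observed, ideal, match=1, mismatch=-1, gap=-1):
    """Global alignment score between an observed navigation path and an ideal path."""
    n, m = len(observed), len(ideal)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if observed[i - 1] == ideal[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[n][m]

# pages visited by a learner vs. an ideal route through the hypertext
observed = ["home", "unit1", "quiz", "unit1", "unit2"]
ideal = ["home", "unit1", "unit2", "quiz"]
raw = needleman_wunsch(observed, ideal)
disorientation = 1 - raw / max(len(observed), len(ideal))   # 0 means perfectly aligned
print(raw, round(disorientation, 2))
```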

  5. Measurement properties of performance-based measures to assess physical function in hip and knee osteoarthritis

    DEFF Research Database (Denmark)

    Dobson, F; Hinman, R S; Hall, M

    2012-01-01

    OBJECTIVES: To systematically review the measurement properties of performance-based measures to assess physical function in people with hip and/or knee osteoarthritis (OA). METHODS: Electronic searches were performed in MEDLINE, CINAHL, Embase, and PsycINFO up to the end of June 2012. Two...... investigating measurement properties of performance measures, including responsiveness and interpretability in people with hip and/or knee OA, is needed. Consensus on which combination of measures will best assess physical function in people with hip/and or knee OA is urgently required....

  6. On-Line Voltage Stability Assessment based on PMU Measurements

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; P. Da Silva, Luiz C.; Nielsen, Arne Hejde

    2009-01-01

    This paper presents a method for on-line monitoring of the risk of voltage collapse based on synchronised phasor measurements. As there is no room for intensive computation and analysis in real time, the method is based on the combination of off-line computation and on-line monitoring, which are correlat...

  7. Dipole location using SQUID based measurements: Application to magnetocardiography

    Science.gov (United States)

    Mariyappa, N.; Parasakthi, C.; Sengottuvel, S.; Gireesan, K.; Patel, Rajesh; Janawadkar, M. P.; Sundar, C. S.; Radhakrishnan, T. S.

    2012-07-01

    We report a method of inferring the dipole location using iterative nonlinear least square optimization based on Levenberg-Marquardt algorithm, wherein, we use different sets of pseudo-random numbers as initial parameter values. The method has been applied to (i) the simulated data representing the calculated magnetic field distribution produced by a point dipole placed at a known position, (ii) the experimental data from SQUID based measurements of the magnetic field distribution produced by a source coil carrying current, and (iii) the actual experimentally measured magnetocardiograms of human subjects using a SQUID based system.
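
    A Python/SciPy sketch of the approach on synthetic data: a point magnetic dipole field model is fitted by Levenberg-Marquardt least squares, restarted from several pseudo-random initial parameter vectors, keeping the best fit. The sensor grid, units and start-value ranges are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

MU0_4PI = 1e-7  # mu0 / (4*pi)

def dipole_bz(params, sensors):
    """z-component of a point magnetic dipole field; params = (x, y, z, mx, my, mz)."""
    r = sensors - params[:3]
    m = params[3:]
    d = np.linalg.norm(r, axis=1)
    return MU0_4PI * (3.0 * (r @ m) * r[:, 2] / d ** 5 - m[2] / d ** 3)

def locate_dipole(sensors, measured, n_starts=20, seed=0):
    """Levenberg-Marquardt fits from pseudo-random initial guesses; keep the best one."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        x0 = np.concatenate([rng.uniform(-0.05, 0.05, 3),    # position guess (m)
                             rng.uniform(-0.05, 0.05, 3)])   # moment guess (A*m^2)
        fit = least_squares(lambda p: dipole_bz(p, sensors) - measured, x0, method="lm")
        if best is None or fit.cost < best.cost:
            best = fit
    return best.x

# synthetic test: planar grid of sensors 5 cm above a known current-loop-like dipole
xs, ys = np.meshgrid(np.linspace(-0.1, 0.1, 6), np.linspace(-0.1, 0.1, 6))
sensors = np.column_stack([xs.ravel(), ys.ravel(), np.full(xs.size, 0.05)])
true_params = np.array([0.02, -0.01, 0.0, 0.01, 0.0, 0.02])
measured = dipole_bz(true_params, sensors)
print(np.round(locate_dipole(sensors, measured)[:3], 3))     # recovered dipole position
```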

  8. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking significant random variables using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from the test-of-hypothesis, to classify significant and insignificant random variables. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few significant random variables.
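
    The Python sketch below illustrates the sampling-based idea: draw random input samples, split the output sample by whether each input lies below or above its median, and compare the two halves with a mean shift and a CDF (Kolmogorov-Smirnov) distance; a two-sample KS test stands in for the acceptance limits. The split rule, the KS test and the toy model are illustrative assumptions, not the specific measures and limits derived in the paper.

```python
import numpy as np
from scipy.stats import ks_2samp

def sampling_sensitivities(model, sample_inputs, n=2000, alpha=0.05, seed=1):
    """Screen/rank inputs from one Monte Carlo sample of the model response."""
    rng = np.random.default_rng(seed)
    X = sample_inputs(rng, n)                       # (n, k) random input samples
    y = model(X)                                    # (n,) model responses
    out = []
    for j in range(X.shape[1]):
        below = y[X[:, j] <= np.median(X[:, j])]
        above = y[X[:, j] > np.median(X[:, j])]
        ks = ks_2samp(below, above)
        out.append({"input": j,
                    "mean_shift": float(above.mean() - below.mean()),
                    "cdf_distance": float(ks.statistic),
                    "significant": bool(ks.pvalue < alpha)})
    return out

# toy model: strong dependence on x0, weak on x1, none on x2
model = lambda X: 3.0 * X[:, 0] + 0.3 * X[:, 1]
sample_inputs = lambda rng, n: rng.normal(size=(n, 3))
for row in sampling_sensitivities(model, sample_inputs):
    print(row)
```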

  9. Local high precision 3D measurement based on line laser measuring instrument

    Science.gov (United States)

    Zhang, Renwei; Liu, Wei; Lu, Yongkang; Zhang, Yang; Ma, Jianwei; Jia, Zhenyuan

    2018-03-01

    To realize precision machining and assembly of parts, the geometrical dimensions of local assembly surfaces need to be strictly guaranteed. In this paper, a local high-precision three-dimensional measurement method based on a line laser measuring instrument is proposed to achieve highly accurate three-dimensional reconstruction of the surface. To address the problem that a two-dimensional line laser measuring instrument lacks high-precision information in one dimension, a local three-dimensional profile measuring system based on an accurate single-axis controller is proposed. First, a three-dimensional data compensation method based on a spatial multi-angle line laser measuring instrument is proposed to achieve high-precision measurement along the default axis. Through pretreatment of the 3D point cloud information, the measurement points can be restored accurately. Finally, a target spherical surface is measured by local three-dimensional scanning for accuracy verification. The experimental results show that this scheme can obtain the local three-dimensional information of the target quickly and accurately, compensates the errors in the laser scanner information, and improves the local measurement accuracy.

  10. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI. The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  11. pH measurements of FET-based (bio)chemical sensors using portable measurement system.

    Science.gov (United States)

    Voitsekhivska, T; Zorgiebel, F; Suthau, E; Wolter, K-J; Bock, K; Cuniberti, G

    2015-01-01

    In this study we demonstrate the sensing capabilities of a portable multiplex measurement system for FET-based (bio)chemical sensors with an integrated microfluidic interface. We conducted pH measurements with silicon nanoribbon FET-based sensors using different measurement procedures that are suitable for various applications. We have shown multiplexed measurements in aqueous medium for three different modes that are respectively specialized in fast data acquisition (constant drain current), calibration-less sensing (constant gate voltage) and providing full information content (sweeping mode). Our system therefore allows surface charge sensing for a wide range of applications and is easily adaptable for multiplexed sensing with novel FET-based (bio)chemical sensors.

  12. Analysis of rocket flight stability based on optical image measurement

    Science.gov (United States)

    Cui, Shuhua; Liu, Junhu; Shen, Si; Wang, Min; Liu, Jun

    2018-02-01

    Based on the abundant optical image measurement data available from optical tracking, this paper puts forward a method of evaluating rocket flight stability using measurements of the characteristics of the carrier rocket in the images. Building on this characterization of the carrier rocket, the attitude parameters of the rocket body in the coordinate system are calculated from the measurement data of multiple high-speed television cameras, and the parameters are then converted to the rocket body angle of attack; it is then assessed whether the rocket has good flight stability, i.e. whether it flies with a small angle of attack. The measurement method and the mathematical algorithm were run through a data processing test, in which the rocket flight stability state can be observed intuitively and failures of the guidance system can be identified visually.

  13. Three-dimensional hindfoot alignment measurements based on biplanar radiographs: comparison with standard radiographic measurements

    International Nuclear Information System (INIS)

    Sutter, Reto; Pfirrmann, Christian W.A.; Buck, Florian M.; Espinosa, Norman

    2013-01-01

    To establish a hindfoot alignment measurement technique based on low-dose biplanar radiographs and compare it with hindfoot alignment measurements on long axial view radiographs, which is the current reference standard. Long axial view radiographs and low-dose biplanar radiographs of a phantom consisting of a human foot skeleton embedded in acrylic glass (phantom A) and a plastic model of a human foot in three different hindfoot positions (phantoms B1-B3) were imaged in different foot positions (20° internal to 20° external rotation). Two independent readers measured hindfoot alignment on long axial view radiographs and performed 3D hindfoot alignment measurements based on biplanar radiographs on two different occasions. Time for three-dimensional (3D) measurements was determined. Intraclass correlation coefficients (ICC) were calculated. Hindfoot alignment measurements on long axial view radiographs were characterized by a large positional variation, with a range of 14°/13° valgus to 22°/27° varus (reader 1/2 for phantom A), whereas the range of 3D hindfoot alignment measurements was 7.3°/6.0° to 9.0°/10.5° varus (reader 1/2 for phantom A), with a mean and standard deviation of 8.1° ± 0.6°/8.7° ± 1.4°, respectively. Interobserver agreement was high (ICC = 0.926 for phantom A, and ICC = 0.886 for phantoms B1-B3), and agreement between different readouts was high (ICC = 0.895-0.995 for reader 1, and ICC = 0.987-0.994 for reader 2) for 3D measurements. Mean duration of 3D measurements was 84 ± 15/113 ± 15 s for reader 1/2. Three-dimensional hindfoot alignment measurements based on biplanar radiographs were independent of foot positioning during image acquisition and reader independent. In this phantom study, the 3D measurements were substantially more precise than the standard radiographic measurements. (orig.)

  14. A New Laser Based Approach for Measuring Atmospheric Greenhouse Gases

    Directory of Open Access Journals (Sweden)

    Jeremy Dobler

    2013-11-01

    Full Text Available In 2012, we developed a proof-of-concept system for a new open-path laser absorption spectrometer concept for measuring atmospheric CO2. The measurement approach utilizes high-reliability all-fiber-based, continuous-wave laser technology, along with a unique all-digital lock-in amplifier method that, together, enables simultaneous transmission and reception of multiple fixed wavelengths of light. This new technique, which utilizes very little transmitted energy relative to conventional lidar systems, provides high signal-to-noise (SNR measurements, even in the presence of a large background signal. This proof-of-concept system, tested in both a laboratory environment and a limited number of field experiments over path lengths of 680 m and 1,600 m, demonstrated SNR values >1,000 for received signals of ~18 picoWatts averaged over 60 s. A SNR of 1,000 is equivalent to a measurement precision of ±0.001 or ~0.4 ppmv. The measurement method is expected to provide new capability for automated monitoring of greenhouse gas at fixed sites, such as carbon sequestration facilities, volcanoes, the short- and long-term assessment of urban plumes, and other similar applications. In addition, this concept enables active measurements of column amounts from a geosynchronous orbit for a network of ground-based receivers/stations that would complement other current and planned space-based measurement capabilities.

  15. Measures of Competitive Intensity – Analysis Based on Literature Review

    Directory of Open Access Journals (Sweden)

    Dariusz Kwieciński

    2017-03-01

    Full Text Available Purpose: To systematize the existing approaches and tools used for measuring competitive intensity. Methodology: Systematic literature review along with critical literature review. Findings: Identification of two main approaches to measuring competition intensity: the first pertains to research based on experts' opinions and involves the use of questionnaires (primary sources), while the second is based on structural variables used with a variety of indexes (secondary sources). In addition, variables applied for the purpose of measuring the intensity of competition are divided into structural and behavioural. Research implications: Research implications are two-fold. Firstly, a distinction is made between various types of existing approaches to measuring competitive intensity. Secondly, research is carried out, inter alia, with regard to the actual object of certain measures, as opposed to their object stemming from commonly accepted definitions. Practical implications: The issue of measuring competition intensity occupies a prominent place in the discussion on the effectiveness of inter-organizational relationships. The findings outlined in this paper may help managers to develop/adopt the right approach supporting their strategic decisions. Originality: The paper provides a complex review of the existing methods and measures of competitive intensity. It systematizes recent knowledge about competitive intensity measurements.

  16. Biometric identification based on novel frequency domain facial asymmetry measures

    Science.gov (United States)

    Mitra, Sinjini; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-03-01

    In the modern world, the ever-growing need to ensure a system's security has spurred the growth of the newly emerging technology of biometric identification. The present paper introduces a novel set of facial biometrics based on quantified facial asymmetry measures in the frequency domain. In particular, we show that these biometrics work well for face images showing expression variations and have the potential to do so in presence of illumination variations as well. A comparison of the recognition rates with those obtained from spatial domain asymmetry measures based on raw intensity values suggests that the frequency domain representation is more robust to intra-personal distortions and is a novel approach for performing biometric identification. In addition, some feature analysis based on statistical methods comparing the asymmetry measures across different individuals and across different expressions is presented.

  17. Phase Difference Measurement Method Based on Progressive Phase Shift

    Directory of Open Access Journals (Sweden)

    Min Zhang

    2018-06-01

    Full Text Available This paper proposes a method for phase difference measurement based on the principle of progressive phase shift (PPS). A phase difference measurement system based on PPS and implemented in an FPGA chip is proposed and tested. In the realized system, a fully programmable delay line (PDL) is constructed, which provides accurate and stable delay, benefitting from the feedback structure of the control module. The control module calibrates the delay according to process, voltage and temperature (PVT) variations. Furthermore, a modified method based on double PPS is incorporated to improve the resolution. The obtained resolution is 25 ps. Moreover, to improve the resolution further, the proposed method is implemented on the 20 nm Xilinx Kintex UltraScale platform, and test results indicate that the obtained measurement error and clock synchronization error are within the range of ±5 ps.

  18. Modern gas-based temperature and pressure measurements

    CERN Document Server

    Pavese, Franco

    2013-01-01

    This 2nd edition volume of Modern Gas-Based Temperature and Pressure Measurements follows the first publication in 1992. It collects a much larger set of information, reference data, and bibliography in temperature and pressure metrology of gaseous substances, including the physical-chemical issues related to gaseous substances. The book provides solutions to practical applications where gases are used in different thermodynamic conditions. Modern Gas-Based Temperature and Pressure Measurements, 2nd edition is the only comprehensive survey of methods for pressure measurement in gaseous media used in the medium-to-low pressure range closely connected with thermometry. It assembles current information on thermometry and manometry that involve the use of gaseous substances which are likely to be valid methods for the future. As such, it is an important resource for the researcher. This edition is updated through the very latest scientific and technical developments of gas-based temperature and pressure measurem...

  19. Generalized phase retrieval algorithm based on information measures

    OpenAIRE

    Shioya, Hiroyuki; Gohara, Kazutoshi

    2006-01-01

    An iterative phase retrieval algorithm based on the maximum entropy method (MEM) is presented. Introducing a new generalized information measure, we derive a novel class of algorithms which includes the conventionally used error reduction algorithm and a MEM-type iterative algorithm which is presented for the first time. These different phase retrieval methods are unified on the basis of the framework of information measures used in information theory.
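
    For reference, the conventional error-reduction algorithm that the generalized information-measure framework contains as a special case can be sketched in a few lines of Python/NumPy. The support constraint and toy object below are illustrative, and the MEM-type generalization itself is not reproduced here.

```python
import numpy as np

def error_reduction(fourier_magnitude, support, n_iter=200, seed=0):
    """Conventional error-reduction phase retrieval: alternate between the measured
    Fourier magnitude and an object-domain support/non-negativity constraint."""
    rng = np.random.default_rng(seed)
    g = rng.random(fourier_magnitude.shape) * support          # random start inside the support
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = fourier_magnitude * np.exp(1j * np.angle(G))        # impose the measured magnitude
        g = np.real(np.fft.ifft2(G))
        g = np.where((support > 0) & (g > 0), g, 0.0)           # impose object-domain constraints
    return g

# toy demo: recover a small object from its Fourier magnitude alone
obj = np.zeros((64, 64))
obj[28:36, 30:34] = 1.0
support = np.zeros_like(obj)
support[24:40, 26:38] = 1.0
recovered = error_reduction(np.abs(np.fft.fft2(obj)), support)
print(round(float(np.abs(recovered - obj).mean()), 4))
```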

  20. A web-based tool for ranking landslide mitigation measures

    Science.gov (United States)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether the mitigation measures addressed the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing hazard and non-structural mitigation measures, reducing either the hazard or the consequences (or vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying surface water regime - surface drainage; measures mo¬difying groundwater regime - deep drainage; measured modifying the mechanical charac¬teristics of unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to compe¬tent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, reducing either the hazard or the consequences: early warning systems; restricting or discouraging construction activities; increasing resistance or coping capacity of elements at risk; relocation of elements at risk; sharing of risk through insurance. The measures are described in the toolbox with fact sheets providing a

  1. Predictive Software Measures based on Z Specifications - A Case Study

    Directory of Open Access Journals (Sweden)

    Andreas Bollin

    2012-07-01

    Full Text Available Estimating the effort and quality of a system is a critical step at the beginning of every software project. It is necessary to have reliable ways of calculating these measures, and it is even better when the calculation can be done as early as possible in the development life-cycle. With this in mind, metrics for formal specifications are examined with a view to their correlations with complexity- and quality-based code measures. A case study, based on a Z specification and its implementation in Ada, analyzes the practicability of these metrics as predictors.

  2. Portable audio electronics for impedance-based measurements in microfluidics

    International Nuclear Information System (INIS)

    Wood, Paul; Sinton, David

    2010-01-01

    We demonstrate the use of audio electronics-based signals to perform on-chip electrochemical measurements. Cell phones and portable music players are examples of consumer electronics that are easily operated and are ubiquitous worldwide. Audio output (play) and input (record) signals are voltage based and contain frequency and amplitude information. A cell phone, laptop soundcard and two compact audio players are compared with respect to frequency response; the laptop soundcard provides the most uniform frequency response, while the cell phone performance is found to be insufficient. The audio signals in the common portable music players and laptop soundcard operate in the range of 20 Hz to 20 kHz and are found to be applicable, as voltage input and output signals, to impedance-based electrochemical measurements in microfluidic systems. Validated impedance-based measurements of concentration (0.1–50 mM), flow rate (2–120 µL min⁻¹) and particle detection (32 µm diameter) are demonstrated. The prevailing lossless WAV audio file format is found to be suitable for data transmission to and from external sources, such as a centralized lab, and the cost of all hardware (in addition to audio devices) is ∼10 USD. The utility demonstrated here, in combination with the ubiquitous nature of portable audio electronics, presents new opportunities for impedance-based measurements in portable microfluidic systems. (technical note)

  3. Principal Component Analysis Based Measure of Structural Holes

    Science.gov (United States)

    Deng, Shiguo; Zhang, Wenqing; Yang, Huijie

    2013-02-01

    Based upon principal component analysis, a new measure called the compressibility coefficient is proposed to evaluate structural holes in networks. This measure incorporates a new effect from identical patterns in networks. It is found that the compressibility coefficient for Watts-Strogatz small-world networks increases monotonically with the rewiring probability and saturates to that for the corresponding shuffled networks, while the compressibility coefficient for extended Barabasi-Albert scale-free networks decreases monotonically with the preferential effect and is significantly large compared with that for the corresponding shuffled networks. This measure is helpful in diverse research fields for evaluating the global efficiency of networks.
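
    The abstract does not give the exact formula, so the Python sketch below encodes one plausible, clearly assumed reading: run PCA on the centered adjacency matrix and take the fraction of variance captured by the leading components, so that networks with many identical connection patterns compress into few components. Treat it as a hypothetical illustration, not the authors' definition.

```python
import numpy as np
import networkx as nx

def compressibility_coefficient(G, k=None):
    """Hypothetical PCA-based compressibility: share of adjacency-matrix variance
    captured by the top-k principal components (assumed reading of the measure)."""
    A = nx.to_numpy_array(G)
    A = A - A.mean(axis=0, keepdims=True)              # column-center before PCA
    var = np.linalg.svd(A, compute_uv=False) ** 2      # PCA eigenvalues via SVD
    if k is None:
        k = max(1, A.shape[0] // 10)                   # keep the top 10% of components
    return float(var[:k].sum() / var.sum())

ws = nx.watts_strogatz_graph(200, 6, 0.1, seed=1)
ba = nx.barabasi_albert_graph(200, 3, seed=1)
print(round(compressibility_coefficient(ws), 3), round(compressibility_coefficient(ba), 3))
```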

  4. Patch near field acoustic holography based on particle velocity measurements

    DEFF Research Database (Denmark)

    Zhang, Yong-Bin; Jacobsen, Finn; Bi, Chuan-Xing

    2009-01-01

    Patch near field acoustic holography (PNAH) based on sound pressure measurements makes it possible to reconstruct the source field near a source by measuring the sound pressure at positions on a surface. that is comparable in size to the source region of concern. Particle velocity is an alternative...... examines the use of particle velocity as the input of PNAH. Because the particle velocity decays faster toward the edges of the measurement aperture than the pressure does and because the wave number ratio that enters into the inverse propagator from pressure to velocity amplifies high spatial frequencies...

  5. Confidence bounds of recurrence-based complexity measures

    International Nuclear Information System (INIS)

    Schinkel, Stefan; Marwan, N.; Dimigen, O.; Kurths, J.

    2009-01-01

    In the recent past, recurrence quantification analysis (RQA) has gained an increasing interest in various research areas. The complexity measures the RQA provides have been useful in describing and analysing a broad range of data. It is known to be rather robust to noise and nonstationarities. Yet, one key question in empirical research concerns the confidence bounds of measured data. In the present Letter we suggest a method for estimating the confidence bounds of recurrence-based complexity measures. We study the applicability of the suggested method with model and real-life data.

  6. Bread Water Content Measurement Based on Hyperspectral Imaging

    DEFF Research Database (Denmark)

    Liu, Zhi; Møller, Flemming

    2011-01-01

    Water content is one of the most important properties of bread for tasting assessment or storage monitoring. Traditional bread water content measurement methods are mostly performed manually, which is destructive and time consuming. This paper proposes an automated water content measurement... for bread quality based on near-infrared hyperspectral imaging against the conventional manual loss-in-weight method. For this purpose, hyperspectral component unmixing is used for measuring the water content quantitatively. A definition of the bread water content index is also presented...

  7. Tethered balloon-based measurements of meteorological variables and aerosols

    Science.gov (United States)

    Sentell, R. J.; Storey, R. W.; Chang, J. J. C.; Jacobsen, S. J.

    1976-01-01

    Tethered balloon based measurements of the vertical distributions of temperature, humidity, wind speed, and aerosol concentrations were taken over a 4-hour period beginning at sunrise on June 29, 1976, at Wallops Island, Virginia. Twelve consecutive profiles of each variable were obtained from the ground to about 500 meters. These measurements were made in conjunction with a noise propagation study on the remotely arrayed acoustic range (ROMAAR) at Wallops Flight Center. An organized listing of these vertical soundings is presented. The tethered balloon system configuration utilized for these measurements is described.

  8. Image based method for aberration measurement of lithographic tools

    Science.gov (United States)

    Xu, Shuang; Tao, Bo; Guo, Yongxing; Li, Gongfa

    2018-01-01

    Information on the lens aberrations of lithographic tools is important as they directly affect the intensity distribution in the image plane. Zernike polynomials are commonly used for a mathematical description of lens aberrations. Due to their advantage of lower cost and easier implementation, image based measurement techniques have been widely used. Lithographic tools are typically partially coherent systems that can be described by a bilinear model, which entails time-consuming calculations and does not lend itself to a simple and intuitive relationship between lens aberrations and the resulting images. Previous methods for retrieving lens aberrations in such partially coherent systems involve through-focus image measurements and time-consuming iterative algorithms. In this work, we propose a method for aberration measurement in lithographic tools which only requires measuring two images of the intensity distribution. Two linear formulations are derived in matrix form that directly relate the measured images to the unknown Zernike coefficients. Consequently, an efficient non-iterative solution is obtained.
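
    The essence of the non-iterative step is solving a linear system relating the two measured images to the Zernike coefficients. The Python sketch below assumes the sensitivity matrix linking them is already available (here it is random, standing in for one obtained offline from simulation), so both the matrix and the data are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_zernike = 500, 9

# Hypothetical linear model: stacked intensity differences of the two measured
# images satisfy d = A @ c, with c the unknown Zernike coefficients and A a
# sensitivity matrix assumed to be precomputed (random here, as a stand-in).
A = rng.normal(size=(2 * n_pixels, n_zernike))
c_true = np.array([0.02, -0.01, 0.005, 0.0, 0.015, 0.0, -0.008, 0.0, 0.003])
d = A @ c_true + rng.normal(scale=1e-3, size=2 * n_pixels)      # two noisy images, stacked

c_est, *_ = np.linalg.lstsq(A, d, rcond=None)                   # non-iterative retrieval
print(np.round(c_est, 3))
```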

  9. A complex network-based importance measure for mechatronics systems

    Science.gov (United States)

    Wang, Yanhui; Bi, Lifeng; Lin, Shuai; Li, Man; Shi, Hao

    2017-01-01

    In view of the negative impact of functional dependency, this paper attempts to provide an alternative importance measure called Improved-PageRank (IPR) for measuring the importance of components in mechatronic systems. IPR is a meaningful extension of the centrality measures in complex networks, which considers the usage reliability of components and the functional dependency between components to make importance measures more useful. Our work makes two important contributions. First, this paper integrates the literature on mechatronic architecture and complex network theory to define the component network. Second, based on the notion of the component network, a meaningful IPR is brought into the identification of important components. In addition, the IPR component importance measures, and an algorithm to perform stochastic ordering of components due to the time-varying nature of the usage reliability of components and the functional dependency between components, are illustrated with a component network of a bogie system that consists of 27 components.
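
    The sketch below illustrates the flavor of such a measure with networkx: a small hypothetical component network whose edges follow functional dependency and whose random walk is biased by component unreliability. The bogie component names, weights and the personalization rule are assumptions made for illustration; the paper's IPR formula is not reproduced.

```python
import networkx as nx

# Hypothetical component network: nodes carry a usage-reliability attribute and a
# directed edge A -> B means A's function depends on B.
G = nx.DiGraph()
G.add_nodes_from([
    ("wheelset",   {"reliability": 0.98}),
    ("axle_box",   {"reliability": 0.95}),
    ("brake_unit", {"reliability": 0.90}),
    ("frame",      {"reliability": 0.99}),
])
G.add_weighted_edges_from([
    ("brake_unit", "wheelset", 1.0),
    ("wheelset", "axle_box", 1.0),
    ("axle_box", "frame", 0.5),
    ("wheelset", "frame", 0.5),
])

# Bias the random walk by unreliability so that components that are both depended
# upon and failure-prone rank higher (one possible IPR-style weighting, not the paper's).
personalization = {n: 1.0 - d["reliability"] for n, d in G.nodes(data=True)}
scores = nx.pagerank(G, alpha=0.85, personalization=personalization, weight="weight")
for comp, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{comp:10s} {s:.3f}")
```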

  10. Drone based measurement system for radiofrequency exposure assessment.

    Science.gov (United States)

    Joseph, Wout; Aerts, Sam; Vandenbossche, Matthias; Thielens, Arno; Martens, Luc

    2016-03-10

    For the first time, a method to assess radiofrequency (RF) electromagnetic field (EMF) exposure of the general public in real environments with a true free-space antenna system is presented. Using lightweight electronics and multiple antennas placed on a drone, it is possible to perform exposure measurements. This technique will enable researchers to measure three-dimensional RF-EMF exposure patterns accurately in the future and at locations currently difficult to access. A measurement procedure and appropriate measurement settings have been developed. As an application, outdoor measurements are performed as a function of height up to 60 m for Global System for Mobile Communications (GSM) 900 MHz base station exposure. Bioelectromagnetics. © 2016 Wiley Periodicals, Inc.

  11. Forecasting method in multilateration accuracy based on laser tracker measurement

    International Nuclear Information System (INIS)

    Aguado, Sergio; Santolaria, Jorge; Samper, David; José Aguilar, Juan

    2017-01-01

    Multilateration based on a laser tracker (LT) requires the measurement of a set of points from three or more positions. Although the LT's angular information is not used, multilateration produces a volume of measurement uncertainty. This paper presents two new coefficients from which to determine, before performing the necessary measurements, whether the measurement of a set of points will improve or worsen the accuracy of the multilateration results, avoiding unnecessary measurements and reducing the time and economic cost required. The first, a laser-tracker-specific measurement coefficient (MC_LT), is unique for each laser tracker; it determines the relationship between the radial and angular laser tracker measurement noise. The second coefficient, β, is related to the specific measurement conditions: it depends on the spatial angle α between the laser tracker positions and its effect on error reduction. Both parameters MC_LT and β are linked in the error reduction limits. Besides these, a new methodology to determine the multilateration error reduction limit for an ideal laser tracker distribution and for a random one is presented. It provides general rules and advice derived from synthetic tests, which are validated through a real test carried out on a coordinate measuring machine. (paper)

  12. Measurement channel of neutron flow based on software

    International Nuclear Information System (INIS)

    Rivero G, T.; Benitez R, J. S.

    2008-01-01

    The measurement of thermal power in nuclear reactors is based mainly on the measurement of the neutron flux. The neutrons present in the reactor core are released by the fission of uranium-235; once moderated, these neutrons induce new fissions, a process known as the chain reaction. The power at which a nuclear reactor operates is therefore proportional to the number of fissions and, since these depend on the released neutrons, also proportional to the number of neutrons present. The thermal power of a reactor is measured with instruments called nuclear channels. At low power (source range), these channels count the individual detected neutrons, whereas at medium and high power they measure the electrical current, or the fluctuations of that current, generated by the fission neutrons in ionization chambers specially designed to detect neutrons. In TRIGA reactors, the neutron flux measurement channels have used discrete digital electronics for some decades now. Recently, new technological tools have emerged that allow new versions of nuclear channels to be developed in a simple and compact way. The present work describes the development of a nuclear channel for TRIGA reactors based on the use of the correlated signal of a fission chamber over a wide range. This new measurement channel uses a high-speed data acquisition card and software-based data processing; installed on a computer, it constitutes a virtual instrument that displays in real time, graphically and in a form understandable to the operator, the power at which the nuclear reactor is operating. Being software based, this system offers greater versatility for making changes to the signal processing and power monitoring algorithms. The experimental tests of neutron power measurement show a reliable performance over seven decades of power, with a
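
    As a purely illustrative Python sketch of software-based wide-range processing of a digitized fission-chamber signal (pulse counting at low power, mean-square "Campbelling" processing at higher power), the snippet below uses invented calibration constants and a synthetic signal; it is not the instrument described in the record.

```python
import numpy as np

def power_estimate(signal, fs, k_pulse=1.0e-8, k_msv=2.0e3, threshold=0.05):
    """Indicative reactor power (W) from a sampled fission-chamber signal.
    k_pulse and k_msv are invented calibration constants for illustration only."""
    crossings = (signal[:-1] < threshold) & (signal[1:] >= threshold)
    count_rate = crossings.sum() * fs / signal.size        # detected pulses per second
    msv = float(np.mean((signal - signal.mean()) ** 2))    # mean-square fluctuation
    # switch to the Campbelling estimate once pulses start to pile up
    return k_msv * msv if count_rate > 1.0e5 else k_pulse * count_rate

fs = 1.0e6                                                  # 1 MS/s acquisition, as an example
rng = np.random.default_rng(0)
signal = 0.02 * rng.normal(size=int(0.01 * fs))             # 10 ms of noise-like chamber signal
print(power_estimate(signal, fs))
```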

  13. Integrated method for the measurement of trace nitrogenous atmospheric bases

    Directory of Open Access Journals (Sweden)

    D. Key

    2011-12-01

    Full Text Available Nitrogenous atmospheric bases are thought to play a key role in the global nitrogen cycle, but their sources, transport, and sinks remain poorly understood. Of the many methods available to measure such compounds in ambient air, few meet the current need of being applicable to the complete range of potential analytes and fewer still are convenient to implement using instrumentation that is standard to most laboratories. In this work, an integrated approach to measuring trace, atmospheric, gaseous nitrogenous bases has been developed and validated. The method uses a simple acid scrubbing step to capture and concentrate the bases as their phosphite salts, which then are derivatized and analyzed using GC/MS and/or LC/MS. The advantages of both techniques in the context of the present measurements are discussed. The approach is sensitive, selective, reproducible, as well as convenient to implement and has been validated for different sampling strategies. The limits of detection for the families of tested compounds are suitable for ambient measurement applications (e.g., methylamine, 1 pptv; ethylamine, 2 pptv; morpholine, 1 pptv; aniline, 1 pptv; hydrazine, 0.1 pptv; methylhydrazine, 2 pptv, as supported by field measurements in an urban park and in the exhaust of on-road vehicles.

  14. Resource management in Diffserv measurement-based admission control PHR

    NARCIS (Netherlands)

    Westberg, L.; Heijenk, Geert; Karagiannis, Georgios; Oosthoek, S.; Partain, D.; Rexhepi, Vlora; Szabo, R.; Wallentin, P.; El Allali, H.

    2002-01-01

    The purpose of this draft is to present the Resource Management in Diffserv (RMD) Measurement-Based Admission Control (RIMA) Per Hop Reservation (PHR) protocol. The RIMA PHR protocol is used on a per-hop basis in a Differentiated Services (Diffserv) domain and extends the Diffserv Per Hop Behavior

  15. Assessing Children's Writing Products: The Role of Curriculum Based Measures

    Science.gov (United States)

    Dockrell, Julie E.; Connelly, Vincent; Walter, Kirsty; Critten, Sarah

    2015-01-01

    The assessment of children's writing raises technical and practical challenges. In this paper we examine the potential use of a curriculum based measure for writing (CBM-W) to assess the written texts of pupils in Key Stage 2 (M age 107 months, range 88 to 125). Two hundred and thirty six Year three, five and six pupils completed a standardized…

  16. Hydrogel-based sensor for CO2 measurements

    NARCIS (Netherlands)

    Herber, S.; Olthuis, Wouter; Bergveld, Piet; van den Berg, Albert

    2004-01-01

    A hydrogel-based sensor is presented for CO2 measurements. The sensor consists of a pressure sensor and porous silicon cover. A pH-sensitive hydrogel is confined between the two parts. Furthermore the porous cover contains a bicarbonate solution and a gaspermeable membrane. CO2 reacts with the

  17. Functional Size Measurement applied to UML-based user requirements

    NARCIS (Netherlands)

    van den Berg, Klaas; Dekkers, Ton; Oudshoorn, Rogier; Dekkers, T.

    There is a growing interest in applying standardized methods for Functional Size Measurement (FSM) to Functional User Requirements (FUR) based on models in the Unified Modelling Language (UML). No consensus exists on this issue. We analyzed the demands that FSM places on FURs. We propose a

  18. Noninvasive microbubble-based pressure measurements: a simulation study

    NARCIS (Netherlands)

    Postema, Michiel; Postema, M.A.B.; Bouakaz, Ayache; de Jong, N.

    2004-01-01

    This paper describes a noninvasive method to measure local hydrostatic pressures in fluid filled cavities. The method is based on the disappearance time of a gas bubble, as the disappearance time is related to the hydrostatic pressure. When a bubble shrinks, its response to ultrasound changes. From

  19. Metrology of human-based and other qualitative measurements

    Science.gov (United States)

    Pendrill, Leslie; Petersson, Niclas

    2016-09-01

    The metrology of human-based and other qualitative measurements is in its infancy—concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors, such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if relations between distances on different portions of scales are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory, the perceptive identification and choice paradigms of psychophysics. The Rasch invariant measure psychometric GLM approach in particular enables a proper treatment of ordinal data; a clear separation of probe and item attribute estimates; simple expressions for instrument sensitivity; etc. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities of establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for knowledge challenge, for example, a particular concept in understanding physics or for product quality of a certain health care service. Multivariate methods, such as Principal Component

  20. Metrology of human-based and other qualitative measurements

    International Nuclear Information System (INIS)

    Pendrill, Leslie; Petersson, Niclas

    2016-01-01

    The metrology of human-based and other qualitative measurements is in its infancy—concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors, such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if relations between distances on different portions of scales are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory, the perceptive identification and choice paradigms of psychophysics. The Rasch invariant measure psychometric GLM approach in particular enables a proper treatment of ordinal data; a clear separation of probe and item attribute estimates; simple expressions for instrument sensitivity; etc. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities of establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for knowledge challenge, for example, a particular concept in understanding physics or for product quality of a certain health care service. Multivariate methods, such as Principal Component

  1. Depth Measurement Based on Infrared Coded Structured Light

    Directory of Open Access Journals (Sweden)

    Tong Jia

    2014-01-01

    Full Text Available Depth measurement is a challenging problem in computer vision research. In this study, we first design a new grid pattern and develop a sequence coding and decoding algorithm to process the pattern. Second, we propose a linear fitting algorithm to derive the linear relationship between the object depth and pixel shift. Third, we obtain depth information on an object based on this linear relationship. Moreover, 3D reconstruction is implemented based on the Delaunay triangulation algorithm. Finally, we utilize the regularity of the error curves to correct the systematic errors and improve the measurement accuracy. The experimental results show that the accuracy of depth measurement is related to the step length of the moving object.
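
    The core calibration step, fitting the linear relationship between object depth and pixel shift and then inverting it for new measurements, can be sketched in a few lines of NumPy; the calibration numbers below are invented for illustration.

```python
import numpy as np

# Illustrative calibration data: known target depths (mm) and the pixel shift of
# the infrared coded pattern measured at each depth.
depths_mm = np.array([400.0, 450.0, 500.0, 550.0, 600.0])
shift_px = np.array([52.1, 47.9, 43.8, 39.7, 35.6])

slope, intercept = np.polyfit(shift_px, depths_mm, 1)       # linear fit: depth = a*shift + b

def depth_from_shift(s):
    """Estimate object depth (mm) from a measured pixel shift using the fitted line."""
    return slope * s + intercept

print(round(depth_from_shift(45.0), 1))                     # depth estimate for a new shift
```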

  2. Visual Peoplemeter: A Vision-based Television Audience Measurement System

    Directory of Open Access Journals (Sweden)

    SKELIN, A. K.

    2014-11-01

    Full Text Available The visual peoplemeter is a vision-based measurement system that objectively evaluates attentive behavior for TV audience rating, thus offering a solution to some of the drawbacks of current manual-logging peoplemeters. In this paper, some limitations of the current audience measurement system are reviewed and a novel vision-based system aiming at passive metering of viewers is prototyped. The system uses a camera mounted on a television as a sensing modality and applies advanced computer vision algorithms to detect and track a person, and to recognize attentional states. Feasibility of the system is evaluated on a secondary dataset. The results show that the proposed system can analyze a viewer's attentive behavior, therefore enabling passive estimates of relevant audience measurement categories.

  3. Observer-based Coal Mill Control using Oxygen Measurements

    DEFF Research Database (Denmark)

    Andersen, Palle; Bendtsen, Jan Dimon; S., Tom

    2006-01-01

    This paper proposes a novel approach to coal flow estimation in pulverized coal mills, which utilizes measurements of oxygen content in the flue gas. Pulverized coal mills are typically not equipped with sensors that detect the amount of coal injected into the furnace. This makes control of the coal flow difficult, causing stability problems and limiting the plant's load following capabilities. To alleviate this problem without having to rely on expensive flow measurement equipment, a novel observer-based approach is investigated. A Kalman filter based on measurements of combustion air flow led into the furnace and oxygen concentration in the flue gas is designed to estimate the actual coal flow injected into the furnace. With this estimate, it becomes possible to close an inner loop around the coal mill itself, thus giving a better disturbance rejection capability. The approach is validated against...
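
    As a rough illustration of the observer idea, the sketch below runs a generic one-dimensional Kalman filter that tracks an unmeasured flow from a noisy derived measurement. The random-walk state model, the noise variances and the example data are placeholder assumptions, not the boiler model identified in the paper.

        import numpy as np

        def kalman_1d(measurements, q=1e-3, r=0.05, x0=0.0, p0=1.0):
            """Scalar Kalman filter with a random-walk state model.

            measurements: derived coal-flow readings (e.g. inferred from oxygen
            concentration and combustion air flow); q and r are the assumed
            process and measurement noise variances.
            """
            x, p = x0, p0
            estimates = []
            for z in measurements:
                p = p + q                 # predict (state assumed to drift slowly)
                k = p / (p + r)           # Kalman gain
                x = x + k * (z - x)       # update with the new measurement
                p = (1.0 - k) * p
                estimates.append(x)
            return np.array(estimates)

        # Hypothetical noisy readings around a true coal flow of 10 kg/s
        rng = np.random.default_rng(0)
        readings = 10.0 + rng.normal(0.0, 0.2, size=100)
        print(kalman_1d(readings)[-1])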

  4. Steering with big words: articulating ideographs in research programs

    NARCIS (Netherlands)

    Bos, Colette; Walhout, Bart; Walhout, Bart; Peine, Alexander; van Lente, Harro

    2014-01-01

    Nowadays, science should address societal challenges, such as ‘sustainability’, or ‘responsible research and innovation’. This emerging form of steering toward broad and generic goals involves the use of ‘big words’: encompassing concepts that are uncontested themselves, but that allow for multiple

  5. Steering with big words: articulating ideographs in nanotechnology

    NARCIS (Netherlands)

    Bos, Colette; Walhout, Albert; Peine, Alex; van Lente, Harro

    2014-01-01

    Nowadays, science should address societal challenges, such as ‘sustainability’, or ‘responsible research and innovation’. This emerging form of steering toward broad and generic goals involves the use of ‘big words’: encompassing concepts that are uncontested themselves, but that allow for multiple

  6. Microcontroller Power Consumption Measurement Based on PSoC

    Directory of Open Access Journals (Sweden)

    S. P. Janković

    2016-06-01

    Full Text Available Microcontrollers are often used as central processing elements in embedded systems. Because of different sleep and performance modes that microcontrollers support, their power consumption may have a high dynamic range, over 100 dB. In this paper, a data acquisition (DAQ system for measuring and analyzing the power consumption of microcontrollers is presented. DAQ system consists of a current measurement circuit using potentiostat technique, a DAQ device based on system on chip PSoC 5LP and Python PC program for the analysis, storage and visualization of measured data. Both Successive Approximation Register (SAR and Delta-Sigma (DS ADCs contained in the PSoC 5LP are used for measuring voltage drop across the shunt resistor. SAR ADC samples data at a 10 times higher rate than DS ADC, so the input range of DS ADC can be adjusted based on data measured by SAR ADC, thus enabling the extension of current measuring range by 28%. Implemented DAQ device is connected with a computer through a USB port and tested with developed Python PC program.

  7. ICF-based classification and measurement of functioning.

    Science.gov (United States)

    Stucki, G; Kostanjsek, N; Ustün, B; Cieza, A

    2008-09-01

    If we aim towards a comprehensive understanding of human functioning and the development of comprehensive programs to optimize functioning of individuals and populations, we need to develop suitable measures. The approval of the International Classification of Functioning, Disability and Health (ICF) in 2001 by the 54th World Health Assembly as the first universally shared model and classification of functioning, disability and health marks, therefore, an important step in the development of measurement instruments and ultimately for our understanding of functioning, disability and health. The acceptance and use of the ICF as a reference framework and classification has been facilitated by its development in a worldwide, comprehensive consensus process and the increasing evidence regarding its validity. However, the broad acceptance and use of the ICF as a reference framework and classification will also depend on the resolution of conceptual and methodological challenges relevant for the classification and measurement of functioning. This paper therefore describes first how the ICF categories can serve as building blocks for the measurement of functioning and then the current state of the development of ICF-based practical tools and international standards such as the ICF Core Sets. Finally it illustrates how to map the world of measures to the ICF and vice versa and the methodological principles relevant for the transformation of information obtained with a clinical test or a patient-oriented instrument to the ICF as well as the development of ICF-based clinical and self-reported measurement instruments.

  8. Computer Vision Based Measurement of Wildfire Smoke Dynamics

    Directory of Open Access Journals (Sweden)

    BUGARIC, M.

    2015-02-01

    Full Text Available This article presents a novel method for measurement of wildfire smoke dynamics based on computer vision and augmented reality techniques. The aspect of smoke dynamics is an important feature in video smoke detection that could distinguish smoke from visually similar phenomena. However, most of the existing smoke detection systems are not capable of measuring the real-world size of the detected smoke regions. Using computer vision and GIS-based augmented reality, we measure the real dimensions of smoke plumes, and observe the change in size over time. The measurements are performed on offline video data with known camera parameters and location. The observed data is analyzed in order to create a classifier that could be used to eliminate certain categories of false alarms induced by phenomena with different dynamics than smoke. We carried out an offline evaluation where we measured the improvement in the detection process achieved using the proposed smoke dynamics characteristics. The results show a significant increase in algorithm performance, especially in terms of reducing false alarms rate. From this it follows that the proposed method for measurement of smoke dynamics could be used to improve existing smoke detection algorithms, or taken into account when designing new ones.

  9. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors in interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement and, in addition, to determine if it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed values of inter- and intra-judge agreement greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements with an established criterion), intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
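
    To make the reported agreement figures concrete, the short sketch below computes inter-judge percentage agreement and Cohen's kappa for two judges rating the same intervals as stuttered or fluent. The interval judgements are invented, and these two statistics are only examples of the reliability measures discussed in the reviewed studies.

        import numpy as np

        # Hypothetical 5 s interval judgements by two judges (1 = stuttered, 0 = fluent)
        judge_a = np.array([1, 0, 0, 1, 1, 0, 1, 0, 0, 1])
        judge_b = np.array([1, 0, 1, 1, 1, 0, 1, 0, 0, 0])

        # Inter-judge percentage agreement
        p_observed = np.mean(judge_a == judge_b)

        # Cohen's kappa: agreement corrected for chance
        p_chance = (judge_a.mean() * judge_b.mean()
                    + (1 - judge_a.mean()) * (1 - judge_b.mean()))
        kappa = (p_observed - p_chance) / (1 - p_chance)

        print(f"agreement = {p_observed:.0%}, kappa = {kappa:.2f}")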

  10. A measurement-based performability model for a multiprocessor system

    Science.gov (United States)

    Ilsueh, M. C.; Iyer, Ravi K.; Trivedi, K. S.

    1987-01-01

    A measurement-based performability model based on real error data collected on a multiprocessor system is described. Model development from the raw error data to the estimation of cumulative reward is described. Both normal and failure behavior of the system are characterized. The measured data show that the holding times in key operational and failure states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different failure types and recovery procedures.

  11. An Energy-Based Similarity Measure for Time Series

    Directory of Open Access Journals (Sweden)

    Pierre Brunagel

    2007-11-01

    Full Text Available A new similarity measure, called SimilB, for time series analysis, based on the cross-ΨB-energy operator (2004), is introduced. ΨB is a nonlinear measure which quantifies the interaction between two time series. Compared to Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series using the first and second derivatives of the time series. SimilB is well suited for both nonstationary and stationary time series and particularly those presenting discontinuities. Some new properties of ΨB are presented. Particularly, we show that ΨB as a similarity measure is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset and compared to the CC and the ED measures.
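
    A discrete sketch of the idea is shown below: the interaction between two series is approximated from their first and second numerical derivatives, and a normalized score is formed from the resulting cross energy. The operator form psi_b(x, y) = x'y' - (x y'' + x'' y)/2 and the normalization used here are assumptions made for illustration; the published SimilB definition may differ in detail.

        import numpy as np

        def cross_psi_b(x, y):
            """Discrete cross-energy of two equally sampled series (assumed operator form)."""
            dx, dy = np.gradient(x), np.gradient(y)
            ddx, ddy = np.gradient(dx), np.gradient(dy)
            return dx * dy - 0.5 * (x * ddy + ddx * y)

        def similb_like(x, y):
            """Normalized cross-energy score; larger values indicate more similar dynamics."""
            num = np.sum(np.abs(cross_psi_b(x, y)))
            den = np.sqrt(np.sum(np.abs(cross_psi_b(x, x))) * np.sum(np.abs(cross_psi_b(y, y))))
            return num / den

        t = np.linspace(0.0, 1.0, 500)
        x = np.sin(2 * np.pi * 5 * t)
        y = np.sin(2 * np.pi * 5 * t + 0.2)                      # time-shifted copy of x
        noise = np.random.default_rng(1).normal(size=t.size)     # unrelated series
        print(similb_like(x, y), similb_like(x, noise))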

  12. Temperature measuring system based on ADuC812 MCU

    International Nuclear Information System (INIS)

    Zhou Dongmei; Ge Liangquan; Cheng Feng; Li Jinfeng

    2009-01-01

    This paper introduces a temperature measuring system which is composed of a single-chip microcomputer ADuC812, a new-type digital temperature sensor TMP100 and an LED display circuit, and is based on the I2C bus. The I2C bus, which was invented by the PHILIPS company, needs only two signal lines (SDA, SCL) and can realize perfect duplex synchronous data transmission. Using the method of hardware setting of the device address can completely avoid the disadvantages of device selection addressing, and thus gives the hardware system a simpler and more flexible extension method. The key part of the system is the single-chip microcomputer ADuC812, which is compatible with the MCS-51 and is made by the AD company in America. The software is compiled in 8051 assembly language. The data acquisition single-chip microcomputer measurement system with the I2C bus fully shows the features of flexibility, precision and high integration. A high-accuracy measurement method is proposed to realize environmental temperature measurement. (authors)

  13. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  14. High Precision Infrared Temperature Measurement System Based on Distance Compensation

    Directory of Open Access Journals (Sweden)

    Chen Jing

    2017-01-01

    Full Text Available To meet the need of real-time remote monitoring of human body surface temperature for optical rehabilitation therapy, a non-contact high-precision real-time temperature measurement method based on distance compensation was proposed, and the system design was carried out. The microcontroller controls the infrared temperature measurement module and the laser range module to collect temperature and distance data. The compensation formula of temperature with distance was fitted according to the least squares method. Testing was performed on different individuals to verify the accuracy of the system. The results indicate that the designed non-contact infrared temperature measurement system has a residual error of less than 0.2°C and a response time of less than 0.1 s in the range of 0 to 60 cm. This provides a reference for developing long-distance temperature measurement equipment in optical rehabilitation therapy.
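
    The distance-compensation idea can be illustrated with a simple least-squares calibration: raw infrared readings of a reference at a known temperature are recorded at several distances, the distance-dependent bias is fitted, and the fitted bias is subtracted from later readings. The calibration numbers and the first-order model below are illustrative assumptions, not the formula from the paper.

        import numpy as np

        # Hypothetical calibration: raw IR readings of a 36.8 degC reference at several distances
        distance_cm = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
        raw_temp_c = np.array([36.8, 36.6, 36.4, 36.1, 35.9, 35.6])
        true_temp_c = 36.8

        # Fit the distance-dependent bias with a first-order polynomial (least squares)
        bias_coeffs = np.polyfit(distance_cm, raw_temp_c - true_temp_c, deg=1)

        def compensate(raw_reading_c, distance_cm_measured):
            """Subtract the fitted distance-dependent bias from a raw IR reading."""
            return raw_reading_c - np.polyval(bias_coeffs, distance_cm_measured)

        print(compensate(36.0, 45.0))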

  15. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

    Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data is obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurements and knowledge mining, but also provides the virtual experience of places of interest, such as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic implementation of 3D models and the representation of complicated surfaces that still need improvements within the visualisation techniques. The shortcoming of 3D model-based maps is the limitation of detailed coverage since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information of the real world than 3D model-based maps. The image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of stereo images. The panoramic function makes 3D maps more interactive with users but also creates an interesting immersive circumstance. Actually, unmeasurable image-based 3D maps already exist, such as Google street view, but only provide virtual experiences in terms of photos. The topographic and terrain attributes, such as shapes and heights, though, are omitted. This paper also discusses the potential for using a low cost land Mobile Mapping System (MMS) to implement realistic image 3D mapping, and evaluates the positioning accuracy that a measureable

  16. Augment clinical measurement using a constraint-based esophageal model

    Science.gov (United States)

    Kou, Wenjun; Acharya, Shashank; Kahrilas, Peter; Patankar, Neelesh; Pandolfino, John

    2017-11-01

    Quantifying the mechanical properties of the esophageal wall is crucial to understanding impairments of trans-esophageal flow characteristic of several esophageal diseases. However, these data are unavailable owing to technological limitations of current clinical diagnostic instruments that instead display esophageal luminal cross sectional area based on intraluminal impedance change. In this work, we developed an esophageal model to predict bolus flow and the wall property based on clinical measurements. The model used the constraint-based immersed-boundary method developed previously by our group. Specifically, we first approximate the time-dependent wall geometry based on impedance planimetry data on luminal cross sectional area. We then fed these along with pressure data into the model and computed wall tension based on simulated pressure and flow fields, and the material property based on the strain-stress relationship. As examples, we applied this model to augment FLIP (Functional Luminal Imaging Probe) measurements in three clinical cases: a normal subject, achalasia, and eosinophilic esophagitis (EoE). Our findings suggest that the wall stiffness was greatest in the EoE case, followed by the achalasia case, and then the normal. This is supported by NIH Grant R01 DK56033 and R01 DK079902.

  17. Measurement of energy efficiency based on economic foundations

    International Nuclear Information System (INIS)

    Filippini, Massimo; Hunt, Lester C.

    2015-01-01

    Energy efficiency policy is seen as a very important activity by almost all policy makers. In practical energy policy analysis, the typical indicator used as a proxy for energy efficiency is energy intensity. However, this simple indicator is not necessarily an accurate measure given changes in energy intensity are a function of changes in several factors as well as ‘true’ energy efficiency; hence, it is difficult to make conclusions for energy policy based upon simple energy intensity measures. Related to this, some published academic papers over the last few years have attempted to use empirical methods to measure the efficient use of energy based on the economic theory of production. However, these studies do not generally provide a systematic discussion of the theoretical basis nor the possible parametric empirical approaches that are available for estimating the level of energy efficiency. The objective of this paper, therefore, is to sketch out and explain from an economic perspective the theoretical framework as well as the empirical methods for measuring the level of energy efficiency. Additionally, in the second part of the paper, some of the empirical studies that have attempted to measure energy efficiency using such an economics approach are summarized and discussed.

  18. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we do need to be worried about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  19. Evaluating airline energy efficiency: An integrated approach with Network Epsilon-based Measure and Network Slacks-based Measure

    International Nuclear Information System (INIS)

    Xu, Xin; Cui, Qiang

    2017-01-01

    This paper focuses on evaluating airline energy efficiency, which is firstly divided into four stages: Operations Stage, Fleet Maintenance Stage, Services Stage and Sales Stage. The new four-stage network structure of airline energy efficiency is a modification of existing models. A new approach, integrated with Network Epsilon-based Measure and Network Slacks-based Measure, is applied to assess the overall energy efficiency and divisional efficiency of 19 international airlines from 2008 to 2014. The influencing factors of airline energy efficiency are analyzed through regression analysis. The results indicate the following: 1. The integrated model can identify the benchmarking airlines in the overall system and stages. 2. Most airlines' energy efficiencies remain steady during the period, except for some sharp fluctuations. The efficiency decreases are mainly concentrated in the years 2008–2011, affected by the financial crisis in the USA. 3. The average age of the fleet is positively correlated with the overall energy efficiency, and each divisional efficiency has different significant influencing factors. - Highlights: • An integrated approach with Network Epsilon-based Measure and Network Slacks-based Measure is developed. • 19 airlines' energy efficiencies are evaluated. • Garuda Indonesia has the highest overall energy efficiency.

  20. Link-Based Similarity Measures Using Reachability Vectors

    Directory of Open Access Journals (Sweden)

    Seok-Ho Yoon

    2014-01-01

    Full Text Available We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to all the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures.
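
    A compact sketch of the described pipeline is given below, assuming a small directed link graph stored as an adjacency matrix: the reachability vector of each object is obtained by iterating Random Walk with Restart, and the similarity of two objects is the cosine of their vectors. The restart probability, iteration count and example graph are illustrative choices, not values from the paper.

        import numpy as np

        def rwr_vector(adj, start, restart=0.15, iters=100):
            """Reachability vector of node `start`: Random Walk with Restart probabilities."""
            n = adj.shape[0]
            row_sums = adj.sum(axis=1, keepdims=True)
            trans = np.divide(adj, row_sums,
                              out=np.zeros_like(adj, dtype=float), where=row_sums > 0)
            restart_vec = np.zeros(n)
            restart_vec[start] = 1.0
            p = restart_vec.copy()
            for _ in range(iters):
                p = (1.0 - restart) * trans.T @ p + restart * restart_vec
            return p

        def link_similarity(adj, u, v):
            """Cosine similarity between the reachability vectors of objects u and v."""
            pu, pv = rwr_vector(adj, u), rwr_vector(adj, v)
            return float(pu @ pv / (np.linalg.norm(pu) * np.linalg.norm(pv)))

        links = np.array([[0, 1, 1, 0],
                          [0, 0, 1, 0],
                          [0, 0, 0, 1],
                          [1, 0, 0, 0]], dtype=float)
        print(link_similarity(links, 0, 1))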

  1. Real cell overlay measurement through design based metrology

    Science.gov (United States)

    Yoo, Gyun; Kim, Jungchan; Park, Chanha; Lee, Taehyeong; Ji, Sunkeun; Jo, Gyoyeon; Yang, Hyunjo; Yim, Donggyu; Yamamoto, Masahiro; Maruyama, Kotaro; Park, Byungjun

    2014-04-01

    Until recent device nodes, lithography has been struggling to improve its resolution limit. Even though next generation lithography technology is now facing various difficulties, several innovative resolution enhancement technologies, based on 193nm wavelength, were introduced and implemented to keep the trend of device scaling. Scanner makers keep developing state-of-the-art exposure systems which guarantee higher productivity and meet a more aggressive overlay specification. "The scaling reduction of the overlay error has been a simple matter of the capability of exposure tools. However, it is clear that the scanner contributions may no longer be the majority component in total overlay performance. The ability to control correctable overlay components is paramount to achieve the desired performance.(2)" In a manufacturing fab, the overlay error, determined by a conventional overlay measurement using an overlay mark based on IBO and DBO, often does not represent the physical placement error in the cell area of a memory device. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion, caused by etching or CMP, also can be a source of the mismatch. Therefore, the requirement of a direct overlay measurement in the cell pattern gradually increases in the manufacturing field, and also at the development level. In order to overcome the mismatch between the conventional overlay measurement and the real placement error of layer to layer in the cell area of a memory device, we suggest an alternative overlay measurement method utilizing a design-based metrology tool. A basic concept of this method is shown in figure 1. A CD-SEM measurement of the overlay error between layers 1 and 2 could be the ideal method, but it takes too long to extract a lot of data at wafer level. An E-beam based DBM tool provides high speed to cover the whole wafer with high repeatability. It is enabled by using the design as a

  2. A Game Map Complexity Measure Based on Hamming Distance

    Science.gov (United States)

    Li, Yan; Su, Pan; Li, Wenliang

    With the booming of the PC game market, Game AI has attracted more and more research. The interest and difficulty of a game are related to the map used in game scenarios. Besides, the path-finding efficiency in a game is also impacted by the complexity of the used map. In this paper, a novel complexity measure based on Hamming distance, called the Hamming complexity, is introduced. This measure is able to estimate the complexity of a binary tileworld. We experimentally demonstrated that Hamming complexity is highly correlated with the efficiency of the A* algorithm, and therefore it is a useful reference for the designer when developing a game map.
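
    The abstract does not give the exact formula, so the sketch below shows one plausible reading, stated here as an assumption: the complexity of a binary tile map is taken as the mean Hamming distance between all pairs of fixed-size tile blocks, so that highly repetitive maps score low and irregular maps score high.

        import numpy as np
        from itertools import combinations

        def hamming_complexity(tile_map, block=4):
            """Mean pairwise Hamming distance between non-overlapping block x block tiles.

            tile_map is a 2-D array of 0/1 values (e.g. passable / blocked cells);
            the block size and the averaging scheme are illustrative assumptions.
            """
            h, w = tile_map.shape
            tiles = [tile_map[r:r + block, c:c + block].ravel()
                     for r in range(0, h - block + 1, block)
                     for c in range(0, w - block + 1, block)]
            distances = [np.count_nonzero(a != b) for a, b in combinations(tiles, 2)]
            return float(np.mean(distances)) if distances else 0.0

        rng = np.random.default_rng(0)
        random_map = rng.integers(0, 2, size=(16, 16))
        uniform_map = np.zeros((16, 16), dtype=int)
        print(hamming_complexity(random_map), hamming_complexity(uniform_map))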

  3. A microprocessor based picture analysis system for automatic track measurements

    International Nuclear Information System (INIS)

    Heinrich, W.; Trakowski, W.; Beer, J.; Schucht, R.

    1982-01-01

    In the last few years picture analysis became a powerful technique for measurements of nuclear tracks in plastic detectors. For this purpose rather expensive commercial systems are available. Two inexpensive microprocessor based systems with different resolution were developed. The video pictures of particles seen through a microscope are digitized in real time and the picture analysis is done by software. The microscopes are equipped with stages driven by stepping motors, which are controlled by separate microprocessors. A PDP 11/03 supervises the operation of all microprocessors and stores the measured data on its mass storage devices. (author)

  4. Atmospheric profiles from active space-based radio measurements

    Science.gov (United States)

    Hardy, Kenneth R.; Hinson, David P.; Tyler, G. L.; Kursinski, E. R.

    1992-01-01

    The paper describes determinations of atmospheric profiles from space-based radio measurements and the retrieval methodology used, with special attention given to the measurement procedure and the characteristics of the soundings. It is speculated that reliable profiles of the terrestrial atmosphere can be obtained by the occultation technique from the surface to a height of about 60 km. With the full complement of 21 Global Positioning System (GPS) satellites and one GPS receiver in a sun-synchronous polar orbit, a maximum of 42 soundings could be obtained for each complete orbit, or about 670 per day, providing almost uniform global coverage.

  5. Assessing Therapist Competence: Development of a Performance-Based Measure and Its Comparison With a Web-Based Measure.

    Science.gov (United States)

    Cooper, Zafra; Doll, Helen; Bailey-Straebler, Suzanne; Bohn, Kristin; de Vries, Dian; Murphy, Rebecca; O'Connor, Marianne E; Fairburn, Christopher G

    2017-10-31

    Recent research interest in how best to train therapists to deliver psychological treatments has highlighted the need for rigorous, but scalable, means of measuring therapist competence. There are at least two components involved in assessing therapist competence: the assessment of their knowledge of the treatment concerned, including how and when to use its strategies and procedures, and an evaluation of their ability to apply such knowledge skillfully in practice. While the assessment of therapists' knowledge has the potential to be completed efficiently on the Web, the assessment of skill has generally involved a labor-intensive process carried out by clinicians, and as such, may not be suitable for assessing training outcome in certain circumstances. The aims of this study were to develop and evaluate a role-play-based measure of skill suitable for assessing training outcome and to compare its performance with a highly scalable Web-based measure of applied knowledge. Using enhanced cognitive behavioral therapy (CBT-E) for eating disorders as an exemplar, clinical scenarios for role-play assessment were developed and piloted together with a rating scheme for assessing trainee therapists' performance. These scenarios were evaluated by examining the performance of 93 therapists from different professional backgrounds and at different levels of training in implementing CBT-E. These therapists also completed a previously developed Web-based measure of applied knowledge, and the ability of the Web-based measure to efficiently predict competence on the role-play measure was investigated. The role-play measure assessed performance at implementing a range of CBT-E procedures. The majority of the therapists rated their performance as moderately or closely resembling their usual clinical performance. Trained raters were able to achieve good-to-excellent reliability for averaged competence, with intraclass correlation coefficients ranging from .653 to .909. The measure was

  6. The correction of vibration in frequency scanning interferometry based absolute distance measurement system for dynamic measurements

    Science.gov (United States)

    Lu, Cheng; Liu, Guodong; Liu, Bingguo; Chen, Fengdong; Zhuang, Zhitao; Xu, Xinke; Gan, Yu

    2015-10-01

    Absolute distance measurement systems are of significant interest in the field of metrology, which could improve the manufacturing efficiency and accuracy of large assemblies in fields such as aircraft construction, automotive engineering, and the production of modern windmill blades. Frequency scanning interferometry demonstrates noticeable advantages as an absolute distance measurement system which has a high precision and doesn't depend on a cooperative target. In this paper, the influence of inevitable vibration in the frequency scanning interferometry based absolute distance measurement system is analyzed. The distance spectrum is broadened due to the Doppler effect caused by vibration, which will bring in a measurement error more than 10^3 times bigger than the changes of optical path difference. In order to decrease the influence of vibration, the changes of the optical path difference are monitored by a frequency stabilized laser, which runs parallel to the frequency scanning interferometry. The experiment has verified the effectiveness of this method.

  7. Developing a community-based flood resilience measurement standard

    Science.gov (United States)

    Keating, Adriana; Szoenyi, Michael; Chaplowe, Scott; McQuistan, Colin; Campbell, Karen

    2015-04-01

    Given the increased attention to resilience-strengthening in international humanitarian and development work, there has been concurrent interest in its measurement and the overall accountability of "resilience strengthening" initiatives. The literature is reaching beyond the polemic of defining resilience to its measurement. Similarly, donors are increasingly expecting organizations to go beyond claiming resilience programing to measuring and showing it. However, key questions must be asked, in particular "Resilience of whom and to what?". There is no one-size-fits-all solution. The approach to measuring resilience is dependent on the audience and the purpose of the measurement exercise. Deriving a resilience measurement system needs to be based on the question it seeks to answer and needs to be specific. This session highlights key lessons from the Zurich Flood Resilience Alliance approach to develop a flood resilience measurement standard to measure and assess the impact of community based flood resilience interventions, and to inform decision-making to enhance the effectiveness of these interventions. We draw on experience in methodology development to-date, together with lessons from application in two case study sites in Latin America. Attention will be given to the use of a consistent measurement methodology for community resilience to floods over time and place; challenges to measuring a complex and dynamic phenomenon such as community resilience; methodological implications of measuring community resilience versus impact on and contribution to this goal; and using measurement and tools such as cost-benefit analysis to prioritize and inform strategic decision making for resilience interventions. The measurement tool follows the five categories of the Sustainable Livelihoods Framework and the 4Rs of complex adaptive systems - robustness, rapidity, redundancy and resourcefulness -5C-4R. A recent white paper by the Zurich Flood Resilience Alliance traces the

  8. Observer-Based Fuel Control Using Oxygen Measurement

    DEFF Research Database (Denmark)

    Andersen, Palle; Bendtsen, Jan Dimon; Mortensen, Jan Henrik

    This report describes an attempt to improve the existing control of coal mills used at the Danish power plant Nordjyllandsværket Unit 3. The coal mills are not equipped with coal flow sensors; thus an observer-based approach is investigated. A nonlinear differential equation model of the boiler is constructed and validated against data obtained at the plant. A Kalman filter based on measurements of combustion air flow led into the furnace and oxygen concentration in the flue gas is designed to estimate the actual coal flow. With this estimate, it becomes possible to close an inner loop around the coal...

  9. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Science.gov (United States)

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research. PMID:24701191

  10. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Directory of Open Access Journals (Sweden)

    A. H. Al-Mohammed

    2014-01-01

    Full Text Available This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research.

  11. Automated pavement horizontal curve measurement methods based on inertial measurement unit and 3D profiling data

    Directory of Open Access Journals (Sweden)

    Wenting Luo

    2016-04-01

    Full Text Available A pavement horizontal curve is designed to serve as a transition between straight segments, and its presence may cause a series of driving-related safety issues to motorists and drivers. Since traditional methods for curve geometry investigation are time consuming, labor intensive, and inaccurate, this study attempts to develop a method that can automatically conduct horizontal curve identification and measurement at the network level. The digital highway data vehicle (DHDV) was utilized for data collection, in which three Euler angles, driving speed, and acceleration of the survey vehicle were measured with an inertial measurement unit (IMU). The 3D profiling data used for cross slope calibration were obtained with PaveVision3D Ultra technology at 1 mm resolution. In this study, the curve identification was based on the variation of the heading angle, and the curve radius was calculated with a kinematic method, a geometry method, and a lateral acceleration method. In order to verify the accuracy of the three methods, the analysis of variance (ANOVA) test was applied by using the control variable of curve radius measured by field test. Based on the measured curve radius, a curve safety analysis model was used to predict the crash rates and safe driving speeds at horizontal curves. Finally, a case study on a 4.35 km road segment demonstrated that the proposed method could efficiently conduct network level analysis.
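
    The kinematic method mentioned in the abstract can be written compactly: for a vehicle travelling at speed v while its heading angle changes at rate omega, the instantaneous curve radius is R = v / omega. The sketch below applies this to hypothetical IMU heading and speed samples; the sampling interval and example values are assumptions for illustration.

        import numpy as np

        def kinematic_radius(speed_mps, heading_deg, dt=0.1):
            """Curve radius (m) from speed and IMU heading samples, using R = v / omega."""
            yaw_rate = np.gradient(np.radians(heading_deg), dt)      # rad/s
            with np.errstate(divide="ignore"):
                return np.where(np.abs(yaw_rate) > 1e-4, speed_mps / yaw_rate, np.inf)

        # Hypothetical samples: constant 20 m/s through a curve turning about 2 deg/s
        dt = 0.1
        heading = np.cumsum(np.full(50, 2.0 * dt))      # degrees
        speed = np.full(50, 20.0)                       # m/s
        print(np.median(np.abs(kinematic_radius(speed, heading, dt))))   # roughly 573 m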

  12. An Improved Dissonance Measure Based on Auditory Memory

    DEFF Research Database (Denmark)

    Jensen, Kristoffer; Hjortkjær, Jens

    2012-01-01

    Dissonance is an important feature in music audio analysis. We present here a dissonance model that accounts for the temporal integration of dissonant events in auditory short term memory. We compare the memory-based dissonance extracted from musical audio sequences to the response of human listeners. In a number of tests, the memory model predicts listener’s response better than traditional dissonance measures...

  13. Deformation Measurements of Gabion Walls Using Image Based Modeling

    Directory of Open Access Journals (Sweden)

    Marek Fraštia

    2014-06-01

    Full Text Available The image based modeling finds use in applications where it is necessary to reconstruct the 3D surface of the observed object with a high level of detail. Previous experiments show relatively high variability of the results depending on the camera type used, the processing software, or the process evaluation. The authors tested the method of SFM (Structure from Motion) to determine the stability of gabion walls. The results of photogrammetric measurements were compared to precise geodetic point measurements.

  14. Biomass burning aerosols characterization from ground based and profiling measurements

    Science.gov (United States)

    Marin, Cristina; Vasilescu, Jeni; Marmureanu, Luminita; Ene, Dragos; Preda, Liliana; Mihailescu, Mona

    2018-04-01

    The study goal is to assess the chemical and optical properties of aerosols present in the lofted layers and at the ground. The biomass burning aerosols were evaluated in low level layers from multi-wavelength lidar measurements, while chemical composition at ground was assessed using an Aerosol Chemical Speciation Monitor (ACSM) and an Aethalometer. Classification of aerosol type and specific organic markers were used to explore the potential to sense the particles from the same origin at ground base and on profiles.

  15. Pulsed electric field sensor based on original waveform measurement

    International Nuclear Information System (INIS)

    Ma Liang; Wu Wei; Cheng Yinhui; Zhou Hui; Li Baozhong; Li Jinxi; Zhu Meng

    2010-01-01

    The paper introduces the differential and original waveform measurement principles for pulsed E-fields, and develops a pulsed E-field sensor based on original waveform measurement along with its theoretical correction model. The sensor consists of an antenna, an integrator, an amplifier and driver, an optic-electric/electric-optic conversion module and a transmission module. The time-domain calibration in a TEM cell indicates that its risetime response is shorter than 1.0 ns, and the output pulse width at 90% of the maximum amplitude is wider than 10.0 μs. The output amplitude of the sensor is linear with the electric field intensity in a dynamic range of 20 dB. The measurement capability can be extended to 10 V/m or 50 kV/m by changing the system's antenna and other related modules. (authors)

  16. Arrester Resistive Current Measuring System Based on Heterogeneous Network

    Science.gov (United States)

    Zhang, Yun Hua; Li, Zai Lin; Yuan, Feng; Hou Pan, Feng; Guo, Zhan Nan; Han, Yue

    2018-03-01

    A Metal Oxide Arrester (MOA) suffers from aging and poor insulation due to long-term impulse voltage and environmental impact, and the value and variation tendency of the resistive current can reflect the health condition of the MOA. Common wired MOA detection needs long cables and is complicated to operate, while wireless measurement methods face the problems of poor data synchronization and instability. Therefore, a novel synchronous measurement system for arrester resistive current based on a heterogeneous network is proposed, which simplifies the calculation process and improves the synchronization, accuracy and stability of the measuring system. This system combines a LoRa wireless network, a high speed wireless personal area network and the process layer communication, and realizes the detection of the arrester working condition. Field test data show that the system has the characteristics of high accuracy, strong anti-interference ability and good synchronization, which plays an important role in ensuring the stable operation of the power grid.

  17. Nano-displacement measurement based on virtual pinhole confocal method

    International Nuclear Information System (INIS)

    Li, Long; Kuang, Cuifang; Xue, Yi; Liu, Xu

    2013-01-01

    A virtual pinhole confocal system based on charge-coupled device (CCD) detection and image processing techniques is built to measure axial displacement with 10 nm resolution, preeminent flexibility and excellent robustness when facing spot drifting. Axial displacement of the sample surface is determined by capturing the confocal laser spot using a CCD detector and quantifying the energy collected by programmable virtual pinholes. Experiments indicate an applicable measuring range of 1000 nm (Gaussian fitting r = 0.9902) with a highly linear range of 500 nm (linear fitting r = 0.9993). A concentric subtraction algorithm is introduced to further enhance resolution. Factors affecting measuring precision, sensitivity and signal-to-noise ratio are discussed using theoretical deductions and diffraction simulations. The virtual pinhole technique has promising applications in surface profiling and confocal imaging applications which require easily-customizable pinhole configurations. (paper)
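
    The virtual pinhole can be pictured as a software mask applied to each CCD frame: the centroid of the laser spot is located and only the energy inside a chosen radius is summed, optionally subtracting a surrounding annulus (a concentric-subtraction variant). The sketch below uses a made-up Gaussian spot and arbitrary radii; it is an illustration of the idea, not the authors' processing chain.

        import numpy as np

        def virtual_pinhole_energy(frame, radius):
            """Sum pixel intensity inside a circular virtual pinhole centred on the spot centroid."""
            yy, xx = np.indices(frame.shape)
            total = frame.sum()
            cy, cx = (yy * frame).sum() / total, (xx * frame).sum() / total
            mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
            return frame[mask].sum()

        def concentric_subtraction(frame, r_inner, r_outer):
            """Inner-pinhole energy minus the energy of the surrounding annulus."""
            inner = virtual_pinhole_energy(frame, r_inner)
            outer = virtual_pinhole_energy(frame, r_outer)
            return inner - (outer - inner)

        yy, xx = np.indices((64, 64))
        spot = np.exp(-((yy - 32.0) ** 2 + (xx - 30.0) ** 2) / (2.0 * 4.0 ** 2))
        print(virtual_pinhole_energy(spot, 6), concentric_subtraction(spot, 6, 12))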

  18. Neurally based measurement and evaluation of environmental noise

    CERN Document Server

    Soeta, Yoshiharu

    2015-01-01

    This book deals with methods of measurement and evaluation of environmental noise based on an auditory neural and brain-oriented model. The model consists of the autocorrelation function (ACF) and the interaural cross-correlation function (IACF) mechanisms for signals arriving at the two ear entrances. Even when the sound pressure level of a noise is only about 35 dBA, people may feel annoyed due to the aspects of sound quality. These aspects can be formulated by the factors extracted from the ACF and IACF. Several examples of measuring environmental noise—from outdoor noise such as that of aircraft, traffic, and trains, and indoor noise such as caused by floor impact, toilets, and air-conditioning—are demonstrated. According to the noise measurement and evaluation, applications for sound design are discussed. This book provides an excellent resource for students, researchers, and practitioners in a wide range of fields, such as the automotive, railway, and electronics industries, and soundscape, architec...

  19. Experimental nonlocality-based randomness generation with nonprojective measurements

    Science.gov (United States)

    Gómez, S.; Mattar, A.; Gómez, E. S.; Cavalcanti, D.; Farías, O. Jiménez; Acín, A.; Lima, G.

    2018-04-01

    We report on an optical setup generating more than one bit of randomness from one entangled bit (i.e., a maximally entangled state of two qubits). The amount of randomness is certified through the observation of Bell nonlocal correlations. To attain this result we implemented a high-purity entanglement source and a nonprojective three-outcome measurement. Our implementation achieves a gain of 27% of randomness as compared with the standard methods using projective measurements. Additionally, we estimate the amount of randomness certified in a one-sided device-independent scenario, through the observation of Einstein-Podolsky-Rosen steering. Our results prove that nonprojective quantum measurements allow extending the limits for nonlocality-based certified randomness generation using current technology.

  20. Measurement of unattached radon progeny based in electrostatic deposition method

    International Nuclear Information System (INIS)

    Canoba, A.C.; Lopez, F.O.

    1999-01-01

    A method for the measurement of unattached radon progeny based on its electrostatic deposition onto wire screens, using only one pump, has been implemented and calibrated. The importance of being able to make use of this method is related to the special radiological significance of the unattached fraction of the short-lived radon progeny. Because of this, the assessment of exposure could be directly related to dose with far greater accuracy than before. The advantages of this method are its simplicity, even with the tools needed for the sample collection, as well as the measurement instruments used. Also, the suitability of this method is enhanced by the fact that it can effectively be used with a simple measuring procedure such as the Kusnetz method. (author)

  1. Measurement system for nitrous oxide based on amperometric gas sensor

    Science.gov (United States)

    Siswoyo, S.; Persaud, K. C.; Phillips, V. R.; Sneath, R.

    2017-03-01

    It is well known that nitrous oxide is an important greenhouse gas, so monitoring and control of its concentration and emission are very important. In this work a nitrous oxide measurement system has been developed consisting of an amperometric sensor and an appropriate lab-made potentiostat that is capable of measuring picoampere-range currents. The sensor was constructed using a gold microelectrode as the working electrode surrounded by a silver wire as a quasi-reference electrode, with tetraethyl ammonium perchlorate and dimethylsulphoxide as supporting electrolyte and solvent, respectively. The lab-made potentiostat was built incorporating a transimpedance amplifier capable of picoampere measurements. It also incorporated a microcontroller-based data acquisition system, controlled by a host personal computer using a dedicated computer program. The system was capable of detecting N2O concentrations down to 0.07% v/v.

  2. Evidence conflict measure based on OWA operator in open world.

    Directory of Open Access Journals (Sweden)

    Wen Jiang

    Full Text Available Dempster-Shafer evidence theory has been extensively used in many information fusion systems since it was proposed by Dempster and extended by Shafer. Many studies have been conducted on conflict management in Dempster-Shafer evidence theory in past decades. However, how to determine a potent parameter to measure evidence conflict, when the given environment is in an open world, namely the frame of discernment is incomplete, is still an open issue. In this paper, a new method which combines the generalized conflict coefficient, the generalized evidence distance, and the generalized interval correlation coefficient based on the ordered weighted averaging (OWA) operator to measure the conflict of evidence is presented. Through the ordered weighted average of these three parameters, the combinatorial coefficient can still measure the conflict effectively when one or two parameters are not valid. Several numerical examples demonstrate the effectiveness of the proposed method.
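
    The ordered weighted averaging step can be shown directly: the three conflict indicators are sorted in descending order and combined with a fixed weight vector. The weight vector and the example indicator values below are assumptions for illustration only and are not taken from the paper.

        import numpy as np

        def owa(values, weights):
            """Ordered weighted average: weights are applied to the values sorted in descending order."""
            ordered = np.sort(np.asarray(values, dtype=float))[::-1]
            weights = np.asarray(weights, dtype=float)
            assert ordered.size == weights.size and np.isclose(weights.sum(), 1.0)
            return float(ordered @ weights)

        # Hypothetical conflict indicators for a pair of bodies of evidence, scaled to [0, 1]:
        # generalized conflict coefficient, generalized evidence distance and
        # one minus the generalized interval correlation coefficient.
        indicators = [0.62, 0.48, 0.55]
        weights = [0.5, 0.3, 0.2]
        print(owa(indicators, weights))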

  3. Beam based measurement of beam position monitor electrode gains

    Directory of Open Access Journals (Sweden)

    D. L. Rubin

    2010-09-01

    Full Text Available Low emittance tuning at the Cornell Electron Storage Ring (CESR test accelerator depends on precision measurement of vertical dispersion and transverse coupling. The CESR beam position monitors (BPMs consist of four button electrodes, instrumented with electronics that allow acquisition of turn-by-turn data. The response to the beam will vary among the four electrodes due to differences in electronic gain and/or misalignment. This variation in the response of the BPM electrodes will couple real horizontal offset to apparent vertical position, and introduce spurious measurements of coupling and vertical dispersion. To alleviate this systematic effect, a beam based technique to measure the relative response of the four electrodes has been developed. With typical CESR parameters, simulations show that turn-by-turn BPM data can be used to determine electrode gains to within ∼0.1%.

  4. Beam based measurement of beam position monitor electrode gains

    Science.gov (United States)

    Rubin, D. L.; Billing, M.; Meller, R.; Palmer, M.; Rendina, M.; Rider, N.; Sagan, D.; Shanks, J.; Strohman, C.

    2010-09-01

    Low emittance tuning at the Cornell Electron Storage Ring (CESR) test accelerator depends on precision measurement of vertical dispersion and transverse coupling. The CESR beam position monitors (BPMs) consist of four button electrodes, instrumented with electronics that allow acquisition of turn-by-turn data. The response to the beam will vary among the four electrodes due to differences in electronic gain and/or misalignment. This variation in the response of the BPM electrodes will couple real horizontal offset to apparent vertical position, and introduce spurious measurements of coupling and vertical dispersion. To alleviate this systematic effect, a beam based technique to measure the relative response of the four electrodes has been developed. With typical CESR parameters, simulations show that turn-by-turn BPM data can be used to determine electrode gains to within ˜0.1%.

  5. Soil-Carbon Measurement System Based on Inelastic Neutron Scattering

    International Nuclear Information System (INIS)

    Orion, I.; Wielopolski, L.

    2002-01-01

    Increase in the atmospheric CO 2 is associated with concurrent increase in the amount of carbon sequestered in the soil. For better understanding of the carbon cycle it is imperative to establish a better and extensive database of the carbon concentrations in various soil types, in order to develop improved models for changes in the global climate. Non-invasive soil carbon measurement is based on Inelastic Neutron Scattering (INS). This method has been used successfully to measure total body carbon in human beings. The system consists of a pulsed neutron generator that is based on D-T reaction, which produces 14 MeV neutrons, a neutron flux monitoring detector and a couple of large NaI(Tl), 6'' diameter by 6'' high, spectrometers [4]. The threshold energy for INS reaction in carbon is 4.8 MeV. Following INS of 14 MeV neutrons in carbon 4.44 MeV photons are emitted and counted during a gate pulse period of 10 μsec. The repetition rate of the neutron generator is 104 pulses per sec. The gamma spectra are acquired only during the neutron generator gate pulses. The INS method for soil carbon content measurements provides a non-destructive, non-invasive tool, which can be optimized in order to develop a system for in field measurements

  6. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian Pearl is the most valuable export product of French Polynesia, contributing, with over 61 million Euros, more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl intended for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large shape variety and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a purpose-developed heuristic circle detection and segmenting possible cavities with region growing. Out of the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measurement to consider imaging and segmentation imprecisions is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore the results show that the automatic measurement is more precise and faster than the manual one.

  7. Measurements of Electromagnetic Fields Emitted from Cellular Base Stations in

    Directory of Open Access Journals (Sweden)

    K. J. Ali

    2013-05-01

    Full Text Available With the increasing usage of mobile communication devices and internet network information, private telecommunications companies have been entering Iraq since 2003. These companies began to build cellular towers to carry out telecommunication work, but they ignore the safety conditions imposed for health and the environment, which are treated in a random way; this may pose a health risk to living beings and cause environmental pollution. The aim of this work is to determine the safe and unsafe ranges, to discuss the damage caused by radiation emitted from Asia cell base stations in Shirqat city, and to discuss the best ways in which the exposure level can be minimized to avoid negative health effects. Practical measurements of power density around base stations have been accomplished by using a radiation survey meter (Radio Frequency EMF Strength Meter 480846) in two ways. The first set of measurements was made at a height of 2 meters above ground for different distances from 0 to 300 meters. The second was made at a distance of 150 meters for different heights from 2 to 15 meters above ground level. The maximum measured power density is about 3 mW/m2. Results indicate that the levels of power density are far below the RF radiation exposure limits of the USSR safety standards. This means that these cellular base stations do not cause negative health effects for living beings if the exposure is within the acceptable international standard levels.

  8. Smartphone based hemispherical photography for canopy structure measurement

    Science.gov (United States)

    Wan, Xuefen; Cui, Jian; Jiang, Xueqin; Zhang, Jingwen; Yang, Yi; Zheng, Tao

    2018-01-01

    The canopy is the most direct and active interface layer of the interaction between plants and their environment, and it has an important influence on energy exchange, biodiversity, ecosystem matter and climate change. Measuring plant canopy structure is an important foundation for analyzing the patterns, processes and operating mechanisms of forest ecosystems. From the canopy structure, solar radiation, ambient wind speed, air temperature and humidity, soil evaporation, soil temperature and other forest environmental climate characteristics can be evaluated. Because of its accuracy and effectiveness, canopy structure measurement based on hemispherical photography has been widely studied. However, the traditional method of canopy hemispherical photogrammetry is based on an SLR camera and a fisheye lens; this setup is expensive and difficult to use in low-cost applications. In recent years, smartphone technology has been developing rapidly. The smartphone not only has excellent image acquisition ability but also considerable computational processing ability. In addition, the gyroscope and positioning functions of the smartphone also help in measuring the structure of the canopy. In this paper, we present a smartphone based hemispherical photography system. The system consists of a smartphone, a low-cost fisheye lens and a PMMA adapter. We designed an Android based app to obtain canopy hemisphere images through the low-cost fisheye lens and to provide horizontal collimation information. In addition, after acquisition of the canopy structure hemisphere image, the app adds a location tag obtained by GPS and auxiliary positioning methods to the hemisphere image information. The system was tested in an urban forest after it was completed. The test results show that the smartphone based hemispherical photography system can effectively collect high-resolution canopy structure images of plants.
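
    A common first analysis step for such hemispherical images is estimating canopy openness (gap fraction) by separating sky from canopy pixels inside the circular fisheye area. The sketch below is a generic blue-channel thresholding baseline under stated assumptions (a single RGB image with a centred circular field of view); it is not the processing pipeline of the described app.

        import numpy as np
        from matplotlib.image import imread

        def gap_fraction(image_path, threshold=0.55):
            """Fraction of sky pixels inside the circular fisheye field of view."""
            img = imread(image_path)
            if img.dtype == np.uint8:          # scale 8-bit images to the 0-1 range
                img = img / 255.0
            h, w = img.shape[:2]
            yy, xx = np.mgrid[0:h, 0:w]
            cy, cx, r = h / 2.0, w / 2.0, min(h, w) / 2.0
            inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2   # circular image area
            sky = img[..., 2] > threshold                        # bright blue channel ~ sky
            return float(sky[inside].mean())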

  9. A Feature-Based Structural Measure: An Image Similarity Measure for Face Recognition

    Directory of Open Access Journals (Sweden)

    Noor Abdalrazak Shnain

    2017-08-01

    Full Text Available Facial recognition is one of the most challenging and interesting problems within the field of computer vision and pattern recognition. During the last few years, it has gained special attention due to its importance in relation to current issues such as security, surveillance systems and forensics analysis. Despite this high level of attention to facial recognition, the success is still limited by certain conditions; there is no method which gives reliable results in all situations. In this paper, we propose an efficient similarity index that resolves the shortcomings of the existing measures of feature and structural similarity. This measure, called the Feature-Based Structural Measure (FSM), combines the best features of the well-known SSIM (structural similarity index measure) and FSIM (feature similarity index measure) approaches, striking a balance between performance for similar and dissimilar images of human faces. In addition to the statistical structural properties provided by SSIM, edge detection is incorporated in FSM as a distinctive structural feature. Its performance is tested for a wide range of PSNR (peak signal-to-noise ratio), using ORL (Olivetti Research Laboratory, now AT&T Laboratory Cambridge) and FEI (Faculty of Industrial Engineering, São Bernardo do Campo, São Paulo, Brazil) databases. The proposed measure is tested under conditions of Gaussian noise; simulation results show that the proposed FSM outperforms the well-known SSIM and FSIM approaches in its efficiency of similarity detection and recognition of human faces.
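
    The abstract describes FSM as SSIM-style statistical structure combined with an explicit edge feature. The sketch below shows one simple way such a combined index could be formed, using scikit-image's SSIM and Canny edge maps with an arbitrary weight w; it illustrates the idea only and is not the authors' exact FSM formulation.

        import numpy as np
        from skimage.metrics import structural_similarity as ssim
        from skimage.feature import canny

        def combined_similarity(img1, img2, w=0.5):
            """Blend SSIM with an edge-map agreement score (grayscale float images, same size)."""
            s = ssim(img1, img2, data_range=img1.max() - img1.min())
            e1, e2 = canny(img1), canny(img2)
            # Dice coefficient between the two binary edge maps
            edge_score = 2.0 * np.logical_and(e1, e2).sum() / max(e1.sum() + e2.sum(), 1)
            return w * s + (1.0 - w) * edge_score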

  10. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  11. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics--based on finitely correlated or projected entangled pair states--to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  12. Developing safety performance functions incorporating reliability-based risk measures.

    Science.gov (United States)

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
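
    The probability of non-compliance used above is defined through a limit state of available minus required stopping sight distance. The paper uses FORM; the sketch below estimates the same quantity with plain Monte Carlo and purely illustrative input distributions (not values from any design guide).

        import numpy as np

        rng = np.random.default_rng(0)

        def prob_noncompliance(n=100_000):
            """P(nc) = P(required stopping distance > available sight distance)."""
            v = rng.normal(90, 8, n) / 3.6           # operating speed (m/s), illustrative
            t = rng.lognormal(np.log(1.5), 0.3, n)   # perception-reaction time (s)
            a = rng.normal(3.4, 0.5, n)              # deceleration rate (m/s^2)
            demand = v * t + v ** 2 / (2 * a)        # stopping sight distance demand (m)
            supply = rng.normal(160, 10, n)          # available sight distance on the curve (m)
            return float(np.mean(supply - demand < 0))   # limit state g = supply - demand

        print(prob_noncompliance())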

  13. New Genome Similarity Measures based on Conserved Gene Adjacencies.

    Science.gov (United States)

    Doerr, Daniel; Kowada, Luis Antonio B; Araujo, Eloi; Deshpande, Shachi; Dantas, Simone; Moret, Bernard M E; Stoye, Jens

    2017-06-01

    Many important questions in molecular biology, evolution, and biomedicine can be addressed by comparative genomic approaches. One of the basic tasks when comparing genomes is the definition of measures of similarity (or dissimilarity) between two genomes, for example, to elucidate the phylogenetic relationships between species. The power of different genome comparison methods varies with the underlying formal model of a genome. The simplest models impose the strong restriction that each genome under study must contain the same genes, each in exactly one copy. More realistic models allow several copies of a gene in a genome. One speaks of gene families, and comparative genomic methods that allow this kind of input are called gene family-based. The most powerful-but also most complex-models avoid this preprocessing of the input data and instead integrate the family assignment within the comparative analysis. Such methods are called gene family-free. In this article, we study an intermediate approach between family-based and family-free genomic similarity measures. Introducing this simpler model, called gene connections, we focus on the combinatorial aspects of gene family-free genome comparison. While in most cases the computational costs are the same as in the general family-free case, we also find an instance where the gene connections model has lower complexity. Within the gene connections model, we define three variants of genomic similarity measures that have different expression powers. We give polynomial-time algorithms for two of them, while we show NP-hardness for the third, most powerful one. We also generalize the measures and algorithms to make them more robust against recent local disruptions in gene order. Our theoretical findings are supported by experimental results, proving the applicability and performance of our newly defined similarity measures.
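
    In the simplest single-copy setting, a gene-adjacency similarity between two genomes just counts how many neighbouring gene pairs occur in both. The sketch below (ignoring gene orientation and circular chromosomes) shows this baseline count; the gene connections model of the article generalizes well beyond it.

        def adjacencies(genome):
            """Set of unordered neighbouring gene pairs in a linear genome (list of gene names)."""
            return {frozenset(pair) for pair in zip(genome, genome[1:])}

        def adjacency_similarity(g1, g2):
            """Number of gene adjacencies conserved between two genomes."""
            return len(adjacencies(g1) & adjacencies(g2))

        # Example: three of the four adjacencies of g1 also occur in g2
        g1 = ["a", "b", "c", "d", "e"]
        g2 = ["a", "b", "c", "e", "d"]
        print(adjacency_similarity(g1, g2))   # -> 3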

  14. Covariance-Based Measurement Selection Criterion for Gaussian-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Fernando A. Auat Cheein

    2013-01-01

    Full Text Available Process modeling by means of Gaussian-based algorithms often suffers from redundant information which usually increases the estimation computational complexity without significantly improving the estimation performance. In this article, a non-arbitrary measurement selection criterion for Gaussian-based algorithms is proposed. The measurement selection criterion is based on the determination of the most significant measurement from both an estimation convergence perspective and the covariance matrix associated with the measurement. The selection criterion is independent of the nature of the measured variable. This criterion is used in conjunction with three Gaussian-based algorithms: the EIF (Extended Information Filter), the EKF (Extended Kalman Filter) and the UKF (Unscented Kalman Filter). Nevertheless, the measurement selection criterion shown herein can also be applied to other Gaussian-based algorithms. Although this work is focused on environment modeling, the results shown herein can be applied to other Gaussian-based algorithm implementations. Mathematical descriptions and implementation results that validate the proposal are also included in this work.
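
    To make the selection idea concrete, the sketch below ranks candidate measurements for a Kalman-type update by how much each would shrink the estimation covariance (trace reduction). This is one covariance-based, variable-independent criterion in the spirit of the abstract; the specific ranking rule is illustrative and not the authors' exact formula.

        import numpy as np

        def best_measurement(P, candidates):
            """Pick the candidate (H, R) whose update most reduces trace(P).

            P          : (n, n) state covariance
            candidates : list of (H, R) measurement model / noise covariance pairs
            """
            best_idx, best_gain = None, -np.inf
            for idx, (H, R) in enumerate(candidates):
                S = H @ P @ H.T + R                      # innovation covariance
                K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
                P_new = (np.eye(P.shape[0]) - K @ H) @ P
                gain = np.trace(P) - np.trace(P_new)     # uncertainty reduction
                if gain > best_gain:
                    best_idx, best_gain = idx, gain
            return best_idx, best_gain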

  15. Measurement of radiation dose with a PC-based instrument

    International Nuclear Information System (INIS)

    Jangland, L.; Neubeck, R.

    1994-01-01

    The purpose of this study was to investigate in what way the introduction of Digital Subtraction Angiography has influenced absorbed doses to the patient and personnel. Calculation of the energy imparted to the patient, ε, was based on measurements of the dose-area product, tube potential and tube current, which were registered with a PC-based instrument. The absorbed doses to the personnel were measured with TLD. The measurements on the personnel were made only at the digital system. The results indicate large variations in ε between different angiographic examinations of the same type. The total ε was similar on both systems, although the relative contributions from image acquisition and fluoroscopy were different. At the conventional system, fluoroscopy and image acquisition contributed almost equally to the total ε. At the digital system, 25% of the total ε was due to fluoroscopy and 75% to image acquisition. The differences were due to longer fluoroscopic times on the conventional system, mainly because of the lack of image memory and road mapping, and to a lower ε per image, due to lower dose settings for the film changer compared to the image intensifier on the digital system. 11 refs., 8 figs., 9 tabs

  16. IMU-Based Joint Angle Measurement for Gait Analysis

    Directory of Open Access Journals (Sweden)

    Thomas Seel

    2014-04-01

    Full Text Available This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU) based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.
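
    One minimal flexion/extension estimate consistent with the gyroscope-only part of such methods is to project each segment's angular rate onto the (previously identified) joint axis and integrate the difference. The sketch below assumes the joint axis is already known in each sensor frame; it omits the accelerometer-based drift correction that the full methods rely on.

        import numpy as np

        def flexion_angle(gyro_thigh, gyro_shank, axis_thigh, axis_shank, dt):
            """Integrate the relative angular rate about the knee axis.

            gyro_*  : (N, 3) angular rates in each sensor frame (rad/s)
            axis_*  : (3,) unit vector of the joint axis expressed in the same frame
            """
            rel_rate = gyro_shank @ axis_shank - gyro_thigh @ axis_thigh   # scalar rate (rad/s)
            angle = np.cumsum(rel_rate) * dt                               # naive integration
            return np.degrees(angle)                                       # flexion/extension (deg)

    In practice such pure integration drifts, which is one reason the authors fuse gyroscope and accelerometer data.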

  17. Noninvasive blood pressure measurement scheme based on optical fiber sensor

    Science.gov (United States)

    Liu, Xianxuan; Yuan, Xueguang; Zhang, Yangan

    2016-10-01

    Optical fiber sensing has many advantages, such as small volume, light weight, low loss and strong immunity to interference. Since the invention of optical fiber sensing technology in 1977, it has been applied in the military, national defense, aerospace, industrial, medical and other fields, and it has made a great contribution to parameter measurement in constrained environments. With the rapid development of computers, network systems, intelligent optical fiber sensing technology and sensor technology, and the combination of computing and communication technology, detection, diagnosis and analysis can be completed automatically and efficiently. In this work, we propose a noninvasive blood pressure detection and analysis scheme that uses an optical fiber sensor. The optical fiber sensing system mainly includes the light source, optical fiber, optical detector, optical modulator, the signal processing module and so on. Optical signals at selected wavelengths were launched into the optical fiber sensor and the signals reflected by the human body surface were detected. By comparing the actual test data with data obtained by the traditional way of measuring blood pressure, we can establish models for predicting blood pressure and achieve noninvasive blood pressure measurement by using spectrum analysis technology. The blood pressure measurement method based on the optical fiber sensing system is faster and more convenient than the traditional way, and it can produce accurate analysis results in a shorter period of time, so it can efficiently reduce time and manpower costs.

  18. WSN-Based Space Charge Density Measurement System.

    Science.gov (United States)

    Deng, Dawei; Yuan, Haiwen; Lv, Jianxun; Ju, Yong

    2017-01-01

    It is generally acknowledged that high voltage direct current (HVDC) transmission lines span large areas, which makes the use of cables for a space charge density monitoring system inconvenient. Compared with a traditional communication network, a wireless sensor network (WSN) has the advantages of small volume, high flexibility and strong self-organization, and thus presents great potential for solving this problem. Additionally, WSN is more suitable for the construction of a distributed space charge density monitoring system as it offers longer range and higher mobility. A distributed wireless system is designed for collecting and monitoring the space charge density under HVDC transmission lines, and it has been widely applied in both the Chinese State Grid HVDC test base and power transmission projects. Experimental results of the measuring system demonstrated its adaptability to the complex electromagnetic environment under the transmission lines and its ability to meet the demands for accurate, flexible and stable measurement of space charge density.

  19. Radiotomography Based on Monostatic Interference Measurements with Controlled Oscillator

    Directory of Open Access Journals (Sweden)

    Sukhanov Dmitry

    2016-01-01

    Full Text Available A method of three-dimensional tomography is presented, based on radioholographic measurements in which the reference signal is transmitted by a transmitter in the near zone of the receiver. We solve the problem of recovering the object signal phase from the near-field reference signal over a wide frequency band by considering analytic signals. Results are presented of experimental studies on the application of a tunable YIG (yttrium iron garnet) oscillator in the frequency range from 6.5 to 10.7 GHz for radio tomography of metal objects in air. The holographic principle is applied on the basis of measuring the interference field amplitude with a detector diode. The interference occurs between the direct wave and the waves scattered by the object. To reconstruct the radio images, aperture synthesis and extraction of quadrature components at all sensing frequencies are applied. An experimental study on a test object shows a resolution of about 15 mm.

  20. Optical character recognition based on nonredundant correlation measurements.

    Science.gov (United States)

    Braunecker, B; Hauck, R; Lohmann, A W

    1979-08-15

    The essence of character recognition is a comparison between the unknown character and a set of reference patterns. Usually, these reference patterns are all possible characters themselves, the whole alphabet in the case of letter characters. Obviously, N analog measurements are highly redundant, since only K = log2(N) binary decisions are enough to identify one out of N characters. Therefore, we devised K reference patterns accordingly. These patterns, called principal components, are found by digital image processing, but used in an optical analog computer. We will explain the concept of principal components, and we will describe experiments with several optical character recognition systems based on this concept.
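
    For a 26-letter alphabet, the redundancy argument above works out as follows (a worked example, not a figure from the paper):

        \[ K = \lceil \log_2 N \rceil = \lceil \log_2 26 \rceil = \lceil 4.70 \rceil = 5, \]

    so five binary correlation measurements against suitably chosen principal-component patterns suffice in principle to identify one of 26 letters, instead of 26 analog correlations.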

  1. MASS MEASUREMENTS OF ISOLATED OBJECTS FROM SPACE-BASED MICROLENSING

    DEFF Research Database (Denmark)

    Zhu, Wei; Novati, S. Calchi; Gould, A.

    2016-01-01

    ... lies behind the same amount of dust as the Bulge red clump, we find the lens is a 45 ± 7 M_J brown dwarf (BD) at 5.9 ± 1.0 kpc. The lens of the second event, OGLE-2015-BLG-0763, is a 0.50 ± 0.04 M_⊙ star at 6.9 ± 1.0 kpc. We show that the probability to definitively measure the mass of isolated microlenses ... is dramatically increased once simultaneous ground- and space-based observations are conducted....

  2. FLEXIBLE PH SENSOR WITH POLYANILINE LAYER BASED ON IMPEDANCE MEASUREMENT

    OpenAIRE

    Chuang, Cheng-Hsin; Wu, Hsun-Pei; Chen, Cheng-Ho; Wu, Peng-Rong

    2012-01-01

    A flexible sensor with conducting polyaniline layer for detecting pH value based on the impedance measurement is fabricated and demonstrated in this study. The pH sensor consists of an interdigital electrode array on a flexible printed circuit and a thin-film polyaniline as the sensing layer. As the conductivity of polyaniline depends on the redox state, the impedance change of the polyaniline after it has reacted with different pH value solutions works as the sensing mechanism. In order to o...

  3. Model-based cartilage thickness measurement in the submillimeter range

    International Nuclear Information System (INIS)

    Streekstra, G. J.; Strackee, S. D.; Maas, M.; Wee, R. ter; Venema, H. W.

    2007-01-01

    Current methods of image-based thickness measurement in thin sheet structures utilize second derivative zero crossings to locate the layer boundaries. It is generally acknowledged that the nonzero width of the point spread function (PSF) limits the accuracy of this measurement procedure. We propose a model-based method that strongly reduces PSF-induced bias by incorporating the PSF into the thickness estimation method. We estimated the bias in thickness measurements in simulated thin sheet images as obtained from second derivative zero crossings. To gain insight into the range of sheet thickness where our method is expected to yield improved results, sheet thickness was varied between 0.15 and 1.2 mm with an assumed PSF as present in the high-resolution modes of current computed tomography (CT) scanners [full width at half maximum (FWHM) 0.5-0.8 mm]. Our model-based method was evaluated in practice by measuring layer thickness from CT images of a phantom mimicking two parallel cartilage layers in an arthrography procedure. CT arthrography images of cadaver wrists were also evaluated, and thickness estimates were compared to those obtained from high-resolution anatomical sections that served as a reference. The thickness estimates from the simulated images reveal that the method based on second derivative zero crossings shows considerable bias for layers in the submillimeter range. This bias is negligible for sheet thickness larger than 1 mm, where the size of the sheet is more than twice the FWHM of the PSF but can be as large as 0.2 mm for a 0.5 mm sheet. The results of the phantom experiments show that the bias is effectively reduced by our method. The deviations from the true thickness, due to random fluctuations induced by quantum noise in the CT images, are of the order of 3% for a standard wrist imaging protocol. In the wrist the submillimeter thickness estimates from the CT arthrography images correspond within 10% to those estimated from the anatomical

  4. Laser-Based Diagnostic Measurements of Low Emissions Combustor Concepts

    Science.gov (United States)

    Hicks, Yolanda R.

    2011-01-01

    This presentation provides a summary of the primarily laser-based measurement techniques we use at NASA Glenn Research Center to characterize fuel injection, fuel/air mixing, and combustion. The report highlights using Planar Laser-Induced Fluorescence, Particle Image Velocimetry, and Phase Doppler Interferometry to obtain fuel injector patternation, fuel and air velocities, and fuel drop sizes and turbulence intensities during combustion. We also present a brief comparison between combustors burning standard JP-8 jet fuel and an alternative fuel. For this comparison, we used flame chemiluminescence and high speed imaging.

  5. Measuring participant rurality in Web-based interventions

    Directory of Open Access Journals (Sweden)

    McKay H Garth

    2007-08-01

    Full Text Available Abstract Background Web-based health behavior change programs can reach large groups of disparate participants and thus they provide promise of becoming important public health tools. Data on participant rurality can complement other demographic measures to deepen our understanding of the success of these programs. Specifically, analysis of participant rurality can inform recruitment and social marketing efforts, and facilitate the targeting and tailoring of program content. Rurality analysis can also help evaluate the effectiveness of interventions across population groupings. Methods We describe how the RUCAs (Rural-Urban Commuting Area Codes) methodology can be used to examine results from two Randomized Controlled Trials of Web-based tobacco cessation programs: the ChewFree.com project for smokeless tobacco cessation and the Smokers' Health Improvement Program (SHIP) project for smoking cessation. Results Using the RUCAs methodology helped to highlight the extent to which both Web-based interventions reached a substantial percentage of rural participants. The ChewFree program was found to have more rural participation, which is consistent with the greater prevalence of smokeless tobacco use in rural settings as well as ChewFree's multifaceted recruitment program that specifically targeted rural settings. Conclusion Researchers of Web-based health behavior change programs targeted to the US should routinely include RUCAs as a part of analyzing participant demographics. Researchers in other countries should examine rurality indices germane to their country.

  6. Air temperature measurements based on the speed of sound to compensate long distance interferometric measurements

    Directory of Open Access Journals (Sweden)

    Astrua Milena

    2014-01-01

    Full Text Available A method to measure the real time temperature distribution along an interferometer path based on the propagation of acoustic waves is presented. It exploits the high sensitivity of the speed of sound in air to the air temperature. In particular, it takes advantage of a special set-up where the generation of the acoustic waves is synchronous with the amplitude modulation of a laser source. A photodetector converts the laser light to an electronic signal considered as reference, while the incoming acoustic waves are focused on a microphone and generate a second signal. In this condition, the phase difference between the two signals substantially depends on the temperature of the air volume interposed between the sources and the receivers. The comparison with the traditional temperature sensors highlighted the limit of the latter in case of fast temperature variations and the advantage of a measurement integrated along the optical path instead of a sampling measurement. The capability of the acoustic method to compensate the interferometric distance measurements due to air temperature variations has been demonstrated for distances up to 27 m.
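
    The relation the method exploits is the temperature dependence of the speed of sound in air; a standard approximation for dry air and its inversion are, for illustration (the actual system infers the propagation delay from the phase difference between the modulated optical reference and the received acoustic signal):

        \[ c(T) \approx 331.3\,\sqrt{1 + \frac{T}{273.15}}\ \mathrm{m/s} \quad\Longrightarrow\quad T \approx 273.15\left[\left(\frac{L}{331.3\,\Delta t}\right)^{2} - 1\right]\ ^{\circ}\mathrm{C}, \]

    where L is the acoustic path length and Δt the measured propagation time.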

  7. Effects of curriculum-based measurement on teachers' instructional planning.

    Science.gov (United States)

    Fuchs, L S; Fuchs, D; Stecker, P M

    1989-01-01

    This study assessed the effects of curriculum-based measurement (CBM) on teachers' instructional planning. Subjects were 30 teachers, assigned randomly to a computer-assisted CBM group, a noncomputer CBM group, and a contrast group. In the CBM groups, teachers specified 15-week reading goals, established CBM systems to measure student progress toward goals at least twice weekly, and systematically evaluated those data bases to determine when instructional modifications were necessary. Contrast teachers monitored student progress toward Individualized Education Program (IEP) goals as they wished and were encouraged to develop instructional programs as necessary. At the end of a 12- to 15-week implementation period, teachers completed a questionnaire with reference to one randomly selected pupil. Analyses of variance indicated no difference between the CBM groups. However, compared to the contrast group, CBM teachers (a) used more specific, acceptable goals; (b) were less optimistic about goal attainment; (c) cited more objective and frequent data sources for determining the adequacy of student progress and for deciding whether program modifications were necessary; and (d) modified student programs more frequently. Questionnaire responses were correlated with verifiable data sources, and results generally supported the usefulness of the self-report information. Implications for special education research and practice are discussed.

  8. Fiber Bragg Grating Based System for Temperature Measurements

    Science.gov (United States)

    Tahir, Bashir Ahmed; Ali, Jalil; Abdul Rahman, Rosly

    In this study, a fiber Bragg grating sensor for temperature measurement is proposed and experimentally demonstrated. In particular, we point out that the method is well suited for monitoring temperature because the grating is able to withstand a high temperature environment where standard thermocouple methods fail. The interrogation technologies of the sensor system are all simple, low cost and effective. In the sensor system, the fiber grating was dipped into a beaker of water placed on a hotplate to control the water temperature. The temperature was raised in equal increments. The sensing principle is based on tracking the Bragg wavelength shifts caused by the temperature change, so the temperature is measured from the wavelength shifts of the FBG induced by the heated water. The fiber grating is a high-temperature-stable, excimer-laser-induced grating and has a linear wavelength-temperature response in the range of 0-285°C. A dynamic range of 0-285°C and a sensitivity of 0.0131 nm/°C, almost equal to that of a general FBG, have been obtained with this sensor system. Furthermore, the agreement between theoretical analysis and experimental results shows the capability and feasibility of the proposed technique.
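
    With the reported linear response, converting a measured Bragg wavelength shift into temperature reduces to the standard relation below (the reference values are illustrative; only the 0.0131 nm/°C sensitivity is taken from the abstract):

        \[ \Delta\lambda_B = \lambda_B\,(\alpha + \xi)\,\Delta T \approx 0.0131\ \mathrm{nm/^{\circ}C} \times \Delta T \quad\Longrightarrow\quad T \approx T_{\mathrm{ref}} + \frac{\lambda_B - \lambda_{B,\mathrm{ref}}}{0.0131\ \mathrm{nm/^{\circ}C}}, \]

    where α is the thermal expansion coefficient and ξ the thermo-optic coefficient of the fiber.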

  9. Quantum Jarzynski equality of measurement-based work extraction.

    Science.gov (United States)

    Morikuni, Yohei; Tajima, Hiroyasu; Hatano, Naomichi

    2017-03-01

    Many studies of quantum-size heat engines assume that the dynamics of an internal system is unitary and that the extracted work is equal to the energy loss of the internal system. Both assumptions, however, should be under scrutiny. In the present paper, we analyze quantum-scale heat engines, employing the measurement-based formulation of the work extraction recently introduced by Hayashi and Tajima [M. Hayashi and H. Tajima, arXiv:1504.06150]. We first demonstrate the inappropriateness of the unitary time evolution of the internal system (namely, the first assumption above) using a simple two-level system; we show that the variance of the energy transferred to an external system diverges when the dynamics of the internal system is approximated to a unitary time evolution. Second, we derive the quantum Jarzynski equality based on the formulation of Hayashi and Tajima as a relation for the work measured by an external macroscopic apparatus. The right-hand side of the equality reduces to unity for "natural" cyclic processes but fluctuates wildly for noncyclic ones, exceeding unity often. This fluctuation should be detectable in experiments and provide evidence for the present formulation.

  10. Security Measurement for Unknown Threats Based on Attack Preferences

    Directory of Open Access Journals (Sweden)

    Lihua Yin

    2018-01-01

    Full Text Available Security measurement matters to every stakeholder in network security. It provides security practitioners with exact security awareness. However, most existing works are not applicable to unknown threats. What is more, existing efforts on security metrics mainly focus on the ease of certain attacks from a theoretical point of view, ignoring the “likelihood of exploitation.” To help administrators have a better understanding, we analyze the behavior of attackers who exploit zero-day vulnerabilities and predict their attack timing. Based on the prediction, we propose a method of security measurement. In detail, we compute the optimal attack timing from the perspective of the attacker, using a long-term game to estimate the risk of being found and then choosing the optimal timing based on the risk and profit. We design a learning strategy to model the information sharing mechanism among multiple attackers and use a spatial structure to model the long-term process. After calculating the Nash equilibrium for each subgame, we take the likelihood of being attacked for each node as the security metric result. The experimental results show the efficiency of our approach.

  11. Environmental dose measurement with microprocessor based portable TLD reader

    International Nuclear Information System (INIS)

    Deme, S.; Apathy, I.; Feher, I.

    1996-01-01

    Application of the TL method to environmental gamma-radiation dosimetry involves uncertainty caused by the dose collected during transport from the point of annealing to the place of exposure and back to the place of evaluation. Should an accident occur, readout is delayed by the need to transport the dosemeters to a laboratory equipped with a TLD reader. A portable reader capable of reading out the TL dosemeter at the place of exposure (an 'in situ' TLD reader) eliminates the above-mentioned disadvantages. We have developed a microprocessor based portable TLD reader for monitoring environmental gamma-radiation doses and for on-board readout of doses on space stations. The first version of our portable, battery operated reader (named Pille - 'butterfly') was made at the beginning of the 80s. These devices used CaSO4 bulb dosemeters and the evaluation technique was based on analogue timing circuits and analogue-to-digital conversion of the photomultiplier current, with a readout precision of 1 μGy and a measuring range up to 10 Gy. The measured values were displayed and manually recorded. The version with an external power supply was used for space dosimetry as an onboard TLD reader.

  12. Heart rate measurement based on face video sequence

    Science.gov (United States)

    Xu, Fang; Zhou, Qin-Wu; Wu, Peng; Chen, Xing; Yang, Xiaofeng; Yan, Hong-jian

    2015-03-01

    This paper proposes a new non-contact heart rate measurement method based on photoplethysmography (PPG) theory. With this method we can measure heart rate remotely with a camera and ambient light. We collected video sequences of subjects, and detected remote PPG signals through video sequences. Remote PPG signals were analyzed with two methods, Blind Source Separation Technology (BSST) and Cross Spectral Power Technology (CSPT). BSST is a commonly used method, and CSPT is used for the first time in the study of remote PPG signals in this paper. Both of the methods can acquire heart rate, but compared with BSST, CSPT has clearer physical meaning, and the computational complexity of CSPT is lower than that of BSST. Our work shows that heart rates detected by CSPT method have good consistency with the heart rates measured by a finger clip oximeter. With good accuracy and low computational complexity, the CSPT method has a good prospect for the application in the field of home medical devices and mobile health devices.
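
    Whichever decomposition (BSST or CSPT) is used, the final step of such methods is typically to locate the dominant spectral peak of the remote PPG signal within the physiological band. The sketch below does this with a plain FFT on a mean green-channel trace of the face region; it is a generic baseline, not the CSPT method of the paper.

        import numpy as np

        def heart_rate_bpm(green_trace, fps):
            """Estimate heart rate from a mean green-channel time series of a face ROI."""
            x = np.asarray(green_trace, dtype=float)
            x = x - x.mean()                              # remove the DC component
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
            spectrum = np.abs(np.fft.rfft(x))
            band = (freqs >= 0.7) & (freqs <= 4.0)        # 42-240 bpm physiological band
            f_peak = freqs[band][np.argmax(spectrum[band])]
            return 60.0 * f_peak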

  13. Coordinate measuring system based on microchip lasers for reverse prototyping

    Science.gov (United States)

    Iakovlev, Alexey; Grishkanich, Alexsandr S.; Redka, Dmitriy; Tsvetkov, Konstantin

    2017-02-01

    Given the current great interest in Large-Scale Metrology applications in many different fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease-of-use, logistic and economic issues, as well as metrological performance, are assuming a more and more important role among system requirements. The project is planned to conduct experimental studies aimed at identifying the impact of applying chip lasers and microlasers as radiators on the linear-angular characteristics of existing measurement systems. The system consists of a distributed network-based layout, whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Differently from existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load.

  14. Analogy between gambling and measurement-based work extraction

    Science.gov (United States)

    Vinkler, Dror A.; Permuter, Haim H.; Merhav, Neri

    2016-04-01

    In information theory, one area of interest is gambling, where mutual information characterizes the maximal gain in wealth growth rate due to knowledge of side information; the betting strategy that achieves this maximum is named the Kelly strategy. In the field of physics, it was recently shown that mutual information can characterize the maximal amount of work that can be extracted from a single heat bath using measurement-based control protocols, i.e. using ‘information engines’. However, to the best of our knowledge, no relation between gambling and information engines has been presented before. In this paper, we briefly review the two concepts and then demonstrate an analogy between gambling, where bits are converted into wealth, and information engines, where bits representing measurements are converted into energy. From this analogy follows an extension of gambling to the continuous-valued case, which is shown to be useful for investments in currency exchange rates or in the stock market using options. Moreover, the analogy enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximum work extraction when the probability distributions governing the system and measurements are unknown, work extraction when some energy is lost in each cycle, e.g. due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results in order to obtain new ones.
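
    The two textbook results behind the analogy can be stated side by side (standard formulations quoted for context, not equations taken from this paper): for Kelly gambling on outcome X with side information Y, and for an isothermal information engine operating on measurement outcome Y,

        \[ \Delta W_{\text{growth}} = I(X;Y)\ \text{bits per bet}, \qquad \langle W_{\text{ext}} \rangle \le k_B T\, I(X;Y)\ \text{(in nats)}, \]

    so in both settings the mutual information quantifies the value of the side information.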

  15. Electroencephalogram measurement using polymer-based dry microneedle electrode

    Science.gov (United States)

    Arai, Miyako; Nishinaka, Yuya; Miki, Norihisa

    2015-06-01

    In this paper, we report successful electroencephalogram (EEG) measurement using polymer-based dry microneedle electrodes. The electrodes consist of needle-shaped substrates of SU-8, a silver film, and a nanoporous parylene protective film. Unlike conventional wet electrodes, microneedle electrodes require neither skin preparation nor a conductive gel. SU-8 is superior in hardness, as a structural material, to the poly(dimethylsiloxane) (PDMS; Dow Corning Toray Sylgard 184) used in our previous work, and this facilitates the penetration of the needles through the stratum corneum. SU-8 microneedles can be successfully inserted into the skin without breaking and can maintain a sufficiently low skin-electrode contact impedance for EEG measurement. The electrodes successfully measured EEG from the frontal pole, and the quality of the acquired signals was verified to be as high as that obtained using commercially available wet electrodes, without any skin preparation or conductive gel. The electrodes are readily applicable to recording brain activity over long periods with little of the stress that skin preparation imposes on users.

  16. A Method to Measure the Bracelet Based on Feature Energy

    Science.gov (United States)

    Liu, Hongmin; Li, Lu; Wang, Zhiheng; Huo, Zhanqiang

    2017-12-01

    To measure bracelets automatically, a novel method based on feature energy is proposed. Firstly, a morphological method is utilized to preprocess the image, and the contour consisting of concentric circles is extracted. Then, a feature energy function, which depends on the distances from a pixel to the edge points, is defined taking into account the geometric properties of the concentric circles. The input image is subsequently transformed into a feature energy distribution map (FEDM) by computing the feature energy of each pixel. The center of the concentric circles is thus located by detecting the maximum on the FEDM; meanwhile, the radii of the concentric circles are determined according to the feature energy function of the center pixel. Finally, with the use of a calibration template, the internal diameter and thickness of the bracelet are measured. The experimental results show that the proposed method can measure the true sizes of the bracelet accurately and with greater simplicity, directness and robustness than existing methods.

  17. EIGENVECTOR-BASED CENTRALITY MEASURES FOR TEMPORAL NETWORKS*

    Science.gov (United States)

    TAYLOR, DANE; MYERS, SEAN A.; CLAUSET, AARON; PORTER, MASON A.; MUCHA, PETER J.

    2017-01-01

    Numerous centrality measures have been developed to quantify the importances of nodes in time-independent networks, and many of them can be expressed as the leading eigenvector of some matrix. With the increasing availability of network data that changes in time, it is important to extend such eigenvector-based centrality measures to time-dependent networks. In this paper, we introduce a principled generalization of network centrality measures that is valid for any eigenvector-based centrality. We consider a temporal network with N nodes as a sequence of T layers that describe the network during different time windows, and we couple centrality matrices for the layers into a supra-centrality matrix of size NT × NT whose dominant eigenvector gives the centrality of each node i at each time t. We refer to this eigenvector and its components as a joint centrality, as it reflects the importances of both the node i and the time layer t. We also introduce the concepts of marginal and conditional centralities, which facilitate the study of centrality trajectories over time. We find that the strength of coupling between layers is important for determining multiscale properties of centrality, such as localization phenomena and the time scale of centrality changes. In the strong-coupling regime, we derive expressions for time-averaged centralities, which are given by the zeroth-order terms of a singular perturbation expansion. We also study first-order terms to obtain first-order-mover scores, which concisely describe the magnitude of nodes’ centrality changes over time. As examples, we apply our method to three empirical temporal networks: the United States Ph.D. exchange in mathematics, costarring relationships among top-billed actors during the Golden Age of Hollywood, and citations of decisions from the United States Supreme Court. PMID:29046619
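
    A minimal version of the supra-centrality construction described above can be written down directly: place each layer's centrality matrix on the diagonal of an NT x NT matrix, couple consecutive time layers with identity blocks of strength omega, and take the dominant eigenvector. The sketch below is a bare-bones illustration (dense matrices, undirected layers with the adjacency matrix used as the layer centrality matrix), not the authors' full treatment.

        import numpy as np

        def joint_centrality(layers, omega=1.0):
            """layers: list of T symmetric (N, N) centrality (e.g. adjacency) matrices."""
            T, N = len(layers), layers[0].shape[0]
            S = np.zeros((N * T, N * T))
            for t, C in enumerate(layers):
                S[t*N:(t+1)*N, t*N:(t+1)*N] = C                  # intra-layer centrality blocks
            I = np.eye(N)
            for t in range(T - 1):                               # couple neighbouring time layers
                S[t*N:(t+1)*N, (t+1)*N:(t+2)*N] = omega * I
                S[(t+1)*N:(t+2)*N, t*N:(t+1)*N] = omega * I
            vals, vecs = np.linalg.eigh(S)                       # S is symmetric by construction
            v = np.abs(vecs[:, -1])                              # dominant eigenvector
            return v.reshape(T, N)                               # joint centrality of node i at time t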

  18. Measuring energy efficiency: Is energy intensity a good evidence base?

    International Nuclear Information System (INIS)

    Proskuryakova, L.; Kovalev, A.

    2015-01-01

    Highlights: • Energy intensity measure reflects consumption, not energy efficiency. • Thermodynamic indicators should describe energy efficiency at all levels. • These indicators should have no reference to economic or financial parameters. • A set of energy efficiency indicators should satisfy several basic principles. • There are trade-offs between energy efficiency, power and costs. - Abstract: There is a widespread assumption in energy statistics and econometrics that energy intensity and energy efficiency are equivalent measures of energy performance of economies. The paper points to the discrepancy between the engineering concept of energy efficiency and the energy intensity as it is understood in macroeconomic statistics. This double discrepancy concerns definitions (while engineering concept of energy efficiency is based on the thermodynamic definition, energy intensity includes economic measures) and use. With regard to the latter, the authors conclude that energy intensity can only provide indirect and delayed evidence of technological and engineering energy efficiency of energy conversion processes, which entails shortcomings for management and policymaking. Therefore, we suggest to stop considering subsectoral, sectoral and other levels of energy intensities as aggregates of lower-level energy efficiency. It is suggested that the insufficiency of energy intensity indicators can be compensated with the introduction of thermodynamic indicators describing energy efficiency at the physical, technological, enterprise, sub-sector, sectoral and national levels without references to any economic or financial parameters. Structured statistical data on thermodynamic efficiency is offered as a better option for identifying break-through technologies and technological bottle-necks that constrain efficiency advancements. It is also suggested that macro-level thermodynamic indicators should be based on the thermodynamic first law efficiency and the energy

  19. Accurate fluid force measurement based on control surface integration

    Science.gov (United States)

    Lentink, David

    2018-01-01

    Nonintrusive 3D fluid force measurements are still challenging to conduct accurately for freely moving animals, vehicles, and deforming objects. Two techniques, 3D particle image velocimetry (PIV) and a new technique, the aerodynamic force platform (AFP), address this. Both rely on the control volume integral for momentum; whereas PIV requires numerical integration of flow fields, the AFP performs the integration mechanically based on rigid walls that form the control surface. The accuracy of both PIV and AFP measurements based on the control surface integration is thought to hinge on determining the unsteady body force associated with the acceleration of the volume of displaced fluid. Here, I introduce a set of non-dimensional error ratios to show which fluid and body parameters make the error negligible. The unsteady body force is insignificant in all conditions where the average density of the body is much greater than the density of the fluid, e.g., in gas. Whenever a strongly deforming body experiences significant buoyancy and acceleration, the error is significant. Remarkably, this error can be entirely corrected for with an exact factor provided that the body has a sufficiently homogenous density or acceleration distribution, which is common in liquids. The correction factor for omitting the unsteady body force depends only on the fluid density, ρ_f, and the body density, ρ_b. Whereas these straightforward solutions work even at the liquid-gas interface in a significant number of cases, they do not work for generalized bodies undergoing buoyancy in combination with appreciable body density inhomogeneity, volume change (PIV), or volume rate-of-change (PIV and AFP). In these less common cases, the 3D body shape needs to be measured and resolved in time and space to estimate the unsteady body force. The analysis shows that accounting for the unsteady body force is straightforward to non

  20. A Time-Measurement System Based on Isotopic Ratios

    International Nuclear Information System (INIS)

    Vo, Duc T.; Karpius, P.J.; MacArthur, D.W.; Thron, J.L.

    2007-01-01

    A time-measurement system can be built based on the ratio of gamma-ray peak intensities from two radioactive isotopes. The ideal system would use a parent isotope with a short half-life decaying to a long half-life daughter. The activities of the parent-daughter isotopes would be measured using a gamma-ray detector system. The time can then be determined from the ratio of the activities. The best-known candidate for such a system is the ²⁴¹Pu-²⁴¹Am parent-daughter pair. However, this ²⁴¹Pu-²⁴¹Am system would require a high-purity germanium detector system and sophisticated software to separate and distinguish between the many gamma-ray peaks produced by the decays of the two isotopes. An alternate system would use two different isotopes, again one with a short half-life and one with a half-life that is long relative to the other. The pair of isotopes ²¹⁰Pb and ²⁴¹Am (with half-lives of 22 and 432 years, respectively) appears suitable for such a system. This time-measurement system operates by measuring the change in the ratio of the 47-keV peak of ²¹⁰Pb to the 60-keV peak of ²⁴¹Am. For the system to work reasonably well, the resolution of the detector would need to be such that the two gamma-ray peaks are well separated so that their peak areas can be accurately determined using a simple region-of-interest (ROI) method. A variety of detectors were tested to find a suitable system for this application. The results of these tests are presented here.
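
    With the two half-lives quoted above, the elapsed time follows from simple exponential decay of the peak-intensity ratio. The sketch below assumes the initial ratio R0 is known from a reference measurement on the same detector (so that efficiency and branching factors cancel); the numerical ratios in the example are hypothetical.

        import numpy as np

        T_HALF_PB210 = 22.0    # years, from the abstract
        T_HALF_AM241 = 432.0   # years, from the abstract

        def elapsed_years(ratio_now, ratio_ref):
            """Years elapsed, from the change in the 47 keV (210Pb) / 60 keV (241Am) peak ratio."""
            lam_pb = np.log(2) / T_HALF_PB210
            lam_am = np.log(2) / T_HALF_AM241
            # R(t) = R0 * exp(-(lam_pb - lam_am) * t)  ->  solve for t
            return np.log(ratio_ref / ratio_now) / (lam_pb - lam_am)

        print(elapsed_years(ratio_now=0.8, ratio_ref=1.0))   # about 7.5 years for these ratios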

  1. Ozone Measurements Monitoring Using Data-Based Approach

    KAUST Repository

    Harrou, Fouzi; Kadri, Farid; Khadraoui, Sofiane; Sun, Ying

    2016-01-01

    The complexity of ozone (O3) formation mechanisms in the troposphere makes fast and accurate modeling of ozone very challenging. In the absence of a process model, principal component analysis (PCA) has been extensively used as a data-based monitoring technique for highly correlated process variables; however, conventional PCA-based detection indices often fail to detect small or moderate anomalies. In this work, we propose an innovative method for detecting small anomalies in highly correlated multivariate data. The developed method combines the multivariate exponentially weighted moving average (MEWMA) monitoring scheme with PCA modelling in order to enhance anomaly detection performance. Such a choice is mainly motivated by the greater ability of the MEWMA monitoring scheme to detect small changes in the process mean. The proposed PCA-based MEWMA monitoring scheme is successfully applied to ozone measurement data collected from the Upper Normandy region, France, via the network of air quality monitoring stations. The detection results of the proposed method are compared to those declared by the Air Normand air monitoring association.
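
    The detection scheme sketched in the abstract can be emulated in a few lines: fit a PCA model on anomaly-free data, run a multivariate EWMA on the standardized scores of new observations, and raise an alarm when the EWMA statistic exceeds a control limit. The snippet below is a simplified illustration of this PCA-plus-MEWMA idea with an ad hoc control limit; it is not the exact scheme or tuning used in the paper.

        import numpy as np
        from sklearn.decomposition import PCA

        def pca_mewma_alarms(train, test, n_comp=3, lam=0.2, limit=15.0):
            """Return a boolean alarm vector for the rows of `test`."""
            pca = PCA(n_components=n_comp).fit(train)
            scores = pca.transform(test) / np.sqrt(pca.explained_variance_)   # standardized scores
            z = np.zeros(n_comp)
            alarms = []
            for s in scores:
                z = lam * s + (1.0 - lam) * z            # multivariate EWMA recursion
                t2 = (2.0 - lam) / lam * z @ z           # asymptotic T^2-type statistic
                alarms.append(t2 > limit)
            return np.array(alarms)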

  2. Ozone Measurements Monitoring Using Data-Based Approach

    KAUST Repository

    Harrou, Fouzi

    2016-02-01

    The complexity of ozone (O3) formation mechanisms in the troposphere makes fast and accurate modeling of ozone very challenging. In the absence of a process model, principal component analysis (PCA) has been extensively used as a data-based monitoring technique for highly correlated process variables; however, conventional PCA-based detection indices often fail to detect small or moderate anomalies. In this work, we propose an innovative method for detecting small anomalies in highly correlated multivariate data. The developed method combines the multivariate exponentially weighted moving average (MEWMA) monitoring scheme with PCA modelling in order to enhance anomaly detection performance. Such a choice is mainly motivated by the greater ability of the MEWMA monitoring scheme to detect small changes in the process mean. The proposed PCA-based MEWMA monitoring scheme is successfully applied to ozone measurement data collected from the Upper Normandy region, France, via the network of air quality monitoring stations. The detection results of the proposed method are compared to those declared by the Air Normand air monitoring association.

  3. Output power distributions of mobile radio base stations based on network measurements

    International Nuclear Information System (INIS)

    Colombi, D; Thors, B; Persson, T; Törnevik, C; Wirén, N; Larsson, L-E

    2013-01-01

    In this work output power distributions of mobile radio base stations have been analyzed for 2G and 3G telecommunication systems. The approach is based on measurements in selected networks using performance surveillance tools part of the network Operational Support System (OSS). For the 3G network considered, direct measurements of output power levels were possible, while for the 2G networks, output power levels were estimated from measurements of traffic volumes. Both voice and data services were included in the investigation. Measurements were conducted for large geographical areas, to ensure good overall statistics, as well as for smaller areas to investigate the impact of different environments. For high traffic hours, the 90th percentile of the averaged output power was found to be below 65% and 45% of the available output power for the 2G and 3G systems, respectively.

  4. Output power distributions of mobile radio base stations based on network measurements

    Science.gov (United States)

    Colombi, D.; Thors, B.; Persson, T.; Wirén, N.; Larsson, L.-E.; Törnevik, C.

    2013-04-01

    In this work output power distributions of mobile radio base stations have been analyzed for 2G and 3G telecommunication systems. The approach is based on measurements in selected networks using performance surveillance tools part of the network Operational Support System (OSS). For the 3G network considered, direct measurements of output power levels were possible, while for the 2G networks, output power levels were estimated from measurements of traffic volumes. Both voice and data services were included in the investigation. Measurements were conducted for large geographical areas, to ensure good overall statistics, as well as for smaller areas to investigate the impact of different environments. For high traffic hours, the 90th percentile of the averaged output power was found to be below 65% and 45% of the available output power for the 2G and 3G systems, respectively.

  5. Validation of OMI UV measurements against ground-based measurements at a station in Kampala, Uganda

    Science.gov (United States)

    Muyimbwa, Dennis; Dahlback, Arne; Stamnes, Jakob; Hamre, Børge; Frette, Øyvind; Ssenyonga, Taddeo; Chen, Yi-Chun

    2015-04-01

    We present solar ultraviolet (UV) irradiance data measured with a NILU-UV instrument at a ground site in Kampala (0.31°N, 32.58°E), Uganda for the period 2005-2014. The data were analyzed and compared with UV irradiances inferred from the Ozone Monitoring Instrument (OMI) for the same period. Kampala is located on the shores of lake Victoria, Africa's largest fresh water lake, which may influence the climate and weather conditions of the region. Also, there is an excessive use of worn cars, which may contribute to a high anthropogenic loading of absorbing aerosols. The OMI surface UV algorithm does not account for absorbing aerosols, which may lead to systematic overestimation of surface UV irradiances inferred from OMI satellite data. We retrieved UV index values from OMI UV irradiances and validated them against the ground-based UV index values obtained from NILU-UV measurements. The UV index values were found to follow a seasonal pattern similar to that of the clouds and the rainfall. OMI inferred UV index values were overestimated with a mean bias of about 28% under all-sky conditions, but the mean bias was reduced to about 8% under clear-sky conditions when only days with radiation modification factor (RMF) greater than 65% were considered. However, when days with RMF greater than 70, 75, and 80% were considered, OMI inferred UV index values were found to agree with the ground-based UV index values to within 5, 3, and 1%, respectively. In the validation we identified clouds/aerosols, which were present in 88% of the measurements, as the main cause of OMI inferred overestimation of the UV index.

  6. Developing barbed microtip-based electrode arrays for biopotential measurement.

    Science.gov (United States)

    Hsu, Li-Sheng; Tung, Shu-Wei; Kuo, Che-Hsi; Yang, Yao-Joe

    2014-07-10

    This study involved fabricating barbed microtip-based electrode arrays by using silicon wet etching. KOH anisotropic wet etching was employed to form a standard pyramidal microtip array and HF/HNO3 isotropic etching was used to fabricate barbs on these microtips. To improve the electrical conductance between the tip array on the front side of the wafer and the electrical contact on the back side, a through-silicon via was created during the wet etching process. The experimental results show that the forces required to detach the barbed microtip arrays from human skin, a polydimethylsiloxane (PDMS) polymer, and a polyvinylchloride (PVC) film were larger compared with those required to detach microtip arrays that lacked barbs. The impedances of the skin-electrode interface were measured and the performance levels of the proposed dry electrode were characterized. Electrode prototypes that employed the proposed tip arrays were implemented. Electroencephalogram (EEG) and electrocardiography (ECG) recordings using these electrode prototypes were also demonstrated.

  7. EPR-based distance measurements at ambient temperature.

    Science.gov (United States)

    Krumkacheva, Olesya; Bagryanskaya, Elena

    2017-07-01

    Pulsed dipolar (PD) EPR spectroscopy is a powerful technique allowing for distance measurements between spin labels in the range of 2.5-10.0 nm. It was proposed more than 30 years ago and is nowadays widely used in biophysics and materials science. Until recently, PD EPR experiments were limited to cryogenic temperatures; ambient-temperature measurements have now become possible with PD EPR as well as with other approaches based on EPR (e.g., relaxation enhancement; RE). In this paper, we review the features of PD EPR and RE at ambient temperatures, in particular, requirements on electron spin phase memory time, ways of immobilization of biomolecules, the influence of a linker between the spin probe and biomolecule, and future opportunities. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Smart phone-based Chemistry Instrumentation: Digitization of Colorimetric Measurements

    International Nuclear Information System (INIS)

    Chang, Byoung Yong

    2012-01-01

    This report presents a mobile instrumentation platform based on a smart phone, using its built-in functions for colorimetric diagnosis. The color change resulting from detection is captured as a picture by the CCD camera built into the smart phone and is evaluated in the form of the hue value, which gives a well-defined relationship between color and concentration. As a proof of concept, proton concentration measurements were conducted on pH paper coupled with a smart phone. This report shows the possibility of adapting a smart phone as a mobile analytical transducer, and more applications for bioanalysis are expected to be developed using other built-in functions of the smart phone.
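
    A minimal sketch of the hue-based readout idea, assuming a calibration curve relating hue to pH has been measured beforehand; the calibration numbers, patch values and monotonic-hue assumption below are purely illustrative, not the report's data.

```python
import colorsys
import numpy as np

# Hypothetical calibration: hue values (0..1) measured from pH paper photographed
# at known proton concentrations; the numbers below are illustrative only.
cal_hue = np.array([0.05, 0.10, 0.17, 0.25, 0.33])
cal_ph  = np.array([3.0, 4.0, 5.0, 6.0, 7.0])

def mean_hue(rgb_patch):
    """Average hue of an (N, 3) array of 0-255 RGB pixels cropped from the strip."""
    r, g, b = rgb_patch.mean(axis=0) / 255.0
    h, _, _ = colorsys.rgb_to_hsv(r, g, b)
    return h

def estimate_ph(rgb_patch):
    # Interpolate the measured hue on the calibration curve (monotonic hue assumed).
    return float(np.interp(mean_hue(rgb_patch), cal_hue, cal_ph))

# Example: a synthetic orange-ish patch standing in for a camera image crop.
patch = np.tile(np.array([[220, 140, 40]]), (100, 1))
print("estimated pH:", estimate_ph(patch))
```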

  9. Radiation-damage measurements on PVT-based plastic scintillators

    International Nuclear Information System (INIS)

    Ilie, S.; Schoenbacher, H.; Tavlet, M.

    1993-01-01

    Samples of PVT-based plastic scintillators produced by Nuclear Enterprise Technology Ltd. (NET) were irradiated up to 9 kGy, both with a gamma source and within a typical accelerator radiation field (CERN PS ACOL Irradiation Facility). The consequent reductions in scintillation efficiency and light transmission, as well as the subsequent recovery, were measured over a period of several months. The main results show that irradiation affects the light transmission more than the light emission. The radiation type does not affect either the amount of transmission reduction or the recovery. Observations were also made by means of polarized light. Non-uniformities and internal stresses were observed in scintillator bulk material that was polymerized too quickly. These defects influence the light transmission. (orig.)

  10. Measurement-Based Entanglement of Noninteracting Bosonic Atoms.

    Science.gov (United States)

    Lester, Brian J; Lin, Yiheng; Brown, Mark O; Kaufman, Adam M; Ball, Randall J; Knill, Emanuel; Rey, Ana M; Regal, Cindy A

    2018-05-11

    We demonstrate the ability to extract a spin-entangled state of two neutral atoms via postselection based on a measurement of their spatial configuration. Typically, entangled states of neutral atoms are engineered via atom-atom interactions. In contrast, in our Letter we use Hong-Ou-Mandel interference to postselect a spin-singlet state after overlapping two atoms in distinct spin states on an effective beam splitter. We verify the presence of entanglement and determine a bound on the postselected fidelity of a spin-singlet state of 0.62 ± 0.03. The experiment has a direct analogy to creating polarization entanglement with single photons and hence demonstrates the potential to use protocols developed for photons to create complex quantum states with noninteracting atoms.

  11. Monte Carlo evaluation of derivative-based global sensitivity measures

    Energy Technology Data Exchange (ETDEWEB)

    Kucherenko, S. [Centre for Process Systems Engineering, Imperial College London, London SW7 2AZ (United Kingdom)], E-mail: s.kucherenko@ic.ac.uk; Rodriguez-Fernandez, M. [Process Engineering Group, Instituto de Investigaciones Marinas, Spanish Council for Scientific Research (C.S.I.C.), C/ Eduardo Cabello, 6, 36208 Vigo (Spain); Pantelides, C.; Shah, N. [Centre for Process Systems Engineering, Imperial College London, London SW7 2AZ (United Kingdom)

    2009-07-15

    A novel approach for evaluation of derivative-based global sensitivity measures (DGSM) is presented. It is compared with the Morris and the Sobol' sensitivity indices methods. It is shown that there is a link between DGSM and Sobol' sensitivity indices. DGSM are very easy to implement and evaluate numerically. The computational time required for numerical evaluation of DGSM is many orders of magnitude lower than that for estimation of the Sobol' sensitivity indices. It is also lower than that for the Morris method. Efficiencies of Monte Carlo (MC) and quasi-Monte Carlo (QMC) sampling methods for calculation of DGSM are compared. It is shown that the superiority of QMC over MC depends on the problem's effective dimension, which can also be estimated using DGSM.

  12. Monte Carlo evaluation of derivative-based global sensitivity measures

    International Nuclear Information System (INIS)

    Kucherenko, S.; Rodriguez-Fernandez, M.; Pantelides, C.; Shah, N.

    2009-01-01

    A novel approach for evaluation of derivative-based global sensitivity measures (DGSM) is presented. It is compared with the Morris and the Sobol' sensitivity indices methods. It is shown that there is a link between DGSM and Sobol' sensitivity indices. DGSM are very easy to implement and evaluate numerically. The computational time required for numerical evaluation of DGSM is many orders of magnitude lower than that for estimation of the Sobol' sensitivity indices. It is also lower than that for the Morris method. Efficiencies of Monte Carlo (MC) and quasi-Monte Carlo (QMC) sampling methods for calculation of DGSM are compared. It is shown that the superiority of QMC over MC depends on the problem's effective dimension, which can also be estimated using DGSM.
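
    A brief numerical sketch of the estimator described above: the derivative-based measure nu_i = E[(df/dx_i)^2] over the unit hypercube, approximated by Monte Carlo sampling with finite differences. The test function is arbitrary and chosen only for illustration.

```python
import numpy as np

def dgsm(f, dim, n_samples=4096, delta=1e-6, seed=0):
    """Crude Monte Carlo estimate of derivative-based global sensitivity measures
    nu_i = E[(df/dx_i)^2] over the unit hypercube, using finite differences.
    (A quasi-Monte Carlo point set, e.g. a Sobol' sequence, could replace the
    random sample to reproduce the QMC variant discussed in the abstract.)"""
    rng = np.random.default_rng(seed)
    x = rng.random((n_samples, dim))
    base = f(x)
    nu = np.empty(dim)
    for i in range(dim):
        xp = x.copy()
        xp[:, i] += delta
        nu[i] = np.mean(((f(xp) - base) / delta) ** 2)
    return nu

# Example: a simple additive-plus-interaction test function.
g = lambda x: x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]
print(dgsm(g, dim=3))
```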

  13. Measuring intracellular redox conditions using GFP-based sensors

    DEFF Research Database (Denmark)

    Björnberg, Olof; Ostergaard, Henrik; Winther, Jakob R

    2006-01-01

    Recent years have seen the development of methods for analyzing the redox conditions in specific compartments in living cells. These methods are based on genetically encoded sensors comprising variants of Green Fluorescent Protein in which vicinal cysteine residues have been introduced at solvent-exposed positions. Several mutant forms have been identified in which formation of a disulfide bond between these cysteine residues results in changes of their fluorescence properties. The redox sensors have been characterized biochemically and found to behave differently, both spectroscopically and in terms of redox properties. As genetically encoded sensors they can be expressed in living cells and used for analysis of intracellular redox conditions; however, which parameters are measured depends on how the sensors interact with various cellular redox components. Results of both biochemical and cell...

  14. Accurate position estimation methods based on electrical impedance tomography measurements

    Science.gov (United States)

    Vergara, Samuel; Sbarbaro, Daniel; Johansen, T. A.

    2017-08-01

    than 0.05% of the tomograph radius value. These results demonstrate that the proposed approaches can estimate an object’s position accurately based on EIT measurements if enough process information is available for training or modelling. Since they do not require complex calculations it is possible to use them in real-time applications without requiring high-performance computers.

  15. Surface characterization of hemodialysis membranes based on streaming potential measurements.

    Science.gov (United States)

    Werner, C; Jacobasch, H J; Reichelt, G

    1995-01-01

    Hemodialysis membranes made from cellulose (CUPROPHAN, HEMOPHAN) and sulfonated polyethersulfone (SPES) were characterized using the streaming potential technique to determine the zeta potential at their interfaces against well-defined aqueous solutions of varied pH and potassium chloride concentrations. Streaming potential measurements enable distinction between different membrane materials. In addition to parameters of the electrochemical double layer at membrane interfaces, thermodynamic characteristics of the adsorption of different dissolved species were evaluated. For that purpose, a description of double-layer formation suggested by Börner and Jacobasch (in: Electrokinetic Phenomena, p. 231. Institut für Technologie der Polymere, Dresden (1989)) was applied, which is based on the generally accepted model of the electrochemical double layer according to Stern (Z. Elektrochemie 30, 508 (1924)) and Grahame (Chem. Rev. 41, 441 (1947)). The membranes investigated show different surface acidic/basic and polar/nonpolar behavior. Furthermore, alterations of membrane interfaces through adsorption of components of biologically relevant solutions were shown to be detectable by streaming potential measurements.
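
    The paper evaluates double-layer parameters with an extended Stern/Grahame-based model; as background only, the classical Helmholtz-Smoluchowski relation commonly used to convert a measured streaming-potential slope into a zeta potential is sketched below. All numerical values are illustrative assumptions, not results from the study.

```python
# Minimal sketch (not the extended double-layer treatment used in the paper):
# zeta = (dU/dp) * eta * kappa / (eps_r * eps0), with dU/dp the streaming-potential
# slope, eta the viscosity, kappa the bulk conductivity of the electrolyte.
EPS0 = 8.854e-12          # vacuum permittivity, F/m

def zeta_helmholtz_smoluchowski(dU_dp, viscosity, conductivity, eps_r=78.5):
    """Zeta potential in volts from the streaming-potential slope (V/Pa)."""
    return dU_dp * viscosity * conductivity / (eps_r * EPS0)

# Example: slope of -2.0e-6 V/Pa in ~1 mM KCl (kappa ~ 0.015 S/m) at 25 degC.
zeta = zeta_helmholtz_smoluchowski(dU_dp=-2.0e-6, viscosity=0.89e-3, conductivity=0.015)
print(f"zeta potential: {zeta * 1e3:.1f} mV")
```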

  16. Uav Positioning and Collision Avoidance Based on RSS Measurements

    Science.gov (United States)

    Masiero, A.; Fissore, F.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    In recent years, Unmanned Aerial Vehicles (UAVs) have been attracting more and more attention in both the research and industrial communities: indeed, the possibility of using them in a wide range of remote sensing applications makes them a very flexible and attractive solution in both civil and commercial cases (e.g. precision agriculture, security and control, monitoring of sites, exploration of areas difficult to reach). Most of the existing UAV positioning systems rely on the use of the GPS signal. Although this can be a satisfactory solution in open environments where the GPS signal is available, there are several operating conditions of interest where it is unavailable or unreliable (e.g. close to high buildings or mountains, or in indoor environments). Consequently, a different approach has to be adopted in these cases. This paper considers the use of WiFi measurements in order to obtain position estimates of the device of interest. More specifically, to limit the costs of the devices involved in the positioning operations, an approach based on radio signal strength (RSS) measurements is considered. Thanks to the use of a Kalman filter, the proposed approach takes advantage of the temporal dynamics of the device of interest in order to improve the positioning results initially provided by means of maximum likelihood estimation. The considered UAVs are assumed to be equipped with communication devices that allow them to communicate with each other in order to improve their cooperation abilities. In particular, the collision avoidance problem is examined in this work.
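
    A minimal sketch of an RSS-based position fix of the kind described above: an assumed log-distance path-loss model converts RSS to range, and a least-squares fit (standing in for the maximum-likelihood step) estimates the position. The anchor layout, reference power P0 and path-loss exponent are invented; in the paper's approach a Kalman filter would then smooth successive fixes.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical anchor (access point) positions in metres and an assumed
# log-distance path-loss model: RSS = P0 - 10 * n * log10(d).
anchors = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0], [20.0, 20.0]])
P0, n = -40.0, 2.5

def rss_to_range(rss_dbm):
    """Invert the log-distance model to get a range estimate in metres."""
    return 10 ** ((P0 - np.asarray(rss_dbm)) / (10 * n))

def estimate_position(rss_dbm, x0=(10.0, 10.0)):
    ranges = rss_to_range(rss_dbm)
    residuals = lambda p: np.linalg.norm(anchors - p, axis=1) - ranges
    return least_squares(residuals, x0).x   # least-squares position fit

# Example measurement vector (dBm) from the four anchors.
print(estimate_position([-65.0, -70.0, -72.0, -75.0]))
```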

  17. Extrapolated HPGe efficiency estimates based on a single calibration measurement

    International Nuclear Information System (INIS)

    Winn, W.G.

    1994-01-01

    Gamma spectroscopists often must analyze samples with geometries for which their detectors are not calibrated. The effort to experimentally recalibrate a detector for a new geometry can be quite time consuming, causing delay in reporting useful results. Such concerns have motivated development of a method for extrapolating HPGe efficiency estimates from an existing single measured efficiency. Overall, the method provides useful preliminary results for analyses that do not require exceptional accuracy, while reliably bracketing the credible range. The estimated efficiency ε for a uniform sample in a geometry with volume V is extrapolated from the measured ε_0 of the base sample of volume V_0. Assuming all samples are centered atop the detector for maximum efficiency, ε decreases monotonically as V increases about V_0, and vice versa. Extrapolation of high and low efficiency estimates ε_h and ε_L provides an average estimate ε = 1/2 (ε_h + ε_L) ± 1/2 (ε_h - ε_L), where the uncertainty Δε = 1/2 (ε_h - ε_L) brackets the limits for the maximum possible error. Both ε_h and ε_L diverge from ε_0 as V deviates from V_0, causing Δε to increase accordingly. The above concepts guided development of both conservative and refined estimates for ε.
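
    A trivial numerical illustration of the bracketing just described; the efficiency values are invented, not measured ones.

```python
# Given high and low extrapolated efficiency estimates eps_h and eps_L for the new
# geometry, report their mean with half their spread as the maximum possible error.
def bracketed_efficiency(eps_h, eps_l):
    eps = 0.5 * (eps_h + eps_l)
    d_eps = 0.5 * (eps_h - eps_l)
    return eps, d_eps

eps, d_eps = bracketed_efficiency(eps_h=0.052, eps_l=0.044)
print(f"efficiency = {eps:.3f} +/- {d_eps:.3f} (maximum possible error)")
```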

  18. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  19. A Dynamic Attitude Measurement System Based on LINS

    Directory of Open Access Journals (Sweden)

    Hanzhou Li

    2014-08-01

    A dynamic attitude measurement system (DAMS) is developed based on a laser inertial navigation system (LINS). Three factors of the dynamic attitude measurement error using LINS are analyzed: dynamic error, time synchronization and phase lag. An optimal coning errors compensation algorithm is used to reduce coning errors, and two-axis wobbling verification experiments are presented in the paper. The tests indicate that the attitude accuracy is improved 2-fold by the algorithm. In order to decrease coning errors further, the attitude updating frequency is improved from 200 Hz to 2000 Hz. At the same time, a novel finite impulse response (FIR) filter with three notches is designed to filter the dither frequency of the ring laser gyro (RLG). The comparison tests suggest that the new filter is five times more effective than the old one. The paper indicates that the phase-frequency characteristics of the FIR filter and the first-order holder of the navigation computer constitute the main sources of phase lag in LINS. A formula to calculate the LINS attitude phase lag is introduced in the paper. The expressions of dynamic attitude errors induced by phase lag are derived. The paper proposes a novel synchronization mechanism that is able to simultaneously solve the problems of dynamic test synchronization and phase compensation. A single-axis turntable and a laser interferometer are applied to verify the synchronization mechanism. The experimental results show that the theoretically calculated values of phase lag and of the attitude error induced by phase lag both match the test data closely. The block diagram of the DAMS and physical photos are presented in the paper. The final experiments demonstrate that the real-time attitude measurement accuracy of the DAMS can reach up to 20″ (1σ) and the synchronization error is less than 0.2 ms under three-axis wobbling for 10 min.

  20. A Dynamic Attitude Measurement System Based on LINS

    Science.gov (United States)

    Li, Hanzhou; Pan, Quan; Wang, Xiaoxu; Zhang, Juanni; Li, Jiang; Jiang, Xiangjun

    2014-01-01

    A dynamic attitude measurement system (DAMS) is developed based on a laser inertial navigation system (LINS). Three factors of the dynamic attitude measurement error using LINS are analyzed: dynamic error, time synchronization and phase lag. An optimal coning errors compensation algorithm is used to reduce coning errors, and two-axis wobbling verification experiments are presented in the paper. The tests indicate that the attitude accuracy is improved 2-fold by the algorithm. In order to decrease coning errors further, the attitude updating frequency is improved from 200 Hz to 2000 Hz. At the same time, a novel finite impulse response (FIR) filter with three notches is designed to filter the dither frequency of the ring laser gyro (RLG). The comparison tests suggest that the new filter is five times more effective than the old one. The paper indicates that the phase-frequency characteristics of the FIR filter and the first-order holder of the navigation computer constitute the main sources of phase lag in LINS. A formula to calculate the LINS attitude phase lag is introduced in the paper. The expressions of dynamic attitude errors induced by phase lag are derived. The paper proposes a novel synchronization mechanism that is able to simultaneously solve the problems of dynamic test synchronization and phase compensation. A single-axis turntable and a laser interferometer are applied to verify the synchronization mechanism. The experimental results show that the theoretically calculated values of phase lag and of the attitude error induced by phase lag both match the test data closely. The block diagram of the DAMS and physical photos are presented in the paper. The final experiments demonstrate that the real-time attitude measurement accuracy of the DAMS can reach up to 20″ (1σ) and the synchronization error is less than 0.2 ms under three-axis wobbling for 10 min. PMID:25177802
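
    A hedged sketch of one way to realize a multi-notch FIR filter of the kind described above, using frequency-sampling design in SciPy. The dither frequencies, notch width, sample rate and filter length are placeholders, not the values used in the paper.

```python
import numpy as np
from scipy import signal

fs = 2000.0                          # attitude update rate, Hz (per the abstract)
notches = [350.0, 420.0, 490.0]      # hypothetical dither frequencies of the three RLGs
width = 10.0                         # assumed notch half-width, Hz

# Build a target frequency response with unity gain everywhere except narrow dips.
freq, gain = [0.0], [1.0]
for f0 in sorted(notches):
    freq += [f0 - width, f0, f0 + width]
    gain += [1.0, 0.0, 1.0]
freq += [fs / 2]
gain += [1.0]

taps = signal.firwin2(numtaps=201, freq=freq, gain=gain, fs=fs)

# Quick check of the attenuation achieved at the assumed notch frequencies.
w, h = signal.freqz(taps, worN=4096, fs=fs)
for f0 in notches:
    print(f"{f0:.0f} Hz: {20 * np.log10(abs(h[np.argmin(abs(w - f0))])):.1f} dB")
```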

  1. Measuring Costs to Community-Based Agencies for Implementation of an Evidence-Based Practice.

    Science.gov (United States)

    Lang, Jason M; Connell, Christian M

    2017-01-01

    Healthcare reform has led to an increase in dissemination of evidence-based practices. Cost is frequently cited as a significant yet rarely studied barrier to dissemination of evidence-based practices and the associated improvements in quality of care. This study describes an approach to measuring the incremental, unreimbursed costs in staff time and direct costs to community-based clinics implementing an evidence-based practice through participating in a learning collaborative. Initial implementation costs exceeding those for providing "treatment as usual" were collected for ten clinics implementing trauma-focused cognitive behavioral therapy through participation in 10-month learning collaboratives. Incremental implementation costs of these ten community-based clinic teams averaged the equivalent of US$89,575 (US$ 2012). The most costly activities were training, supervision, preparation time, and implementation team meetings. Recommendations are made for further research on implementation costs, dissemination of evidence-based practices, and implications for researchers and policy makers.

  2. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    In order to achieve the commercial bank objectives of liquidity, safety and profitability, loan portfolio optimization decisions based on risk analysis must allocate assets rationally. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. An optimization decision model for the loan portfolio rate of return with Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) constraints reflects the bank's risk tolerance and gives the bank direct control over potential losses. The paper analyzes a general risk management model applied to portfolio problems with VaR and CVaR risk measures using a Lagrangian algorithm, and solves this difficult problem by matrix operations. With this formulation it is easy to see that the efficient frontier of the portfolio problem with VaR and CVaR risk measures is a hyperbola in mean-standard deviation space, and the proposed method is easy to calculate.
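
    As a small, generic illustration of the risk measures involved (not the paper's Lagrangian/matrix optimization), the sketch below evaluates historical VaR and CVaR for a toy loan portfolio; the return distributions and weights are invented.

```python
import numpy as np

def var_cvar(returns, alpha=0.95):
    """Historical VaR and CVaR (expected shortfall) of a return series, as losses."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()      # mean loss beyond the VaR level
    return var, cvar

# Toy loan-portfolio example: portfolio return = weighted sum of loan-class returns.
rng = np.random.default_rng(42)
loan_returns = rng.normal(loc=0.04, scale=[0.02, 0.05, 0.10], size=(10_000, 3))
weights = np.array([0.5, 0.3, 0.2])          # asset allocation to be optimized
var, cvar = var_cvar(loan_returns @ weights)
print(f"VaR(95%) = {var:.3%}, CVaR(95%) = {cvar:.3%}")
```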

  3. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud service. Following the idea of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. The system is a dedicated website developed using asp.net and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S (browser/server) and C/S (client/server). The platform supplies customer condition monitoring and data analysis services over the Internet and is deployed on a cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and regularly pushes the monitoring data to the cloud storage database. The data acquisition equipment in this system only needs data collection and networking capabilities, as provided for example by a smartphone or a smart sensor. The scale of the system adjusts dynamically according to the number of applications and users, so resources are not wasted. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  4. Video-based measurements for wireless capsule endoscope tracking

    International Nuclear Information System (INIS)

    Spyrou, Evaggelos; Iakovidis, Dimitris K

    2014-01-01

    The wireless capsule endoscope is a swallowable medical device equipped with a miniature camera enabling the visual examination of the gastrointestinal (GI) tract. It wirelessly transmits thousands of images to an external video recording system, while its location and orientation are being tracked approximately by external sensor arrays. In this paper we investigate a video-based approach to tracking the capsule endoscope without requiring any external equipment. The proposed method involves extraction of speeded up robust features from video frames, registration of consecutive frames based on the random sample consensus algorithm, and estimation of the displacement and rotation of interest points within these frames. The results obtained by the application of this method on wireless capsule endoscopy videos indicate its effectiveness and improved performance over the state of the art. The findings of this research pave the way for a cost-effective localization and travel distance measurement of capsule endoscopes in the GI tract, which could contribute in the planning of more accurate surgical interventions. (paper)

  5. Video-based measurements for wireless capsule endoscope tracking

    Science.gov (United States)

    Spyrou, Evaggelos; Iakovidis, Dimitris K.

    2014-01-01

    The wireless capsule endoscope is a swallowable medical device equipped with a miniature camera enabling the visual examination of the gastrointestinal (GI) tract. It wirelessly transmits thousands of images to an external video recording system, while its location and orientation are being tracked approximately by external sensor arrays. In this paper we investigate a video-based approach to tracking the capsule endoscope without requiring any external equipment. The proposed method involves extraction of speeded up robust features from video frames, registration of consecutive frames based on the random sample consensus algorithm, and estimation of the displacement and rotation of interest points within these frames. The results obtained by the application of this method on wireless capsule endoscopy videos indicate its effectiveness and improved performance over the state of the art. The findings of this research pave the way for a cost-effective localization and travel distance measurement of capsule endoscopes in the GI tract, which could contribute in the planning of more accurate surgical interventions.
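
    A hedged sketch of the frame-to-frame registration idea on which the tracking described above is built, using OpenCV. ORB features are substituted here for SURF (which requires the non-free contrib build), and the robust fit uses a RANSAC-estimated similarity transform from which displacement and rotation are read; this is an illustration, not the authors' implementation.

```python
import cv2
import numpy as np

def frame_motion(prev_gray, curr_gray, max_features=1000):
    """Estimate translation (pixels) and rotation (degrees) between two frames."""
    orb = cv2.ORB_create(max_features)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    # RANSAC-based robust fit, in the spirit of the random sample consensus step.
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    dx, dy = M[0, 2], M[1, 2]
    rotation = np.degrees(np.arctan2(M[1, 0], M[0, 0]))
    return dx, dy, rotation

# Usage (with two consecutive grayscale frames loaded elsewhere):
# dx, dy, rot = frame_motion(frame_k, frame_k_plus_1)
```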

  6. Node-based measures of connectivity in genetic networks.

    Science.gov (United States)

    Koen, Erin L; Bowman, Jeff; Wilson, Paul J

    2016-01-01

    At-site environmental conditions can have strong influences on genetic connectivity, and in particular on the immigration and settlement phases of dispersal. However, at-site processes are rarely explored in landscape genetic analyses. Networks can facilitate the study of at-site processes, where network nodes are used to model site-level effects. We used simulated genetic networks to compare and contrast the performance of 7 node-based (as opposed to edge-based) genetic connectivity metrics. We simulated increasing node connectivity by varying migration in two ways: we increased the number of migrants moving between a focal node and a set number of recipient nodes, and we increased the number of recipient nodes receiving a set number of migrants. We found that two metrics in particular, the average edge weight and the average inverse edge weight, varied linearly with simulated connectivity. Conversely, node degree was not a good measure of connectivity. We demonstrated the use of average inverse edge weight to describe the influence of at-site habitat characteristics on genetic connectivity of 653 American martens (Martes americana) in Ontario, Canada. We found that highly connected nodes had high habitat quality for marten (deep snow and high proportions of coniferous and mature forest) and were farther from the range edge. We recommend the use of node-based genetic connectivity metrics, in particular, average edge weight or average inverse edge weight, to model the influences of at-site habitat conditions on the immigration and settlement phases of dispersal. © 2015 John Wiley & Sons Ltd.
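
    A small sketch of the node-based metrics compared in the study, computed with networkx on a toy weighted genetic network; the node labels and edge weights are invented.

```python
import networkx as nx

# Toy genetic network whose edge weights stand for pairwise genetic distances.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 0.10), ("A", "C", 0.25), ("B", "C", 0.15),
    ("C", "D", 0.40), ("D", "E", 0.35),
])

def node_metrics(graph, node):
    weights = [d["weight"] for _, _, d in graph.edges(node, data=True)]
    return {
        "degree": graph.degree(node),                            # poor proxy per the paper
        "avg_edge_weight": sum(weights) / len(weights),          # recommended metric
        "avg_inverse_edge_weight": sum(1.0 / w for w in weights) / len(weights),
    }

for n in G.nodes:
    print(n, node_metrics(G, n))
```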

  7. Application of model-based and knowledge-based measuring methods as analytical redundancy

    International Nuclear Information System (INIS)

    Hampel, R.; Kaestner, W.; Chaker, N.; Vandreier, B.

    1997-01-01

    The safe operation of nuclear power plants requires the application of modern and intelligent methods of signal processing, both for normal operation and for the management of accident conditions. Such modern and intelligent methods are model-based and knowledge-based ones, founded on analytical knowledge (mathematical models) as well as on experience (fuzzy information). In addition to the existing hardware redundancies, analytical redundancies will be established with the help of these modern methods. These analytical redundancies support the operating staff during decision-making. The design of a hybrid model-based and knowledge-based measuring method will be demonstrated by the example of a fuzzy-supported observer. Within the fuzzy-supported observer, a classical linear observer is connected with a fuzzy-supported adaptation of the model matrices of the observer model. This application is realized for the estimation of non-measurable variables such as steam content and mixture level within pressure vessels containing a water-steam mixture during accidental depressurizations. For this example the existing non-linearities will be classified and the verification of the model will be explained. The advantages of the hybrid method in comparison to the classical model-based measuring methods will be demonstrated by the estimation results. The consideration of the parameters which have an important influence on the non-linearities requires the inclusion of high-dimensional structures of fuzzy logic within the model-based measuring methods. Therefore, methods will be presented which allow the conversion of these high-dimensional structures to two-dimensional structures of fuzzy logic. As an efficient solution to this problem, a method based on cascaded fuzzy controllers will be presented. (author). 2 refs, 12 figs, 5 tabs
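
    The abstract combines a classical linear observer with a fuzzy adaptation of its model. The sketch below is only a toy illustration of that idea, assuming a hypothetical two-state discrete-time model and a smooth placeholder gain adaptation standing in for a real fuzzy rule base; none of the matrices correspond to the pressure-vessel model of the paper.

```python
import numpy as np

A = np.array([[0.95, 0.10], [0.00, 0.90]])   # illustrative state-transition matrix
B = np.array([[0.0], [0.1]])                 # illustrative input matrix
C = np.array([[1.0, 0.0]])                   # only the first state is measured
L_nominal = np.array([[0.4], [0.2]])         # nominal observer gain

def adapted_gain(residual):
    # Placeholder for the fuzzy adaptation: increase the correction gain smoothly
    # when the output residual is large (e.g. during a fast transient).
    return L_nominal * (1.0 + np.tanh(abs(residual)))

def observer_step(x_hat, u, y):
    """One discrete-time Luenberger-style update with residual-dependent gain."""
    residual = y - float(C @ x_hat)
    return A @ x_hat + B * u + adapted_gain(residual) * residual

x_hat = np.zeros((2, 1))
for u, y in [(1.0, 0.1), (1.0, 0.3), (0.5, 0.35)]:   # measured input/output samples
    x_hat = observer_step(x_hat, u, y)
print(x_hat.ravel())   # estimates of the non-measurable states
```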

  8. Grid Based Integration Technologies of Virtual Measurement System

    International Nuclear Information System (INIS)

    Zhang, D P; He, L S; Yang, H

    2006-01-01

    This paper presents a novel integrated measurement system architecture for new requirements such as measurement collaboration, measurement resource interconnection and transparent access over wide areas and across organizations in the context of a grid. The complexity of integration on a grid arises from the scale, dynamism, autonomy, and distribution of the measurement resources. The main argument of this paper is that these complexities should be made transparent to collaborative measurement via flexible reconfigurable mechanisms and dynamic virtualization services. The paper starts by discussing the integration-oriented measurement architecture, which provides collaborative measurement services to distributed measurement resources, and then discusses the measurement mechanisms that implement transparent access to and collaboration among measurement resources by providing protocols, measurement scheduling and a global data-driven model.

  9. Neutrosophic Refined Similarity Measure Based on Cosine Function

    Directory of Open Access Journals (Sweden)

    Said Broumi

    2014-12-01

    In this paper, the cosine similarity measure of neutrosophic refined (multi-)sets is proposed and its properties are studied. This cosine similarity measure of neutrosophic refined sets is an extension of the improved cosine similarity measure of single-valued neutrosophic sets. Finally, using this cosine similarity measure of neutrosophic refined sets, an application to medical diagnosis is presented.
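
    As background, one common form of a cosine similarity measure for single-valued neutrosophic elements (truth T, indeterminacy I, falsity F), averaged over set elements, is sketched below. This is an illustration of the general idea only, not the refined-set formula proposed in the paper, and the toy diagnosis data are invented.

```python
import math

def cosine_similarity(set_a, set_b):
    """Average per-element cosine similarity of two single-valued neutrosophic sets."""
    total = 0.0
    for (t1, i1, f1), (t2, i2, f2) in zip(set_a, set_b):
        num = t1 * t2 + i1 * i2 + f1 * f2
        den = math.sqrt(t1**2 + i1**2 + f1**2) * math.sqrt(t2**2 + i2**2 + f2**2)
        total += num / den
    return total / len(set_a)

# Toy medical-diagnosis style comparison of a patient profile against a disease profile.
patient = [(0.8, 0.2, 0.1), (0.5, 0.4, 0.3), (0.2, 0.1, 0.9)]
disease = [(0.7, 0.3, 0.2), (0.6, 0.3, 0.2), (0.1, 0.2, 0.8)]
print(round(cosine_similarity(patient, disease), 4))
```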

  10. Sorption isotherms: A review on physical bases, modeling and measurement

    Energy Technology Data Exchange (ETDEWEB)

    Limousin, G. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France) and Laboratoire d' etude des Transferts en Hydrologie et Environnement (CNRS-INPG-IRD-UJF), BP 53, 38041 Grenoble Cedex (France)]. E-mail: guillaumelimousin@yahoo.fr; Gaudet, J.-P. [Laboratoire d' etude des Transferts en Hydrologie et Environnement (CNRS-INPG-IRD-UJF), BP 53, 38041 Grenoble Cedex (France); Charlet, L. [Laboratoire de Geophysique Interne et Techtonophysique - CNRS-IRD-LCPC-UJF-Universite de Savoie, BP 53, 38041 Grenoble Cedex (France); Szenknect, S. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France); Barthes, V. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France); Krimissa, M. [Electricite de France, Division Recherche et Developpement, Laboratoire National d' Hydraulique et d' Environnement - P78, 6 quai Watier, 78401 Chatou (France)

    2007-02-15

    The retention (or release) of a liquid compound on a solid controls the mobility of many substances in the environment and has been quantified in terms of the 'sorption isotherm'. This paper does not review the different sorption mechanisms. It presents the physical bases underlying the definition of a sorption isotherm, different empirical or mechanistic models, and details several experimental methods to acquire a sorption isotherm. For appropriate measurements and interpretations of isotherm data, this review emphasizes 4 main points: (i) the adsorption (or desorption) isotherm does not provide automatically any information about the reactions involved in the sorption phenomenon. So, mechanistic interpretations must be carefully verified. (ii) Among studies, the range of reaction times is extremely wide and this can lead to misinterpretations regarding the irreversibility of the reaction: a pseudo-hysteresis of the release compared with the retention is often observed. The comparison between the mean characteristic time of the reaction and the mean residence time of the mobile phase in the natural system allows knowing if the studied retention/release phenomenon should be considered as an instantaneous reversible, almost irreversible phenomenon, or if reaction kinetics must be taken into account. (iii) When the concentration of the retained substance is low enough, the composition of the bulk solution remains constant and a single-species isotherm is often sufficient, although it remains strongly dependent on the background medium. At higher concentrations, sorption may be driven by the competition between several species that affect the composition of the bulk solution. (iv) The measurement method has a great influence. Particularly, the background ionic medium, the solid/solution ratio and the use of flow-through or closed reactor are of major importance. The chosen method should balance easy-to-use features and representativity of the studied
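
    As an illustration of fitting empirical isotherm models to equilibrium data, the sketch below fits the widely used Langmuir and Freundlich forms to synthetic batch sorption data; these particular models, units and values are chosen for demonstration only and are not taken from the review.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k_l):
    """Langmuir isotherm: q = q_max * K_L * c / (1 + K_L * c)."""
    return q_max * k_l * c / (1.0 + k_l * c)

def freundlich(c, k_f, n):
    """Freundlich isotherm: q = K_F * c**(1/n)."""
    return k_f * c ** (1.0 / n)

c_eq = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])   # equilibrium concentration, mg/L
q_eq = np.array([0.8, 1.4, 2.2, 3.4, 4.1, 4.6])     # sorbed amount, mg/g (synthetic)

(p_l, _), (p_f, _) = curve_fit(langmuir, c_eq, q_eq), curve_fit(freundlich, c_eq, q_eq)
print("Langmuir  q_max=%.2f mg/g, K_L=%.3f L/mg" % tuple(p_l))
print("Freundlich K_F=%.2f, 1/n=%.2f" % (p_f[0], 1.0 / p_f[1]))
```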

  11. PC-based hardware and software for tracer measurements

    International Nuclear Information System (INIS)

    Kaemaeraeinen, V.J.; Kall, Leif; Kaeki, Arvo

    1990-01-01

    Cheap, efficient personal computers can be used for both measurement and analysis. The results can be calculated immediately after the measurements are made in order to exploit the real-time measuring capabilities of tracer techniques fully. In the analysis phase the measurement information is visualized using graphical methods. The programs are menu drive to make them easy to use and adaptable for field conditions. The measuring equipment is modular for easy installation and maintenance. (author)

  12. GIS Based Measurement and Regulatory Zoning of Urban Ecological Vulnerability

    Directory of Open Access Journals (Sweden)

    Xiaorui Zhang

    2015-07-01

    Urban ecological vulnerability is measured on the basis of ecological sensitivity and resilience, based on a concept analysis of vulnerability. GIS-based multicriteria decision analysis (GIS-MCDA) methods, supported by the spatial analysis tools of GIS, are used to define different levels of vulnerability for areas of the urban ecology. These areas are further classified into different types of regulatory zones. Taking the city of Hefei in China as the empirical research site, this study uses GIS-MCDA, including an index system, index weights and overlay rules, to measure the degree of its ecological vulnerability on the GIS platform. There are eight indices in the system. Ranking and analytical hierarchy process (AHP) methods are used to calculate index weights according to the characteristics of the index system. The integrated overlay rule, including selection of the maximum value, and weighted linear combination (WLC) are applied as the overlay rules. In this way, five types of vulnerability areas have been classified: very low vulnerability, low vulnerability, medium vulnerability, high vulnerability and very high vulnerability. They can be further grouped into three types of regulatory zone: ecological green line, ecological grey line and ecological red line. The study demonstrates that ecological green line areas are the largest (53.61% of the total study area) and can be intensively developed; ecological grey line areas (19.59% of the total area) can serve as the ecological buffer zone, and ecological red line areas (26.80%) cannot be developed and must be protected. The results indicate that ecological green line areas may provide sufficient room for future urban development in Hefei city. Finally, the respective regulatory countermeasures are put forward. This research provides a scientific basis for decision-making around urban ecological protection, construction and sustainable development. It also provides a theoretical method
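
    A minimal sketch of the weighted linear combination (WLC) overlay step used in GIS-MCDA, assuming five already-normalized criterion layers and illustrative weights (not the study's eight indices or its AHP weights); class breaks here are taken from quantiles purely for demonstration.

```python
import numpy as np

weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])          # illustrative criterion weights
layers = np.random.default_rng(7).random((5, 100, 100))     # 5 normalized index rasters (0..1)

vulnerability = np.tensordot(weights, layers, axes=1)        # WLC score per raster cell

# Cut the scores into five classes; quantile breaks are used here only for the demo
# (equal-interval or natural-breaks schemes could be used instead).
bins = np.quantile(vulnerability, [0.2, 0.4, 0.6, 0.8])
classes = np.digitize(vulnerability, bins)

for label, k in zip(["very low", "low", "medium", "high", "very high"], range(5)):
    print(f"{label:9s}: {np.mean(classes == k):.1%} of the study area")
```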

  13. Measurement properties of performance-based measures to assess physical function in hip and knee osteoarthritis: a systematic review.

    Science.gov (United States)

    Dobson, F; Hinman, R S; Hall, M; Terwee, C B; Roos, E M; Bennell, K L

    2012-12-01

    To systematically review the measurement properties of performance-based measures to assess physical function in people with hip and/or knee osteoarthritis (OA). Electronic searches were performed in MEDLINE, CINAHL, Embase, and PsycINFO up to the end of June 2012. Two reviewers independently rated measurement properties using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN). A "best evidence synthesis" was made using COSMIN outcomes and the quality of findings. Twenty-four out of 1792 publications were eligible for inclusion. Twenty-one performance-based measures were evaluated, including 15 single-activity measures and six multi-activity measures. The measurement properties evaluated included internal consistency (three measures), reliability (16 measures), measurement error (14 measures), validity (nine measures), responsiveness (12 measures) and interpretability (three measures). A positive rating was given to only 16% of possible measurement ratings. Evidence for the majority of measurement properties of the tests reported in the review has yet to be determined. On balance of the limited evidence, the 40 m self-paced test was the best rated walk test, the 30-s chair stand test and timed up and go test were the best rated sit-to-stand tests, and the Stratford battery, Physical Activity Restrictions and Functional Assessment System were the best rated multi-activity measures. Further good-quality research investigating the measurement properties of performance measures, including responsiveness and interpretability, in people with hip and/or knee OA is needed. Consensus on which combination of measures will best assess physical function in people with hip and/or knee OA is urgently required. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  14. Developing Barbed Microtip-Based Electrode Arrays for Biopotential Measurement

    Directory of Open Access Journals (Sweden)

    Li-Sheng Hsu

    2014-07-01

    This study involved fabricating barbed microtip-based electrode arrays by using silicon wet etching. KOH anisotropic wet etching was employed to form a standard pyramidal microtip array and HF/HNO3 isotropic etching was used to fabricate barbs on these microtips. To improve the electrical conductance between the tip array on the front side of the wafer and the electrical contact on the back side, a through-silicon via was created during the wet etching process. The experimental results show that the forces required to detach the barbed microtip arrays from human skin, a polydimethylsiloxane (PDMS) polymer, and a polyvinylchloride (PVC) film were larger compared with those required to detach microtip arrays that lacked barbs. The impedances of the skin-electrode interface were measured and the performance levels of the proposed dry electrode were characterized. Electrode prototypes that employed the proposed tip arrays were implemented. Electroencephalogram (EEG) and electrocardiography (ECG) recordings using these electrode prototypes were also demonstrated.

  15. Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements

    Science.gov (United States)

    Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.

    2011-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
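
    A hedged sketch of the band-pass step described above, assuming 30 s TEC samples and a 10-40 minute pass band typical of tsunami-driven internal gravity waves; the exact filter settings used in the study are not reproduced here, and the test series is synthetic.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0 / 30.0                                  # GPS TEC sample rate, Hz (30 s data assumed)
low, high = 1.0 / (40 * 60), 1.0 / (10 * 60)     # pass band in Hz (periods 40..10 min)

b, a = butter(N=4, Wn=[low, high], btype="bandpass", fs=fs)

def filter_tec(tec_series):
    """Zero-phase band-pass filter of a vertical TEC time series (TECU)."""
    return filtfilt(b, a, tec_series)

# Example with a synthetic series: slow diurnal trend + 15-minute wave + noise.
t = np.arange(0, 6 * 3600, 30.0)
tec = 20 + 5e-4 * t + 0.3 * np.sin(2 * np.pi * t / (15 * 60)) + 0.05 * np.random.randn(t.size)
print(filter_tec(tec).std())   # residual fluctuation amplitude after filtering
```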

  16. Injection quality measurements with diamond based particle detectors

    CERN Document Server

    Stein, Oliver; CERN. Geneva. ATS Department

    2016-01-01

    During the re-commissioning phase of the LHC after Long Shutdown 1, very high beam losses were observed at the TDI during beam injection, reaching up to 90% of the dump threshold. To reduce the stress induced on accelerator components by these beam losses, the loss levels need to be decreased. Measurements with diamond-based particle detectors (dBLMs), which have nanosecond time resolution, revealed that the majority of these losses come from recaptured SPS beam surrounding the nominal bunch train. In this MD the injection loss patterns and loss intensities were investigated in greater detail. Calibration shots on the TDI (the internal beam absorber for injection) gave a conversion factor from the intensity of impacting particles to the signal in the dBLMs (0.1 Vs per 10^9 protons). Using the SPS tune kicker to clean the recaptured beam in the SPS and changing the LHC injection kicker settings resulted in a reduction of the injection losses. For 144-bunch injections the loss levels were decreased...

  17. Online monitoring of Mezcal fermentation based on redox potential measurements.

    Science.gov (United States)

    Escalante-Minakata, P; Ibarra-Junquera, V; Rosu, H C; De León-Rodríguez, A; González-García, R

    2009-01-01

    We describe an algorithm for the continuous monitoring of the biomass and ethanol concentrations as well as the growth rate in the Mezcal fermentation process. The algorithm performs its task having available only the online measurements of the redox potential. The procedure combines an artificial neural network (ANN) that relates the redox potential to the ethanol and biomass concentrations with a nonlinear observer-based algorithm that uses the ANN biomass estimations to infer the growth rate of this fermentation process. The results show that the redox potential is a valuable indicator of the metabolic activity of the microorganisms during Mezcal fermentation. In addition, the estimated growth rate can be considered as a direct evidence of the presence of mixed culture growth in the process. Usually, mixtures of microorganisms could be intuitively clear in this kind of processes; however, the total biomass data do not provide definite evidence by themselves. In this paper, the detailed design of the software sensor as well as its experimental application is presented at the laboratory level.
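
    A sketch of the ANN part of such a soft sensor: a small network mapping the measured redox potential (and elapsed time) to biomass and ethanol concentrations. The scikit-learn MLP and the synthetic training data below are stand-ins for the paper's network and laboratory measurements, and the nonlinear observer for the growth rate is not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic fermentation-like data standing in for offline laboratory measurements.
rng = np.random.default_rng(3)
time_h = np.linspace(0, 48, 200)
redox_mv = 50 - 180 * (1 - np.exp(-time_h / 12)) + rng.normal(0, 3, time_h.size)
biomass = 4.0 / (1 + np.exp(-(time_h - 18) / 4))              # g/L, logistic growth
ethanol = 45.0 / (1 + np.exp(-(time_h - 22) / 5))             # g/L

X = np.column_stack([redox_mv, time_h])
Y = np.column_stack([biomass, ethanol])
ann = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0).fit(X, Y)

# Online use: each new redox reading yields biomass/ethanol estimates; the growth
# rate would then be inferred from the biomass estimate by the observer-based part.
print(ann.predict([[-80.0, 20.0]]))   # [biomass g/L, ethanol g/L] at t = 20 h
```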

  18. Measurement-Based Performance Evaluation of Advanced MIMO Transceiver Designs

    Directory of Open Access Journals (Sweden)

    Schneider Christian

    2005-01-01

    This paper describes the methodology and the results of performance investigations on a multiple-input multiple-output (MIMO) transceiver scheme for frequency-selective radio channels. The method relies on offline simulations and employs real-time MIMO channel sounder measurement data to ensure realistic channel modeling. It can thus be classified in between performance evaluation using predefined channel models and evaluation of prototype hardware in field experiments. New aspects of the simulation setup are discussed, which are frequently ignored when using simpler model-based evaluations. Example simulations are provided for an iterative ("turbo") MIMO equalizer concept. The dependency of the achievable bit error rate performance on the propagation characteristics and on the variation in some system design parameters is shown, whereas the antenna constellation is of particular concern for MIMO systems. Although in many of the considered constellations turbo MIMO equalization appears feasible in real field scenarios, there exist cases with poor performance as well, indicating that in practical applications link adaptation of the transmitter and receiver processing to the environment is necessary.

  19. EMG Processing Based Measures of Fatigue Assessment during Manual Lifting

    Directory of Open Access Journals (Sweden)

    E. F. Shair

    2017-01-01

    Manual lifting is one of the common practices used in industry to transport or move objects to a desired place. Nowadays, even though mechanized equipment is widely available, manual lifting is still considered an essential way to perform material handling tasks. Improper lifting strategies may contribute to musculoskeletal disorders (MSDs), to which overexertion is the largest contributing factor. To address this problem, the electromyography (EMG) signal is used to monitor the workers' muscle condition and to find the maximum lifting load, lifting height and number of repetitions that workers are able to handle before experiencing fatigue, in order to avoid overexertion. Past researchers have introduced several EMG processing techniques and different EMG features that represent fatigue indices in the time, frequency, and time-frequency domains. The impact of EMG processing based measures in fatigue assessment during manual lifting is reviewed in this paper. It is believed that this paper will greatly benefit researchers who need a bird's-eye view of the biosignal processing techniques currently available, thus determining the best possible techniques for lifting applications.

  20. EMG Processing Based Measures of Fatigue Assessment during Manual Lifting

    Science.gov (United States)

    Marhaban, M. H.; Abdullah, A. R.

    2017-01-01

    Manual lifting is one of the common practices used in industry to transport or move objects to a desired place. Nowadays, even though mechanized equipment is widely available, manual lifting is still considered an essential way to perform material handling tasks. Improper lifting strategies may contribute to musculoskeletal disorders (MSDs), to which overexertion is the largest contributing factor. To address this problem, the electromyography (EMG) signal is used to monitor the workers' muscle condition and to find the maximum lifting load, lifting height and number of repetitions that workers are able to handle before experiencing fatigue, in order to avoid overexertion. Past researchers have introduced several EMG processing techniques and different EMG features that represent fatigue indices in the time, frequency, and time-frequency domains. The impact of EMG processing based measures in fatigue assessment during manual lifting is reviewed in this paper. It is believed that this paper will greatly benefit researchers who need a bird's-eye view of the biosignal processing techniques currently available, thus determining the best possible techniques for lifting applications. PMID:28303251
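
    A brief sketch of one classical frequency-domain fatigue index of the kind surveyed in these reviews: the median frequency of the EMG power spectrum per contraction window, whose downward trend over repetitions is commonly read as developing muscle fatigue. The sampling rate, window length and synthetic signal below are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

FS = 1000.0   # assumed EMG sampling rate, Hz

def median_frequency(emg_window):
    """Median frequency (Hz) of an EMG window, from its Welch power spectrum."""
    f, pxx = welch(emg_window, fs=FS, nperseg=256)
    cumulative = np.cumsum(pxx)
    return f[np.searchsorted(cumulative, 0.5 * cumulative[-1])]

def fatigue_trend(emg_signal, window_s=1.0):
    """Median frequency per consecutive window; a decreasing trend suggests fatigue."""
    n = int(window_s * FS)
    windows = [emg_signal[i:i + n] for i in range(0, len(emg_signal) - n + 1, n)]
    return np.array([median_frequency(w) for w in windows])

# Synthetic check: noise that is low-pass filtered more strongly over time mimics the
# spectral compression seen with fatigue, so the trend should decrease.
rng = np.random.default_rng(0)
emg = np.concatenate([np.convolve(rng.normal(size=int(FS)), np.ones(k) / k, "same")
                      for k in (1, 2, 4, 8)])
print(fatigue_trend(emg))
```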

  1. Ground-based observations coordinated with Viking satellite measurements

    International Nuclear Information System (INIS)

    Opgenoorth, H.J.; Kirkwood, S.

    1989-01-01

    The instrumentation and the orbit of the Viking satellite made this first Swedish satellite mission ideally suited for coordinated observations with the dense network of ground-based stations in northern Scandinavia. Several arrays of complementary instruments, such as magnetometers, all-sky cameras, riometers and Doppler radars, routinely monitored the ionosphere beneath the magnetospheric region traversed by Viking. For a large number of orbits, the Viking passages close to Scandinavia were covered by the operation of specially designed programmes at the European incoherent-scatter facility (EISCAT). First results of coordinated observations on the ground and aboard Viking have shed new light on the most spectacular feature of substorm expansion, the westward-travelling surge. The end of a substorm and the associated decay of a westward-travelling surge have been analysed. EISCAT measurements of high spatial and temporal resolution indicate that the conductivities and electric fields associated with westward-travelling surges are not represented correctly by existing models. (author)

  2. FIB-based measurement of local residual stresses on microsystems

    Science.gov (United States)

    Vogel, Dietmar; Sabate, Neus; Gollhardt, Astrid; Keller, Juergen; Auersperg, Juergen; Michel, Bernd

    2006-03-01

    The paper comprises research results obtained for stress determination on micro- and nanotechnology components. It addresses the concern of controlling stresses introduced into sensors, MEMS and electronic devices during different micromachining processes. The method is based on deformation measurement capabilities available inside focused ion beam (FIB) equipment. When material is removed locally by ion beam milling, existing residual stresses lead to deformation fields around the milled feature. Digital image correlation techniques are used to extract deformation values from micrographs captured before and after milling. In the paper, two main milling features have been analyzed - through-hole and through-slit milling. Analytical solutions for the stress release fields of in-plane stresses have been derived and compared to the respective experimental findings. Their good agreement allows a method to be established for the determination of residual stress values, which is demonstrated for thin membranes manufactured by silicon micro technology. Some emphasis is placed on the elimination of the main error sources for stress determination, such as rigid-body displacements and rotations of the object due to drifts of the experimental conditions under FIB imaging. To illustrate potential application areas of the method, residual stress suppression by ion implantation is evaluated and reported here.
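
    A minimal stand-in for the digital image correlation step, using phase correlation to estimate the sub-pixel displacement between patches "before" and "after" milling; the data are synthetic, and the paper's actual DIC processing (full displacement fields, drift and rigid-body-motion removal) is more elaborate than this sketch.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, shift
from skimage.registration import phase_cross_correlation

# Synthetic "micrograph" patch and a small sub-pixel shift standing in for the
# stress-release displacement observed around a milled feature.
rng = np.random.default_rng(0)
before = gaussian_filter(rng.random((128, 128)), sigma=3)
after = shift(before, (0.6, -1.3))

# Sub-pixel registration of the two patches (shift reported in pixels).
displacement, _, _ = phase_cross_correlation(before, after, upsample_factor=100)
print("estimated displacement (pixels):", displacement)   # ~0.6 and ~1.3 px in magnitude
```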

  3. Transportation performance measures for outcome based system management and monitoring.

    Science.gov (United States)

    2014-09-01

    The Oregon Department of Transportation (ODOT) is mature in its development and use of : performance measures, however there was not a standard approach for selecting measures nor : evaluating if existing ones were used to inform decision-making. Thi...

  4. Dynamic portfolio management based on complex quantile risk measures

    Directory of Open Access Journals (Sweden)

    Ekaterina V. Tulupova

    2011-05-01

    The article focuses on evaluating the effectiveness of combined measures of financial risk, which are convex combinations of the measures VaR and CVaR and their analogues for the right tail of the distribution function of portfolio returns.

  5. A Raspberry Pi Based Portable Endoscopic 3D Measurement System

    Directory of Open Access Journals (Sweden)

    Jochen Schlobohm

    2016-07-01

    Geometry measurements are very important for monitoring a machine part's health and performance. Optical measurement systems have several advantages for the acquisition of a part's geometry: measurement speed, precision, point density and contactless operation. Measuring parts inside assembled machines is also desirable to keep maintenance costs low. The Raspberry Pi is a small and cost-efficient computer that creates new opportunities for compact measurement systems. We have developed a fringe projection system which is capable of measuring in very limited space. A Raspberry Pi 2 is used to generate the projection patterns, acquire the images and reconstruct the geometry. Together with a small LED projector, the measurement system is small and easy to handle. It consists of off-the-shelf products which are nonetheless capable of measuring with an uncertainty of less than 100 μm.
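
    As an illustration of the core computation in a fringe projection system, the sketch below recovers the wrapped phase from four 90-degree phase-shifted patterns; phase unwrapping, calibration and triangulation (not shown) are then needed to obtain actual 3D coordinates, and the images used here are synthetic.

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Wrapped phase (radians) from four 90-degree phase-shifted fringe images."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic test: a known phase map observed through four shifted cosine patterns.
x = np.linspace(0, 4 * np.pi, 640)
phi_true = np.tile(x, (480, 1))
images = [128 + 100 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi = wrapped_phase(*images)
print(np.allclose(phi, np.angle(np.exp(1j * phi_true))))   # True: matches up to wrapping
```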

  6. Estimation of incidences of infectious diseases based on antibody measurements

    DEFF Research Database (Denmark)

    Simonsen, J; Mølbak, K; Falkenhorst, G

    2009-01-01

    bacterial infections. This study presents a Bayesian approach for obtaining incidence estimates by use of measurements of serum antibodies against Salmonella from a cross-sectional study. By comparing these measurements with antibody measurements from a follow-up study of infected individuals...

  7. Radiation risk estimation based on measurement error models

    CERN Document Server

    Masiuk, Sergii; Shklyar, Sergiy; Chepurny, Mykola; Likhtarov, Illya

    2017-01-01

    This monograph discusses statistics and risk estimates applied to radiation damage under the presence of measurement errors. The first part covers nonlinear measurement error models, with a particular emphasis on efficiency of regression parameter estimators. In the second part, risk estimation in models with measurement errors is considered. Efficiency of the methods presented is verified using data from radio-epidemiological studies.

  8. The Impact of Corporate Governance on Financial Performance: (Measured using Accounting and Value-Added based Measures): Evidence from Malaysia

    OpenAIRE

    Abdul Aziz, Khairul Annuar

    2005-01-01

    This paper aims to test empirically which measure, an accounting based financial performance measure such as Return on Equity, Price to Earnings Ratio, Earnings Per Share and Return on Capital Employed; or value-added based financial performance measures such as Economic Value Added and Market Value Added; is more closely related with Corporate Governance Compliance. This paper also aims to study the level of Corporate Governance Compliance of the Smaller Companies listed on the KLSE, the mea...

  9. AATR an ionospheric activity indicator specifically based on GNSS measurements

    Science.gov (United States)

    Juan, José Miguel; Sanz, Jaume; Rovira-Garcia, Adrià; González-Casado, Guillermo; Ibáñez, D.; Perez, R. Orus

    2018-03-01

    This work reviews an ionospheric activity indicator useful for identifying disturbed periods that affect the performance of Global Navigation Satellite Systems (GNSS). The index is based on the Along-Arc TEC Rate (AATR) and can be easily computed from dual-frequency GNSS measurements. The AATR indicator has been assessed over more than one solar cycle (2002-2017) involving about 140 receivers distributed world-wide. Results show that it is well correlated with the ionospheric activity and, unlike other global indicators linked to the geomagnetic activity (i.e. Dst or Ap), it is sensitive to the regional behaviour of the ionosphere and identifies specific effects on GNSS users. Moreover, from a devoted analysis of the performance of Satellite Based Augmentation Systems (SBAS) under different ionospheric conditions, it follows that the AATR indicator is a very suitable means of revealing whether SBAS service availability anomalies are linked to the ionosphere. On this account, the AATR indicator has been selected as the metric to characterise the ionospheric operational conditions in the frame of the European Space Agency activities on the European Geostationary Navigation Overlay System (EGNOS). The AATR index has been adopted as a standard tool by the International Civil Aviation Organization (ICAO) for joint ionospheric studies in SBAS. In this work we explain how the AATR is computed, paying special attention to cycle-slip detection, which is one of the key issues in the AATR computation and is not fully addressed in other indicators such as the Rate Of change of the TEC Index (ROTI). After this explanation we present some of the main conclusions about the ionospheric activity that can be extracted from the AATR values during the above-mentioned long-term study. These conclusions are: (a) the different spatial correlation related with the MOdified DIP (MODIP), which allows one to clearly separate high-, mid- and low-latitude regions, (b) the large spatial correlation in mid

  10. Empirical wind retrieval model based on SAR spectrum measurements

    Science.gov (United States)

    Panfilova, Maria; Karaev, Vladimir; Balandina, Galina; Kanevsky, Mikhail; Portabella, Marcos; Stoffelen, Ad

    ambiguity from polarimetric SAR. A criterion based on the sign of the complex correlation coefficient between the VV and VH signals is applied to select the wind direction. An additional quality control is applied to the wind speed value retrieved with the spectral method: here, we use the direction obtained with the spectral method and the backscattered signal for a CMOD wind speed estimate. The algorithm described above may be refined by the use of more SAR data and wind measurements. In the present preliminary work, the first results of processing SAR images combined with in situ data are presented. Our results are compared to those obtained using the previously developed models CMOD and C-2PO for VH polarization and statistical wind retrieval approaches [1]. Acknowledgments. This work is supported by the Russian Foundation for Basic Research (grant 13-05-00852-a). [1] M. Portabella, A. Stoffelen, J. A. Johannessen, Toward an optimal inversion method for synthetic aperture radar wind retrieval, Journal of Geophysical Research, V. 107, N C8, 2002

  11. Balanced Scorecard Based Performance Measurement & Strategic Management System

    OpenAIRE

    Permatasari, Paulina

    2006-01-01

    Developing strategy and measuring performance are integral parts of a management control system. Making strategic decisions about planning and control requires information on how the different subunits in the organization perform. To be effective, performance measures, both financial and non-financial, must motivate managers and employees at different levels to pursue goal accomplishment and the organization's strategy. An organization's measurement system strongly affects the behavior of people b...

  12. The Importance of Replication in Measurement Research: Using Curriculum-Based Measures with Postsecondary Students with Developmental Disabilities

    Science.gov (United States)

    Hosp, John L.; Ford, Jeremy W.; Huddle, Sally M.; Hensley, Kiersten K.

    2018-01-01

    Replication is a foundation of the development of a knowledge base in an evidence-based field such as education. This study includes two direct replications of Hosp, Hensley, Huddle, and Ford which found evidence of criterion-related validity of curriculum-based measurement (CBM) for reading and mathematics with postsecondary students with…

  13. A Software Behavior Trustworthiness Measurement Method based on Data Mining

    Directory of Open Access Journals (Sweden)

    Yuyu Yuan

    2011-10-01

    carried out in three stages: firstly, defining the concept of trust, software trustworthiness, static and dynamic feature datasets with fundamental calculating criteria; secondly, providing a group of formulas to illustrate congruence measurement approach for comparing the two types of feature datasets; lastly, giving an architecture supported by software trustworthiness measurement algorithm to evaluate conceptualized hierarchical software trustworthiness.

  14. BEAM-BASED MEASUREMENTS OF PERSISTENT CURRENT DECAY IN RHIC

    International Nuclear Information System (INIS)

    FISCHER, W.; JAIN, A.; TEPIKIAN, S.

    2001-01-01

    The two RHIC rings are equipped with superconducting dipole magnets. At injection, induced persistent currents in these magnets lead to a sextupole component. As the persistent currents decay with time, the horizontal and vertical chromaticities change. From magnet measurements of persistent current decays, chromaticity changes in the machine are estimated and compared with chromaticity measurements

  15. Femtosecond frequency comb based distance measurement in air

    NARCIS (Netherlands)

    Balling, P.; Kren, P.; Masika, P.; van den Berg, S.A.

    2009-01-01

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The

  16. Ergonomic measures in construction work: enhancing evidence-based implementation

    NARCIS (Netherlands)

    Visser, S.

    2015-01-01

    Despite the development and availability of ergonomic measures in the construction industry, the number of construction workers reporting high physical work demands remains high. A reduction of the high physical work demands can be achieved by using ergonomic measures. However, these ergonomic

  17. Attitude angular measurement system based on MEMS accelerometer

    Science.gov (United States)

    Luo, Lei

    2014-09-01

    For the purpose of monitoring aircraft attitude, an angular measurement system using a MEMS thermal-convection accelerometer is presented in this study. A two-layer conditioning circuit centered on a single-chip processor is designed and built. Professional display software using the RS232 standard is used to communicate between the sensor and the computer. Calibration experiments were carried out to characterize the measuring system over the range of -90° to +90°. The calibration curves show good linearity with respect to the actual angle. The maximum deviation occurs at 90°, where the value is 2.8°. The maximum error is 1.6% and the repeatability is measured to be 2.1%. The experiments proved that the developed measurement system is capable of measuring attitude angles.
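
    As a simple illustration of the underlying geometry, a static tilt angle in the ±90° range can be recovered from a single accelerometer axis with an arcsine relation; the sketch below is an assumption for illustration and does not reflect the authors' conditioning circuit or firmware.

      import math

      def tilt_angle_deg(a_axis, g=9.81):
          """Static tilt angle from one accelerometer axis reading [m/s^2]."""
          ratio = max(-1.0, min(1.0, a_axis / g))  # clamp to guard against sensor noise
          return math.degrees(math.asin(ratio))    # result in the range -90..+90 degrees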

  18. Posterior variability of inclusion shape based on tomographic measurement data

    International Nuclear Information System (INIS)

    Watzenig, Daniel; Fox, Colin

    2008-01-01

    We treat the problem of recovering the unknown shape of a single inclusion with unknown constant permittivity in an otherwise uniform background material, from uncertain measurements of trans-capacitance at electrodes outside the material. The ubiquitous presence of measurement noise implies that the practical measurement process is probabilistic, and the inverse problem is naturally stated as statistical inference. Formulating the inverse problem in a Bayesian inferential framework requires accurately modelling the forward map, measurement noise, and specifying a prior distribution for the cross-sectional material distribution. Numerical implementation of the forward map is via the boundary element method (BEM) taking advantage of a piecewise constant representation. Summary statistics are calculated using MCMC sampling to characterize posterior variability for synthetic and measured data sets.

  19. [A novel biologic electricity signal measurement based on neuron chip].

    Science.gov (United States)

    Lei, Yinsheng; Wang, Mingshi; Sun, Tongjing; Zhu, Qiang; Qin, Ran

    2006-06-01

    Neuron chip is a multiprocessor with three pipeline CPU; its communication protocol and control processor are integrated in effect to carry out the function of communication, control, attemper, I/O, etc. A novel biologic electronic signal measurement network system is composed of intelligent measurement nodes with neuron chip at the core. In this study, the electronic signals such as ECG, EEG, EMG and BOS can be synthetically measured by those intelligent nodes, and some valuable diagnostic messages are found. Wavelet transform is employed in this system to analyze various biologic electronic signals due to its strong time-frequency ability of decomposing signal local character. Better effect is gained. This paper introduces the hardware structure of network and intelligent measurement node, the measurement theory and the signal figure of data acquisition and processing.

  20. Femtosecond frequency comb based distance measurement in air.

    Science.gov (United States)

    Balling, Petr; Kren, Petr; Masika, Pavel; van den Berg, S A

    2009-05-25

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The relative agreement for distance measurement in known laboratory conditions is better than 10(-7). According to the model, similar precision seems feasible even for long-distance measurement in air if conditions are sufficiently known. It is demonstrated that the relative width of the interferogram envelope even decreases with the measured length, and a fringe contrast higher than 90% could be obtained for kilometer distances in air, if optimal spectral width for that length and wavelength is used. The possibility of comb radiation delivery to the interferometer by an optical fiber is shown by model and experiment, which is important from a practical point of view.

  1. Cross-cultural differences in memory: the role of culture-based stereotypes about aging.

    Science.gov (United States)

    Yoon, C; Hasher, L; Feinberg, F; Rahhal, T A; Winocur, G

    2000-12-01

    The extent to which cultural stereotypes about aging contribute to age differences in memory performance is investigated by comparing younger and older Anglophone Canadians to demographically matched Chinese Canadians, who tend to hold more positive views of aging. Four memory tests were administered. In contrast to B. Levy and E. Langer's (1994) findings, younger adults in both cultural groups outperformed their older comparison group on all memory tests. For 2 tests, which made use of visual stimuli resembling ideographic characters in written Chinese, the older Chinese Canadians approached, but did not reach, the performance achieved by their younger counterparts, as well as outperformed the older Anglophone Canadians. However, on the other two tests, which assess memory for complex figures and abstract designs, no differences were observed between the older Chinese and Anglophone Canadians. Path analysis results suggest that this pattern of findings is not easily attributed to a wholly culturally based account of age differences in memory performance.

  2. Statistical characteristics of surrogate data based on geophysical measurements

    Directory of Open Access Journals (Sweden)

    V. Venema

    2006-01-01

    Full Text Available In this study, the statistical properties of a range of measurements are compared with those of their surrogate time series. Seven different records are studied, amongst others, historical time series of mean daily temperature, daily rain sums and runoff from two rivers, and cloud measurements. Seven different algorithms are used to generate the surrogate time series. The best-known method is the iterative amplitude adjusted Fourier transform (IAAFT algorithm, which is able to reproduce the measured distribution as well as the power spectrum. Using this setup, the measurements and their surrogates are compared with respect to their power spectrum, increment distribution, structure functions, annual percentiles and return values. It is found that the surrogates that reproduce the power spectrum and the distribution of the measurements are able to closely match the increment distributions and the structure functions of the measurements, but this often does not hold for surrogates that only mimic the power spectrum of the measurement. However, even the best performing surrogates do not have asymmetric increment distributions, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found deviations of the structure functions on small scales.
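
    A compact sketch of the IAAFT idea mentioned above (iteratively imposing the measured power spectrum and the measured amplitude distribution) is given below; the fixed iteration count and the absence of a convergence test are simplifying assumptions.

      import numpy as np

      def iaaft_surrogate(x, n_iter=100, seed=0):
          """Iterative amplitude adjusted Fourier transform surrogate (sketch)."""
          rng = np.random.default_rng(seed)
          sorted_x = np.sort(x)                    # target amplitude distribution
          target_amp = np.abs(np.fft.rfft(x))      # target Fourier amplitudes
          s = rng.permutation(x)                   # random initial shuffle
          for _ in range(n_iter):
              # Impose the target spectrum while keeping the current phases
              phases = np.angle(np.fft.rfft(s))
              s = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(x))
              # Impose the target amplitude distribution by rank ordering
              s = sorted_x[np.argsort(np.argsort(s))]
          return s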

  3. Measurements of DSD Second Moment Based on Laser Extinction

    Science.gov (United States)

    Lane, John E.; Jones, Linwood; Kasparis, Takis C.; Metzger, Philip

    2013-01-01

    Using a technique recently developed for estimating the density of surface dust dispersed during a rocket landing, measuring the extinction of a laser passing through rain (or dust, in the rocket case) yields an estimate of the 2nd moment of the particle cloud, and of the rainfall drop size distribution (DSD) in the terrestrial meteorological case. With the exception of disdrometers, instruments that measure rainfall make only indirect measurements of the DSD. The most common of these instruments is the rainfall rate gauge, measuring the 1 1/3th moment (when using a D^(2/3) dependency on terminal velocity). Instruments that scatter microwaves off hydrometeors, such as the WSR-88D, vertical wind profilers, and microwave disdrometers, measure the 6th moment of the DSD. By projecting a laser onto a target, changes in brightness of the laser spot against the target background during rain yield a measurement of the DSD 2nd moment via the Beer-Lambert law. In order to detect the laser attenuation within the 8-bit resolution of most camera image arrays, a minimum path length is required, depending on the rainfall intensity. For moderate to heavy rainfall, a laser path length of 100 m is sufficient to measure variations in optical extinction using a digital camera. A photo-detector could replace the camera for automated installations. In order to spatially correlate the 2nd moment measurements with a collocated disdrometer or tipping bucket, the laser's beam path can be folded multiple times using mirrors to restrict the spatial extent of the measurement. In cases where a disdrometer is not available, complete DSD estimates can be produced by parametric fitting of a DSD model to the 2nd moment data in conjunction with tipping bucket data. In cases where a disdrometer is collocated, the laser extinction technique may yield a significant improvement to in situ disdrometer validation and calibration strategies.
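
    Quantitatively, the relation sketched in the abstract follows from the Beer-Lambert law: the measured transmittance over a known path gives the extinction coefficient, which for drops much larger than the wavelength is proportional to the 2nd DSD moment. The extinction efficiency of 2 and the variable names below are illustrative assumptions.

      import math

      def dsd_second_moment(i_ratio, path_length_m, q_ext=2.0):
          """Estimate the 2nd moment of the drop size distribution (sketch).

          i_ratio       : measured intensity ratio I/I0 of the laser spot during rain
          path_length_m : optical path length through the rain [m]
          q_ext         : assumed extinction efficiency (~2 for drops >> wavelength)
          """
          # Beer-Lambert: I/I0 = exp(-sigma * L)  ->  sigma = -ln(I/I0) / L
          sigma = -math.log(i_ratio) / path_length_m
          # Geometric optics: sigma = (pi/4) * q_ext * M2, so M2 = 4*sigma/(pi*q_ext)
          return 4.0 * sigma / (math.pi * q_ext)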

  4. Neutrosophic Cubic MCGDM Method Based on Similarity Measure

    Directory of Open Access Journals (Sweden)

    Surapati Pramanik

    2017-06-01

    Full Text Available The notion of neutrosophic cubic set is originated from the hybridization of the concept of neutrosophic set and interval valued neutrosophic set. We define similarity measure for neutrosophic cubic sets and prove some of its basic properties.

  5. Ground-based spectral measurements of solar radiation, (2)

    International Nuclear Information System (INIS)

    Murai, Keizo; Kobayashi, Masaharu; Goto, Ryozo; Yamauchi, Toyotaro

    1979-01-01

    A newly designed spectro-pyranometer was used for the measurement of the global (direct + diffuse) and the diffuse sky radiation reaching the ground. By the subtraction of the diffuse component from the global radiation, we got the direct radiation component which leads to the spectral distribution of the optical thickness (extinction coefficient) of the turbid atmosphere. The measurement of the diffuse sky radiation reveals the scattering effect of aerosols and that of the global radiation allows the estimation of total attenuation caused by scattering and absorption of aerosols. The effects of the aerosols are represented by the deviation of the real atmosphere measured from the Rayleigh atmosphere. By the combination of the measured values with those obtained by theoretical calculation for the model atmosphere, we estimated the amount of absorption by the aerosols. Very strong absorption in the ultraviolet region was recognized. (author)

  6. An Antenna Measurement System Based on Optical Feeding

    Directory of Open Access Journals (Sweden)

    Ryohei Hosono

    2013-01-01

    the advantage of the system is demonstrated by measuring an ultra-wideband (UWB antenna both by the optical and electrical feeding systems and comparing with a calculated result. Ripples in radiation pattern due to the electrical feeding are successfully suppressed by the optical feeding. For example, in a radiation measurement on the azimuth plane at 3 GHz, ripple amplitude of 1.0 dB that appeared in the electrical feeding is reduced to 0.3 dB. In addition, a circularly polarized (CP antenna is successfully measured by the proposed system to show that the system is available not only for amplitude but also phase measurements.

  7. TOWARDS MEASURES OF INTELLIGENCE BASED ON SEMIOTIC CONTROL

    Energy Technology Data Exchange (ETDEWEB)

    C. JOSLYN

    2000-08-01

    We address the question of how to identify and measure the degree of intelligence in systems. We define the presence of intelligence as equivalent to the presence of a control relation. We contrast the distinct atomic semiotic definitions of models and controls, and discuss hierarchical and anticipatory control. We conclude with a suggestion about moving towards quantitative measures of the degree of such control in systems.

  8. A Vision-Based Sensor for Noncontact Structural Displacement Measurement

    Science.gov (United States)

    Feng, Dongming; Feng, Maria Q.; Ozer, Ekin; Fukuda, Yoshio

    2015-01-01

    Conventional displacement sensors have limitations in practical applications. This paper develops a vision sensor system for remote measurement of structural displacements. An advanced template matching algorithm, referred to as the upsampled cross correlation, is adopted and further developed into a software package for real-time displacement extraction from video images. By simply adjusting the upsampling factor, better subpixel resolution can be easily achieved to improve the measurement accuracy. The performance of the vision sensor is first evaluated through a laboratory shaking table test of a frame structure, in which the displacements at all the floors are measured by using one camera to track either high-contrast artificial targets or low-contrast natural targets on the structural surface such as bolts and nuts. Satisfactory agreements are observed between the displacements measured by the single camera and those measured by high-performance laser displacement sensors. Then field tests are carried out on a railway bridge and a pedestrian bridge, through which the accuracy of the vision sensor in both time and frequency domains is further confirmed in realistic field environments. Significant advantages of the noncontact vision sensor include its low cost, ease of operation, and flexibility to extract structural displacement at any point from a single measurement. PMID:26184197
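
    As an illustration of the subpixel matching step, one off-the-shelf route is upsampled phase cross-correlation as implemented in scikit-image; this stands in for, and is not identical to, the authors' software package, and the scale factor from pixels to physical units is assumed to come from a known target dimension.

      import numpy as np
      from skimage.registration import phase_cross_correlation

      def frame_displacement(ref_roi, cur_roi, upsample_factor=100):
          """Subpixel shift of a tracked target between two video frames (sketch).

          ref_roi, cur_roi : 2-D grayscale arrays containing the tracked target
          upsample_factor  : larger values give finer subpixel resolution
          Returns (dy, dx) in pixels; multiply by a mm/pixel scale factor obtained
          from the known target size to get physical displacement.
          """
          shift, _error, _phase = phase_cross_correlation(
              ref_roi, cur_roi, upsample_factor=upsample_factor)
          return float(shift[0]), float(shift[1])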

  9. Web-Based Gerontology Courses: How Do They Measure Up?

    Science.gov (United States)

    Hills, William E.; Brallier, Sara A.; Palm, Linda J.; Graham, Jamie M.

    2009-01-01

    This study compared Web-based and lecture-based Gerontology and Psychology of Aging courses in terms of student performance, demographic and academic characteristics of students enrolled in the courses, and extent to which these characteristics differentially predicted outcomes of learning in the two course types. Participants for this study were…

  10. Method and apparatus of a portable imaging-based measurement with self calibration

    Science.gov (United States)

    Chang, Tzyy-Shuh [Ann Arbor, MI; Huang, Hsun-Hau [Ann Arbor, MI

    2012-07-31

    A portable imaging-based measurement device is developed to perform 2D projection based measurements on an object that is difficult or dangerous to access. This device is equipped with self calibration capability and built-in operating procedures to ensure proper imaging based measurement.

  11. Bite force measurement based on fiber Bragg grating sensor

    Science.gov (United States)

    Padma, Srivani; Umesh, Sharath; Asokan, Sundarrajan; Srinivas, Talabattula

    2017-10-01

    The maximum level of voluntary bite force, which results from the combined action of the muscles of mastication, joints, and teeth, i.e., the craniomandibular structure, is considered one of the major indicators of the functional state of the masticatory system. Measurement of voluntary bite force provides useful data on jaw muscle function and activity, along with assessment of prosthetics. This study proposes an in vivo methodology for the dynamic measurement of bite force employing a fiber Bragg grating (FBG) sensor, referred to as the bite force measurement device (BFMD). The BFMD developed is a noninvasive intraoral device, which transduces the bite force exerted at the occlusal surface into strain variations on a metal plate. These strain variations are acquired by the FBG sensor bonded over it. The BFMD developed facilitates adjustment of the distance between the biting platforms, which is essential to capture the maximum voluntary bite force at three different tooth positions, namely the incisor, premolar, and molar sites. The clinically relevant bite forces measured at the incisor, molar, and premolar positions have been compared against each other. Furthermore, the bite forces measured for all subjects are segregated according to gender and also compared against each other.

  12. Mobile platform of altitude measurement based on a smartphone

    Science.gov (United States)

    Roszkowski, Paweł; Kowalczyk, Marcin

    2016-09-01

    The article presents a low-cost, fully functional meter of altitude and pressure changes in the form of a mobile application running on the Android OS (operating system). The measurements are possible thanks to the pressure sensor built into the majority of modern mobile phones, known as smartphones. Using their computing capabilities and other components, such as the GPS receiver, in connection with data from the sensor enabled the authors to create a sophisticated handheld measuring platform with many unique features. One of them is an altitude-map drawing mode, in which the user can create maps of altitude changes simply by moving around the examined area. Another is a convenient mode for altitude measurement. The application is also extended with analysis tools which provide the possibility to compare measured values by displaying the data in the form of plots. The platform includes an external backup server, where the user can secure all gathered data. Moreover, the results of the accuracy examination carried out after building the solution are shown. Finally, the realized altitude meter is compared to other popular altimeters currently available on the market.
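
    The pressure-to-altitude conversion that such an application relies on is commonly the international barometric formula; the sketch below uses standard-atmosphere constants and is an assumption about the implementation, not taken from the article.

      def altitude_m(pressure_hpa, sea_level_hpa=1013.25):
          """Altitude from barometric pressure (international barometric formula)."""
          return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

      # Relative altitude change between two readings, e.g. climbing one floor
      delta_h = altitude_m(1007.9) - altitude_m(1008.3)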

  13. Distance-Based Functional Diversity Measures and Their Decomposition: A Framework Based on Hill Numbers

    Science.gov (United States)

    Chiu, Chun-Huo; Chao, Anne

    2014-01-01

    Hill numbers (or the “effective number of species”) are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify “the effective number of equally abundant and (functionally) equally distinct species” in an assemblage. We also propose a class of mean functional diversity (per species), which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total) functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity) quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma) can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation) measures, including N-assemblage functional generalizations of
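
    A hedged numerical sketch of the quantities described above is given below, assuming the conventional definition of Rao's quadratic entropy Q and an order-q functional Hill number of the form (sum over species pairs of (d_ij/Q)(p_i p_j)^q)^(1/(2(1-q))); the exact exponent conventions should be checked against the paper before any real use.

      import numpy as np

      def functional_diversity(p, d, q=2.0):
          """Functional Hill number, mean and total functional diversity (sketch).

          p : relative abundances (summing to 1)
          d : symmetric matrix of pairwise functional distances
          q : diversity order (q != 1 in this sketch)
          """
          p = np.asarray(p, dtype=float)
          d = np.asarray(d, dtype=float)
          pp = np.outer(p, p)
          Q = np.sum(d * pp)                                  # Rao's quadratic entropy
          D = (np.sum(d / Q * pp ** q)) ** (1.0 / (2.0 * (1.0 - q)))  # assumed form
          MD = D * Q                                          # mean functional diversity
          FD = D * MD                                         # total functional diversity
          return D, MD, FD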

  14. Distance-based functional diversity measures and their decomposition: a framework based on Hill numbers.

    Directory of Open Access Journals (Sweden)

    Chun-Huo Chiu

    Full Text Available Hill numbers (or the "effective number of species" are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify "the effective number of equally abundant and (functionally equally distinct species" in an assemblage. We also propose a class of mean functional diversity (per species, which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation measures, including N-assemblage functional

  15. A Pseudorange Measurement Scheme Based on Snapshot for Base Station Positioning Receivers.

    Science.gov (United States)

    Mo, Jun; Deng, Zhongliang; Jia, Buyun; Bian, Xinmei

    2017-12-01

    The digital multimedia broadcasting signal is a promising candidate for wireless positioning. This paper mainly studies a multimedia broadcasting technology, named China mobile multimedia broadcasting (CMMB), in the context of positioning. Theoretical and practical analysis of the CMMB signal suggests that the existing CMMB signal does not have meter-level positioning capability. The CMMB system has therefore been modified to achieve meter-level positioning capability by multiplexing the CMMB signal and pseudo codes in the same frequency band. The time difference of arrival (TDOA) estimation method is used in base station positioning receivers. Due to the influence of a complex fading channel and the limited bandwidth of receivers, the conventional tracking method based on pseudo-code ranging has difficulty providing continuous and accurate TDOA estimates. A pseudorange measurement scheme based on snapshot is proposed to solve this problem. The algorithm extracts the TDOA estimate from stored signal fragments, and utilizes the Taylor expansion of the autocorrelation function to improve the TDOA estimation accuracy. Monte Carlo simulations and real data tests show that the proposed algorithm can significantly reduce the TDOA estimation error for base station positioning receivers, so that the modified CMMB system achieves meter-level positioning accuracy.
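
    A minimal sketch of the correlation step underlying the TDOA estimate: correlate the received baseband samples with a local replica of the pseudo code and take the lag of the correlation peak. The Taylor-expansion refinement of the autocorrelation function described in the abstract is omitted here, and the variable names are assumptions.

      import numpy as np

      def code_delay_samples(received, replica):
          """Coarse time-of-arrival estimate by cross-correlation (sketch).

          received : baseband samples containing the pseudo code
          replica  : locally generated pseudo-code replica
          Returns the delay in samples of the correlation peak; divide by the
          sampling rate and multiply by the speed of light for a pseudorange.
          """
          corr = np.abs(np.correlate(received, replica, mode="full"))
          peak = int(np.argmax(corr))
          return peak - (len(replica) - 1)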

  16. Toward autonomous measurements of photosynthetic electron transport rates: An evaluation of active fluorescence-based measurements of photochemistry

    NARCIS (Netherlands)

    Silsbe, G.M.; Oxborough, K.; Suggett, D.J.; Forster, R.M.; Ihnken, S.; Komárek, O.; Lawrenz, E.; Prášil, O.; Röttgers, R.; Šicner, M.; Simis, S.G.H.; Van Dijk, M.A.; Kromkamp, J.C.

    2015-01-01

    This study presents a methods evaluation and intercalibration of active fluorescence-based measurements of the quantum yield and the absorption coefficient of photosystem II (PSII) photochemistry. Measurements of these two quantities and of irradiance (E) can be

  17. Measurement properties of performance-based measures to assess physical function in hip and knee osteoarthritis: a systematic review

    NARCIS (Netherlands)

    Dobson, F.; Hinman, R.S.; Leverstein-van Hall, M.A.; Terwee, C.B.; Roos, E.M.; Bennell, K.L.

    2012-01-01

    Objectives: To systematically review the measurement properties of performance-based measures to assess physical function in people with hip and/or knee osteoarthritis (OA). Methods: Electronic searches were performed in MEDLINE, CINAHL, Embase, and PsycINFO up to the end of June 2012. Two reviewers

  18. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    Full Text Available An informal intercomparison of two isoprene (C5H8 measurement techniques was carried out during Fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS was compared to a well-established gas chromatographic technique (GC. The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area as well as an additional interference occurring in clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast. Especially after introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  19. A generalized complexity measure based on Rényi entropy

    Science.gov (United States)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or patterns, far beyond the entropy measures. They are intuitively constructed to be minima at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not minimal for maximal randomness in general. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
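
    For a discrete distribution, Rényi entropies are straightforward to compute, and LMC-Rényi-type complexities combine two of them. The specific combination used below, exp(R_alpha - R_beta) with alpha < beta, is an illustrative assumption and not necessarily the generalisation proposed in this paper.

      import numpy as np

      def renyi_entropy(p, alpha):
          """Renyi entropy of order alpha (alpha != 1) for a discrete distribution."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

      def lmc_renyi_complexity(p, alpha=0.5, beta=2.0):
          """Assumed LMC-Renyi-type complexity: exp(R_alpha - R_beta), alpha < beta."""
          return np.exp(renyi_entropy(p, alpha) - renyi_entropy(p, beta))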

  20. Template measurement for plutonium pit based on neural networks

    International Nuclear Information System (INIS)

    Zhang Changfan; Gong Jian; Liu Suping; Hu Guangchun; Xiang Yongchun

    2012-01-01

    Template measurement for a plutonium pit extracts characteristic data from the γ-ray spectrum and the neutron counts emitted by the plutonium. The characteristic data of the suspicious object are compared with data from the declared plutonium pit to verify whether they are of the same type. In this paper, neural networks are employed as the comparison algorithm for template measurement of plutonium pits. Two kinds of neural networks are created, i.e. the BP and LVQ neural networks. They are applied to different aspects of the template measurement and identification. The BP neural network is used for classification of different types of plutonium pits, which is often needed for the management of nuclear materials. The LVQ neural network is used for comparison of inspected objects with the declared one, which is usually applied in the field of nuclear disarmament and verification. (authors)

  1. Micro-Structure Measurement and Imaging Based on Digital Holography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyeong Suk; Jung, Hyun Chul; Chang, Ho Seob; Akhter, Naseem [Chosun University, Gwangju (Korea, Republic of); Kee, Chang Doo [Chonnam National University, Gwangju (Korea, Republic of)

    2010-06-15

    Advancements in imaging and computing technology have opened the path to digital holography for non-destructive investigations of technical samples, material property measurement, vibration analysis, flow visualization and stress analysis in the aerospace industry, which has widened the application of digital holography in the above fields. In this paper, we demonstrate the non-destructive investigation and micro-structure measurement application of digital holography on small particles and a biological sample. This paper gives a brief description of the digital holograms recorded with this system, and the results are illustratively demonstrated.

  2. Estimation of piping temperature fluctuations based on external strain measurements

    International Nuclear Information System (INIS)

    Morilhat, P.; Maye, J.P.

    1993-01-01

    Due to the difficulty of carrying out measurements on the inner side of nuclear reactor piping subjected to thermal transients, temperature and stress variations in the pipe walls are estimated by means of external thermocouples and strain gauges. This inverse problem is solved by spectral analysis. Since the wall harmonic transfer function (the response to a harmonic load) is known, the inner-side signal is obtained by convolution of the inverse transfer function of the system with the strain measurement; this enables detection of internal temperature fluctuations in a frequency range beyond the scope of the thermocouples. (authors). 5 figs., 3 refs

  3. Micro-Structure Measurement and Imaging Based on Digital Holography

    International Nuclear Information System (INIS)

    Kim, Kyeong Suk; Jung, Hyun Chul; Chang, Ho Seob; Akhter, Naseem; Kee, Chang Doo

    2010-01-01

    Advancements in imaging and computing technology have opened the path to digital holography for non-destructive investigations of technical samples, material property measurement, vibration analysis, flow visualization and stress analysis in the aerospace industry, which has widened the application of digital holography in the above fields. In this paper, we demonstrate the non-destructive investigation and micro-structure measurement application of digital holography on small particles and a biological sample. This paper gives a brief description of the digital holograms recorded with this system, and the results are illustratively demonstrated.

  4. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi; Sun, Ying

    2016-01-01

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widens their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.
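
    A condensed sketch of the monitoring idea: fit PCA to normal operating data, project new observations onto the retained components, and run a MEWMA recursion on the scores. The library calls, the smoothing constant, and the use of scores rather than residuals are illustrative assumptions, not the authors' exact scheme.

      import numpy as np
      from sklearn.decomposition import PCA

      def pca_mewma(train, stream, n_components=3, lam=0.2):
          """PCA scores monitored with a MEWMA statistic (sketch)."""
          pca = PCA(n_components=n_components).fit(train)
          scores = pca.transform(stream)
          cov = np.cov(pca.transform(train), rowvar=False)
          cov_z = (lam / (2.0 - lam)) * cov          # asymptotic MEWMA covariance
          inv_cov_z = np.linalg.inv(cov_z)
          z = np.zeros(n_components)
          t2 = []
          for x in scores:
              z = lam * x + (1.0 - lam) * z          # MEWMA recursion
              t2.append(float(z @ inv_cov_z @ z))    # T^2-type monitoring statistic
          return np.array(t2)                        # compare against a control limit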

  5. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi

    2016-06-13

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widens their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.

  6. Deterministic Predictions of Vessel Responses Based on Past Measurements

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2017-01-01

    The paper deals with a prediction procedure from which global wave-induced responses can be deterministically predicted a short time, 10-50 s, ahead of current time. The procedure relies on the autocorrelation function and takes into account prior measurements only; i.e. knowledge about wave...

  7. Measurement-based local quantum filters and their ability to ...

    Indian Academy of Sciences (India)

    Debmalya Das

    2017-05-30

    May 30, 2017 ... Entanglement; local filters; quantum measurement. PACS No. 03.65 ... ties [4,5], it also plays a key role in quantum computing where it is ... Furthermore, we provide an ... Corresponding to each of these vectors, we can con-.

  8. Comet: An internet based platform for education in measurement

    NARCIS (Netherlands)

    Regtien, Paulus P.L.; Halaj, Martin; Kureková, Eva; Gabko, Peter

    2005-01-01

    The project COMET provides a multimedia training package for metrology and measurement. The package is developed by a consortium of 10 institutes from 7 European countries. It consists of 31 modules, each dealing with a particular aspect of metrology, and is available in English, German, French and

  9. COMET: A multimedia internet based platform for education in measurement

    NARCIS (Netherlands)

    Grattan, K.T.V.; Regtien, Paulus P.L.; Halaj, M; Kureková, E.; Gabko, P

    2006-01-01

    The project COMET provides a multimedia training package for metrology and measurement. The package is developed by a consortium of 10 institutes from 7 European countries. It consists of 31 modules, each dealing with a particular aspect of metrology, and is available in English, German, French and

  10. TRIM timber projections: an evaluation based on forest inventory measurements.

    Science.gov (United States)

    John R. Mills

    1989-01-01

    Two consecutive timberland inventories collected from permanent plots in the natural pine type in North Carolina were used to evaluate the timber resource inventory model (TRIM). This study compares model predictions with field measurements and examines the effect of inventory data aggregation on the accuracy of projections. Projections were repeated for two geographic...

  11. Psycho-Pedagogical Measuring Bases of Educational Competences of Students

    Science.gov (United States)

    Kenzhegaliev, Kulush K.; Shayakhmetova, Aisulu A.; Zulkarnayeva, Zhamila A.; Iksatova, Balzhan K.; Shonova, Bakytgul A.

    2016-01-01

    The relevance of the research problem is conditioned by the weak development of measurement, assessment of educational competences at an operational level, at the level of actions, by insufficient applications of psycho-pedagogical theories and methods of mathematical statistics. The aim of the work is to develop through teaching experiments the…

  12. Quality measures for HRR alignment based ISAR imaging algorithms

    CSIR Research Space (South Africa)

    Janse van Rensburg, V

    2013-05-01

    Full Text Available Some Inverse Synthetic Aperture Radar (ISAR) algorithms form the image in a two-step process of range alignment and phase conjugation. This paper discusses a comprehensive set of measures used to quantify the quality of range alignment, with the aim...

  13. Measuring Clearance Mechanics Based on Dynamic Leg Length

    Science.gov (United States)

    Khamis, Sam; Danino, Barry; Hayek, Shlomo; Carmeli, Eli

    2018-01-01

    The aim of this study was to quantify clearance mechanics during gait. Seventeen children diagnosed with hemiplegic cerebral palsy underwent a three-dimensional gait analysis evaluation. Dynamic leg lengths were measured from the hip joint center to the heel, to the ankle joint center and to the forefoot throughout the gait cycle. Significant…

  14. Microflown based monopole sound sources for reciprocal measurements

    NARCIS (Netherlands)

    Bree, H.E. de; Basten, T.G.H.

    2008-01-01

    Monopole sound sources (i.e. omnidirectional sound sources with a known volume velocity) are essential for reciprocal measurements used in vehicle interior panel noise contribution analysis. Until recently, these monopole sound sources used a sound pressure transducer as a reference sensor. A

  15. Image-Based Collection and Measurements for Construction Pay Items

    Science.gov (United States)

    2017-07-01

    Prior to each payment to contractors and suppliers, measurements are made to document the actual amount of pay items placed at the site. This manual process has substantial risk for personnel, and could be made more efficient and less prone to human ...

  16. Coordination of two robot manipulators based on position measurements only

    NARCIS (Netherlands)

    Rodriguez Angeles, A.; Nijmeijer, H.

    2001-01-01

    In this note we propose a controller that solves the problem of coordination of two (or more) robots, under a master-slave scheme, in the case when only position measurements are available. The controller consists of a feedback control law, and two non-linear observers. It is shown that the

  17. Height estimations based on eye measurements throughout a gait cycle

    DEFF Research Database (Denmark)

    Yang, Sylvia X M; Larsen, Peter K; Alkjær, Tine

    2014-01-01

    (EH) measurement, on the other hand, is less prone to concealment. The purpose of the present study was to investigate: (1) how the eye height varies during the gait cycle, and (2) how the eye height changes with head position. The eyes were plotted manually in APAS for 16 test subjects during...

  18. Basing of a complex design measures for protection against fire

    International Nuclear Information System (INIS)

    Kryuger, V.

    1983-01-01

    Fire impact on NPP radiation safety is analyzed. The general industry requirements to the protection system against fire are shown to be insufficient for NPPs. A complex of protection measures against fire is suggested that should be taken into account in the NPP designs [ru

  19. Simulated BRDF based on measured surface topography of metal

    Science.gov (United States)

    Yang, Haiyue; Haist, Tobias; Gronle, Marc; Osten, Wolfgang

    2017-06-01

    The radiative reflective properties of a calibration-standard rough surface were simulated by ray tracing and the finite-difference time-domain (FDTD) method. The simulation results have been used to compute the bidirectional reflectance distribution functions (BRDFs) of metal surfaces and have been compared with experimental measurements. The experimental and simulated results are in good agreement.

  20. Recruitment recommendation system based on fuzzy measure and indeterminate integral

    Science.gov (United States)

    Yin, Xin; Song, Jinjie

    2017-08-01

    In this study, we propose a comprehensive evaluation approach based on indeterminate integral. By introducing the related concepts of indeterminate integral and their formulas into the recruitment recommendation system, we can calculate the suitability of each job for different applicants with the defined importance for each criterion listed in the job advertisements, the association between different criteria and subjective assessment as the prerequisite. Thus we can make recommendations to the applicants based on the score of the suitability of each job from high to low. In the end, we will exemplify the usefulness and practicality of this system with samples.

  1. Remote measurement of microwave distribution based on optical detection

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Zhong; Ding, Wenzheng; Yang, Sihua; Chen, Qun, E-mail: redrocks-chenqun@hotmail.com, E-mail: xingda@scnu.edu.cn; Xing, Da, E-mail: redrocks-chenqun@hotmail.com, E-mail: xingda@scnu.edu.cn [MOE Key Laboratory of Laser Life Science and Institute of Laser Life Science, South China Normal University, Guangzhou 510631 (China)

    2016-01-04

    In this letter, we present the development of a remote microwave measurement system. The method employs an arc discharge lamp that serves as an energy converter from microwave to visible light, which can propagate without a transmission medium. Observed with a charge-coupled device, a quantitative microwave power distribution can be obtained while the operators and electronic instruments remain at a distance from the high-power region, reducing the potential risk. We perform the experiments using pulsed microwaves, and the results show that the system response is dependent on the microwave intensity over a certain range. Most importantly, the microwave distribution can be monitored in real time by optical observation of the response of a one-dimensional lamp array. The characteristics of low cost, a wide detection bandwidth, remote measurement, and room-temperature operation make the system a preferred detector for microwave applications.

  2. [Automated measurement of distance vision based on the DIN strategy].

    Science.gov (United States)

    Effert, R; Steinmetz, H; Jansen, W; Rau, G; Reim, M

    1989-07-01

    A method for automated measurement of far vision is described which meets the test requirements laid down in the new DIN standards. The subject sits 5 m from a high-resolution monitor on which either Landolt rings or Snellen's types are generated by a computer. By moving a joystick the subject indicates to the computer whether he can see the critical detail (e.g., the direction of opening of the Landolt ring). Depending on the subject's input and the course of the test so far, the computer generates the next test symbol until the threshold criterion is reached. The sequence of presentation of the symbols and the threshold criterion are also in accordance with the DIN standard. Initial measurements of far vision using this automated system produced similar results to those obtained by conventional methods.

  3. A new consensus measure based on Pearson correlation coefficient

    OpenAIRE

    Chiclana, Francisco; Gonzalez-Arteaga, Teresa; de Andres Calle, Rocio

    2016-01-01

    Obtaining consensual solutions is an important issue in decision-making processes. It depends on several factors such as experts' opinions, principles, knowledge, experience, etc. In the literature we can find a considerable number of consensus measures from different research areas (from a Social Choice perspective: Alcalde-Unzu and Vorsatz [1], Alcantud, de Andres Calle and Cascon [2] and Bosch [3], among others; and from Decision Making Theory: Gonzalez-Arteaga, Alcantud and ...

  4. EPR-based distance measurements at ambient temperature

    Science.gov (United States)

    Krumkacheva, Olesya; Bagryanskaya, Elena

    2017-07-01

    Pulsed dipolar (PD) EPR spectroscopy is a powerful technique allowing for distance measurements between spin labels in the range of 2.5-10.0 nm. It was proposed more than 30 years ago, and nowadays is widely used in biophysics and materials science. Until recently, PD EPR experiments were limited to cryogenic temperatures (T biomolecules, the influence of a linker between the spin probe and biomolecule, and future opportunities.

  5. Real-time temperature field measurement based on acoustic tomography

    International Nuclear Information System (INIS)

    Bao, Yong; Jia, Jiabin; Polydorides, Nick

    2017-01-01

    Acoustic tomography can be used to measure the temperature field from the time-of-flight (TOF). In order to capture real-time temperature field changes and accurately yield quantitative temperature images, two improvements to the conventional acoustic tomography system are studied: simultaneous acoustic transmission and TOF collection along multiple ray paths, and an offline iteration reconstruction algorithm. During system operation, all the acoustic transceivers send modulated and filtered wideband Kasami sequences simultaneously to facilitate fast and accurate TOF measurements using cross-correlation detection. For image reconstruction, the iteration process is separated and executed offline beforehand to shorten computation time for online temperature field reconstruction. The feasibility and effectiveness of the developed methods are validated in the simulation study. The simulation results demonstrate that the proposed method can reduce the processing time per frame from 160 ms to 20 ms, while the reconstruction error remains less than 5%. Hence, the proposed method has great potential in the measurement of rapid temperature change with good temporal and spatial resolution. (paper)
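
    The path-average relation behind the reconstruction is simple: a time of flight over a known path length gives the mean sound speed, and for dry air the temperature follows from c ≈ 20.05·sqrt(T). A single-path sketch is given below; the constant and the dry-air assumption are simplifications, and the tomographic step combines many such path averages.

      def path_mean_temperature_k(path_length_m, tof_s):
          """Path-averaged air temperature from one acoustic time of flight.

          Uses c = 20.05 * sqrt(T) [m/s, T in kelvin], valid for dry air only.
          """
          c = path_length_m / tof_s          # average speed of sound along the path
          return (c / 20.05) ** 2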

  6. Grain bulk density measurement based on wireless network

    Directory of Open Access Journals (Sweden)

    Wu Fangming

    2017-01-01

    Full Text Available To know the accurate quantity of stored grain, grain density sensors must be used to measure the grain's bulk density. However, multiple sensors have to be inserted into the storage facility to quickly collect data during inventory checking of the stored grain. In this study, the ability of a coexisting ZigBee and Wi-Fi network to transmit the data collected by the density sensors was investigated. A system consisting of six sensor nodes, six router nodes, one gateway and one Android Pad was assembled to measure the grain's bulk density and calculate its quantity. The CC2530 chip with ZigBee technology served as the core of information processing and wireless node detection in the sensor and router nodes. ZigBee operated on a different signal channel from Wi-Fi to avoid interference and was connected to the Wi-Fi module in the gateway through a UART serial communications interface. The Android Pad received the measured data through the gateway and processed the data to calculate the quantity. The system enabled multi-point, real-time parameter detection inside the grain storage. Results show that the system has good expansibility, networking flexibility and convenience.

  7. Patient Specific Dosimetry based in excreted urine measurements

    Energy Technology Data Exchange (ETDEWEB)

    Barquero, R.; Nunez, C.; Ruiz, A.; Valverde, J.; Basurto, F.

    2006-07-01

    One of the limiting factors in utilising therapeutic radiopharmaceuticals in I-131 thyroid therapy is the potential hazard to the bone marrow, kidneys, and other internal organs. In this work, by means of daily dose rate measurements at a point in contact with the can holding the urine excreted by the patient undergoing radio-iodine therapy, activities and the associated absorbed doses in the total body are calculated. The urine can is characterised by a geometric and materials model for MC simulation with MCNP. Knowing the conversion factor from activity in urine to dose rate at the measurement point of the can for each filling volume, the urine and patient activity can be obtained at each measurement time. From the fitting of these activities, the time evolution, the effective half-life in the patient and the cumulative whole-body activity are calculated. The emission characteristics of I-131 are then used to estimate the maximum whole-body absorbed dose. The results for 2 hyperthyroidism and 4 carcinoma treatments are presented. The maximum total-body absorbed doses are 673 and 149 Gy for the carcinoma and hyperthyroidism cases, respectively. The corresponding ranges of T1/2 eff are 0.2 to 2.5 days (carcinoma) and 5.4 to 6.6 days (hyperthyroidism). (Author)
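
    The half-life fitting step described above can be sketched as a mono-exponential fit to the daily whole-body activity estimates; the single-exponential retention model and the variable names are illustrative assumptions.

      import numpy as np

      def effective_half_life_and_cumulated_activity(t_days, activity_mbq):
          """Fit A(t) = A0*exp(-lambda*t) to daily activity estimates (sketch).

          Returns (T_eff in days, cumulated activity in MBq*days), assuming a
          single-exponential whole-body retention.
          """
          t = np.asarray(t_days, dtype=float)
          a = np.asarray(activity_mbq, dtype=float)
          slope, intercept = np.polyfit(t, np.log(a), 1)  # linear fit of ln(A) vs t
          lam = -slope
          a0 = np.exp(intercept)
          t_eff = np.log(2.0) / lam
          cumulated = a0 / lam               # integral of A(t) from 0 to infinity
          return t_eff, cumulated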

  8. Smartphone-based quantitative measurements on holographic sensors.

    Science.gov (United States)

    Khalili Moghaddam, Gita; Lowe, Christopher Robin

    2017-01-01

    The research reported herein integrates a generic holographic sensor platform and a smartphone-based colour quantification algorithm in order to standardise and improve the determination of the concentration of analytes of interest. The utility of this approach has been exemplified by analysing the replay colour of the captured image of a holographic pH sensor in near real-time. Personalised image encryption followed by a wavelet-based image compression method were applied to secure the image transfer across a bandwidth-limited network to the cloud. The decrypted and decompressed image was processed through four principal steps: Recognition of the hologram in the image with a complex background using a template-based approach, conversion of device-dependent RGB values to device-independent CIEXYZ values using a polynomial model of the camera and computation of the CIEL*a*b* values, use of the colour coordinates of the captured image to segment the image, select the appropriate colour descriptors and, ultimately, locate the region of interest (ROI), i.e. the hologram in this case, and finally, application of a machine learning-based algorithm to correlate the colour coordinates of the ROI to the analyte concentration. Integrating holographic sensors and the colour image processing algorithm potentially offers a cost-effective platform for the remote monitoring of analytes in real time in readily accessible body fluids by minimally trained individuals.
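
    The device-independent colour step described above (sRGB to CIEXYZ to CIEL*a*b*) can be illustrated with the stock conversion in scikit-image; this replaces, for illustration only, the calibrated polynomial camera model used in the paper.

      import numpy as np
      from skimage import color

      def roi_lab_coordinates(rgb_roi):
          """Mean CIEL*a*b* coordinates of a hologram region of interest (sketch).

          rgb_roi : H x W x 3 array of sRGB values in [0, 1]. A calibrated
          polynomial camera model (as in the paper) would replace the built-in
          sRGB -> XYZ step used here.
          """
          lab = color.rgb2lab(rgb_roi)       # sRGB -> CIEXYZ -> CIEL*a*b*
          return lab.reshape(-1, 3).mean(axis=0)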

  9. Smartphone-based quantitative measurements on holographic sensors.

    Directory of Open Access Journals (Sweden)

    Gita Khalili Moghaddam

    Full Text Available The research reported herein integrates a generic holographic sensor platform and a smartphone-based colour quantification algorithm in order to standardise and improve the determination of the concentration of analytes of interest. The utility of this approach has been exemplified by analysing the replay colour of the captured image of a holographic pH sensor in near real-time. Personalised image encryption followed by a wavelet-based image compression method were applied to secure the image transfer across a bandwidth-limited network to the cloud. The decrypted and decompressed image was processed through four principal steps: Recognition of the hologram in the image with a complex background using a template-based approach, conversion of device-dependent RGB values to device-independent CIEXYZ values using a polynomial model of the camera and computation of the CIEL*a*b* values, use of the colour coordinates of the captured image to segment the image, select the appropriate colour descriptors and, ultimately, locate the region of interest (ROI, i.e. the hologram in this case, and finally, application of a machine learning-based algorithm to correlate the colour coordinates of the ROI to the analyte concentration. Integrating holographic sensors and the colour image processing algorithm potentially offers a cost-effective platform for the remote monitoring of analytes in real time in readily accessible body fluids by minimally trained individuals.

  10. Comparing econometric and survey-based methodologies in measuring offshoring

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics with limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  11. Modelling of autogenous shrinkage of concrete based on paste measurements

    NARCIS (Netherlands)

    Schlangen, E.; Leegwater, G.; Koenders, E.A.B.

    2006-01-01

    In order to be able to improve concrete modelling based on its constituents, more knowledge is needed about the material behaviour of these constituents. In this research the focus is on the behaviour of hardening concrete; therefore, the properties of hardening cement are of most relevance.

  12. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  13. Intrusion detection method based on nonlinear correlation measure

    NARCIS (Netherlands)

    Ambusaidi, Mohammed A.; Tan, Zhiyuan; He, Xiangjian; Nanda, Priyadarsi; Lu, Liang Fu; Jamdagni, Aruna

    2014-01-01

    Cyber crimes and malicious network activities have posed serious threats to the entire internet and its users. This issue is becoming more critical, as network-based services are more widespread and closely related to our daily life. Thus, it has raised a serious concern in individual internet

  14. Two-phase flow measurement based on oblique laser scattering

    Science.gov (United States)

    Vendruscolo, Tiago P.; Fischer, Robert; Martelli, Cícero; Rodrigues, Rômulo L. P.; Morales, Rigoberto E. M.; da Silva, Marco J.

    2015-07-01

    Multiphase flow measurements play a crucial role in monitoring production processes in many industries. To guarantee the safety of processes involving multiphase flows, it is important to detect changes in the flow conditions before they can cause damage, often in fractions of seconds. Here we demonstrate how the scattering pattern of a laser beam passing a two-phase flow at an oblique angle to the flow direction can be used to detect deviations from the desired flow conditions in microseconds. Applying machine-learning techniques to signals obtained from three photo-detectors, we achieve a compact, versatile, low-cost sensor design for safety applications.

  15. Measurement of hepatic steatosis based on magnetic resonance images

    Science.gov (United States)

    Tkaczyk, Adam; Jańczyk, Wojciech; Chełstowska, Sylwia; Socha, Piotr; Mulawka, Jan

    2017-08-01

    The subject of this work is the use of digital image processing to measure hepatic steatosis. Calculating this value manually requires a lot of time and precision from the radiologist. In order to resolve this issue, a C++ application has been created. This paper describes the algorithms that have been used to solve the problem. The next chapter presents the application architecture and introduces the graphical user interface. The last section describes all the tests which have been carried out to check the correctness of the results.

  16. A Computer Based Moire Technique To Measure Very Small Displacements

    Science.gov (United States)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

    The accuracy that can be achieved in the measurement of very small displacements in techniques such as moire, holography and speckle is limited by the noise inherent to the utilized optical devices. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced, an initial system before the load is applied and a final system when the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.
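
    As a rough illustration of the digital moire principle sketched above, the following Python snippet multiplies two synthetic carrier-fringe patterns recorded "before" and "after" loading and low-pass filters the product; the carrier frequency and displacement field are made-up values, not the paper's data:

        import numpy as np
        from scipy.ndimage import uniform_filter

        N, f_c = 512, 0.2                       # image size (pixels), carrier frequency (cycles/pixel)
        x = np.arange(N)
        X, _ = np.meshgrid(x, x)
        u = 1.5 * np.sin(2 * np.pi * X / N)     # assumed in-plane displacement field (pixels)

        initial = np.cos(2 * np.pi * f_c * X)           # carrier system before loading
        final = np.cos(2 * np.pi * f_c * (X + u))       # carrier system after loading

        # The product contains a low-frequency term cos(2*pi*f_c*u) whose phase encodes the
        # displacement; low-pass filtering keeps it and rejects the carrier and common noise.
        moire = uniform_filter(initial * final, size=15)
        print(moire.shape, float(moire.min()), float(moire.max()))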

  17. Sensing line effects on PWR-based differential pressure measurements

    International Nuclear Information System (INIS)

    Evans, R.P.; Neff, G.G.

    1982-01-01

    An incorrect configuration of the fluid-filled pressure sensing lines connecting differential pressure transducers to the pressure taps in a pressurized water reactor system can cause errors in the measurement and, during rapid pressure transients, could cause the transducer to fail. Testing was performed in both static and dynamic modes to experimentally determine the effects of sensing lines of various lengths, diameters, and materials. Testing was performed at ambient temperature with absolute line pressures at about 17 MPa using water as the pressure transmission fluid.

  18. Directed energy deflection laboratory measurements of common space based targets

    Science.gov (United States)

    Brashears, Travis; Lubin, Philip; Hughes, Gary B.; Meinhold, Peter; Batliner, Payton; Motta, Caio; Madajian, Jonathan; Mercer, Whitaker; Knowles, Patrick

    2016-09-01

    We report on laboratory studies of the effectiveness of directed energy planetary defense as a part of the DE-STAR (Directed Energy System for Targeting of Asteroids and exploRation) program. DE-STAR and DE-STARLITE are directed energy "stand-off" and "stand-on" programs, respectively. These systems consist of a modular array of kilowatt-class lasers powered by photovoltaics, and are capable of heating a spot on the surface of an asteroid to the point of vaporization. Mass ejection, as a plume of evaporated material, creates a reactionary thrust capable of diverting the asteroid's orbit. In a series of papers, we have developed a theoretical basis and described numerical simulations for determining the thrust produced by material evaporating from the surface of an asteroid. In the DE-STAR concept, the asteroid itself is used as the deflection "propellant". This study presents results of experiments designed to measure the thrust created by evaporation from a laser directed energy spot. We constructed a vacuum chamber to simulate space conditions, and installed a torsion balance that holds a common space target sample. The sample is illuminated with a fiber-array laser with flux levels up to 60 MW/m², which allows us to simulate a mission-level flux but on a small scale. We use a separate laser as well as a position-sensitive centroid detector to read out the angular motion of the torsion balance and can thus determine the thrust. We compare the measured thrust to the models. Our theoretical models indicate a coupling coefficient well in excess of 100 μN per optical watt, though we assume a more conservative value of 80 μN/W and then degrade this with an optical "encircled energy" efficiency of 0.75 to 60 μN/W in our deflection modeling. Our measurements discussed here yield about 45 μN per absorbed watt as a reasonable lower limit to the thrust per optical watt absorbed. Results vary depending on the material tested and are limited to measurements of 1 axis, so

  19. Assessing Therapist Competence: Development of a Performance-Based Measure and Its Comparison With a Web-Based Measure

    NARCIS (Netherlands)

    Cooper, Zafra; Doll, Helen; Bailey-Straebler, Suzanne; Bohn, Kristin; de Vries, Dian; Murphy, Rebecca; O'Connor, Marianne E; Fairburn, Christopher G

    2017-01-01

    BACKGROUND: Recent research interest in how best to train therapists to deliver psychological treatments has highlighted the need for rigorous, but scalable, means of measuring therapist competence. There are at least two components involved in assessing therapist competence: the assessment of their

  20. A framework for grouping nanoparticles based on their measurable characteristics.

    Science.gov (United States)

    Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V

    2013-01-01

    There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to their induced toxicological responses. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of the framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework that aids in the ability to group nanomaterials together and to facilitate the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.
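
    As a sketch of the kind of grouping pipeline the abstract describes, the snippet below standardises a small table of hypothetical nanodescriptors for the five colloids and groups them by hierarchical clustering; the descriptor values and the choice of Ward linkage are illustrative assumptions, not the paper's data or its exact statistical workflow:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Hypothetical nanodescriptors (rows: Ag, Cu, Ni, Fe, Zn colloids;
        # columns: primary size in nm, zeta potential in mV, dissolution rate).
        names = ["Ag", "Cu", "Ni", "Fe", "Zn"]
        X = np.array([[20.0, -30.0, 0.8],
                      [25.0, -22.0, 1.5],
                      [40.0, -15.0, 0.3],
                      [35.0, -18.0, 0.2],
                      [30.0, -25.0, 2.1]])

        Z = (X - X.mean(axis=0)) / X.std(axis=0)              # standardise each descriptor
        sub_groups = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
        print(dict(zip(names, sub_groups)))                   # seemingly homogeneous colloids split into sub-groups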

  1. Refractive Index Measurement of Liquids Based on Microstructured Optical Fibers

    Directory of Open Access Journals (Sweden)

    Susana Silva

    2014-12-01

    Full Text Available This review is focused on microstructured optical fiber sensors developed in recent years for liquid refractive index (RI) sensing. The review is divided into three parts: the first section introduces a general view of the most relevant refractometric sensors that have been reported over the last thirty years. Section 2 discusses several microstructured optical fiber designs, namely, suspended-core fiber, photonic crystal fiber, large-core air-clad photonic crystal fiber, and others. This part is also divided into two main groups: the interferometric-based and resonance-based configurations. The sensing methods rely either on full/selective filling of the microstructured fiber air holes with a liquid analyte or on simply immersing the sensing fiber into the liquid analyte. The sensitivities and resolutions are tabulated at the end of this section, followed by a brief discussion of the obtained results. The last section concludes with some remarks about the microstructured fiber-based configurations developed for RI sensing and their potential for future applications.

  2. Generalized flow and determinism in measurement-based quantum computation

    Energy Technology Data Exchange (ETDEWEB)

    Browne, Daniel E [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PU (United Kingdom); Kashefi, Elham [Computing Laboratory and Christ Church College, University of Oxford, Parks Road, Oxford OX1 3QD (United Kingdom); Mhalla, Mehdi [Laboratoire d' Informatique de Grenoble, CNRS - Centre national de la recherche scientifique, Universite de Grenoble (France); Perdrix, Simon [Preuves, Programmes et Systemes (PPS), Universite Paris Diderot, Paris (France)

    2007-08-15

    We extend the notion of quantum information flow defined by Danos and Kashefi (2006 Phys. Rev. A 74 052310) for the one-way model (Raussendorf and Briegel 2001 Phys. Rev. Lett. 86 910) and present a necessary and sufficient condition for stepwise uniformly deterministic computation in this model. The generalized flow also applies in the extended model with measurements in the (X, Y), (X, Z) and (Y, Z) planes. We apply both the measurement calculus and the stabiliser formalism to derive our main theorem, which for the first time gives a full characterization of stepwise uniformly deterministic computation in the one-way model. We present several examples to show how our result improves over the traditional notion of flow, such as geometries (entanglement graphs with input and output) with no flow but having generalized flow, and we discuss how they lead to an optimal implementation of the unitaries. More importantly, one can also obtain a better quantum computation depth with the generalized flow than with flow. We believe our characterization result is particularly valuable for the study of algorithms and complexity in the one-way model.

  3. Automatic anatomical structures location based on dynamic shape measurement

    Science.gov (United States)

    Witkowski, Marcin; Rapp, Walter; Sitnik, Robert; Kujawinska, Malgorzata; Vander Sloten, Jos; Haex, Bart; Bogaert, Nico; Heitmann, Kjell

    2005-09-01

    New image processing methods and active photonics apparatus have made possible the development of relatively inexpensive optical systems for complex shape and object measurements. We present a dynamic 360° scanning method for the analysis of human lower body biomechanics, with an emphasis on the analysis of the knee joint. The anatomical structure (of high medical interest) that can be scanned and analyzed is the patella. Tracking the patella position and orientation under dynamic conditions may allow detection of pathological patella movements and help in knee joint disease diagnosis. The processed data is obtained from a dynamic laser triangulation surface measurement system, able to capture slow to normal movements with a scan frequency between 15 and 30 Hz. These frequency rates are enough to capture controlled movements used e.g. for medical examination purposes. The purpose of the work presented is to develop surface analysis methods that may be used to support the diagnosis of motoric abilities of the lower limbs. The paper presents the algorithms used to process the acquired lower limb surface data in order to find the position and orientation of the patella. The algorithms implemented include input data preparation, curvature description methods, knee region discrimination and calculation of the assumed patella position/orientation. Additionally, a method of 4D (3D + time) medical data visualization is proposed. Some exemplary results are also presented.

  4. New approach to radiation monitoring: citizen based radiation measurement

    International Nuclear Information System (INIS)

    Kuca, P.; Helebrant, J.

    2016-01-01

    Both the Fukushima Dai-ichi NPP accident in Japan in 2011 and the Chernobyl NPP accident in the USSR in 1986 have shown the necessity of finding ways to improve public confidence in official authorities. This is important especially in such cases of severe accidents with significant consequences in large inhabited areas around the damaged NPP. The lack of public confidence in officials was caused mostly by rather poor communication between official authorities and the public, as well as by restricted access to information for the public. It may have extremely negative impacts on the public understanding of the actual situation and its possible risks, on public acceptance of necessary protective measures and on participation of the public in remediation of the affected areas. One possible way to improve the situation is the implementation of citizen radiation monitoring on a voluntary basis. When the public can verify that the official results are compatible with their own self-measured ones, they are likely to have more confidence in them. In the Czech Republic the implementation of such an approach is tested in the framework of security research funded by the Czech Ministry of the Interior - the RAMESIS research project carried out by SURO. (authors)

  5. Generalized flow and determinism in measurement-based quantum computation

    International Nuclear Information System (INIS)

    Browne, Daniel E; Kashefi, Elham; Mhalla, Mehdi; Perdrix, Simon

    2007-01-01

    We extend the notion of quantum information flow defined by Danos and Kashefi (2006 Phys. Rev. A 74 052310) for the one-way model (Raussendorf and Briegel 2001 Phys. Rev. Lett. 86 910) and present a necessary and sufficient condition for stepwise uniformly deterministic computation in this model. The generalized flow also applies in the extended model with measurements in the (X, Y), (X, Z) and (Y, Z) planes. We apply both the measurement calculus and the stabiliser formalism to derive our main theorem, which for the first time gives a full characterization of stepwise uniformly deterministic computation in the one-way model. We present several examples to show how our result improves over the traditional notion of flow, such as geometries (entanglement graphs with input and output) with no flow but having generalized flow, and we discuss how they lead to an optimal implementation of the unitaries. More importantly, one can also obtain a better quantum computation depth with the generalized flow than with flow. We believe our characterization result is particularly valuable for the study of algorithms and complexity in the one-way model.

  6. An AFM-based pit-measuring method for indirect measurements of cell-surface membrane vesicles

    International Nuclear Information System (INIS)

    Zhang, Xiaojun; Chen, Yuan; Chen, Yong

    2014-01-01

    Highlights: • Air drying induced the transformation of cell-surface membrane vesicles into pits. • An AFM-based pit-measuring method was developed to measure cell-surface vesicles. • Our method detected at least two populations of cell-surface membrane vesicles. - Abstract: Circulating membrane vesicles, which are shed from many cell types, have multiple functions and have been correlated with many diseases. Although circulating membrane vesicles have been extensively characterized, the status of cell-surface membrane vesicles prior to their release is less understood due to the lack of effective measurement methods. Recently, as a powerful, micro- or nano-scale imaging tool, atomic force microscopy (AFM) has been applied in measuring circulating membrane vesicles. However, it seems very difficult for AFM to directly image/identify and measure cell-bound membrane vesicles due to the similarity of surface morphology between membrane vesicles and cell surfaces. Therefore, until now no AFM studies on cell-surface membrane vesicles have been reported. In this study, we found that air drying can induce the transformation of most cell-surface membrane vesicles into pits that are more readily detectable by AFM. Based on this, we developed an AFM-based pit-measuring method and, for the first time, used AFM to indirectly measure cell-surface membrane vesicles on cultured endothelial cells. Using this approach, we observed and quantitatively measured at least two populations of cell-surface membrane vesicles, a nanoscale population (<500 nm in diameter peaking at ∼250 nm) and a microscale population (from 500 nm to ∼2 μm peaking at ∼0.8 μm), whereas confocal microscopy only detected the microscale population. The AFM-based pit-measuring method is potentially useful for studying cell-surface membrane vesicles and for investigating the mechanisms of membrane vesicle formation/release

  7. Electrophoresis- and FRET-Based Measures of Serpin Polymerization.

    Science.gov (United States)

    Faull, Sarah V; Brown, Anwen E; Haq, Imran; Irving, James A

    2017-01-01

    Many serpinopathies, including alpha-1 antitrypsin (A1AT) deficiency, are associated with the formation of unbranched polymer chains of mutant serpins. In vivo, this deficiency is the result of mutations that cause kinetic or thermodynamic destabilization of the molecule. However, polymerization can also be induced in vitro from mutant or wild-type serpins under destabilizing conditions. The characteristics of the resulting polymers are dependent upon induction conditions. Due to their relationship to disease, serpin polymers, mainly those formed from A1AT, have been widely studied. Here, we describe Förster resonance energy transfer (FRET) and gel-based approaches for their characterization.

  8. Modal response of interior mass based upon external measurements

    International Nuclear Information System (INIS)

    Chow, C T; Eli, M; Jorgensen, B R; Woehrle, T.

    1999-01-01

    Modal response testing has been used to predict the motion of interior masses of a system in which only external instrumentation is allowed. Testing of this form may occasionally be necessary in validation of a computer model, but also has potential as a tool for validating individual assemblies in a QA process. Examination of the external frequency response and mode shapes can offer insight into interior response. The interpretation of these results is improved through parallel analytical solutions. A simple, three-mass model has been examined experimentally and analytically to demonstrate modal theory. These results show the limitations of the external measurement in predicting internal response due to transmissibility. A procedure for utilizing external testing is described. The question posed through this research is whether or not modal correlation analysis can be adapted for use in systems for which instrumentation of critical components is missing

  9. Long distance measurement with a femtosecond laser based frequency comb

    Science.gov (United States)

    Bhattacharya, N.; Cui, M.; Zeitouny, M. G.; Urbach, H. P.; van den Berg, S. A.

    2017-11-01

    Recent advances in the field of ultra-short pulse lasers have led to the development of reliable sources of carrier-envelope-phase stabilized femtosecond pulses. The pulse train generated by such a source has a frequency spectrum that consists of discrete, regularly spaced lines known as a frequency comb. In this case both the repetition frequency and the carrier-envelope-offset frequency are referenced to a frequency standard, like an atomic clock. As a result the accuracy of the frequency standard is transferred to the optical domain, with the frequency comb as transfer oscillator. These unique properties allow the frequency comb to be applied as a versatile tool, not only for time and frequency metrology, but also in fundamental physics, high-precision spectroscopy, and laser noise characterization. The pulse-to-pulse phase relationship of the light emitted by the frequency comb has opened up new directions for long range highly accurate distance measurement.

  10. Microscopic oxygen imaging based on fluorescein bleaching efficiency measurements

    DEFF Research Database (Denmark)

    Beutler, Martin; Heisterkamp, Ines M.; Piltz, Bastian

    2014-01-01

    Photobleaching of the fluorophore fluorescein in an aqueous solution is dependent on the oxygen concentration. Therefore, the time-dependent bleaching behavior can be used to measure dissolved oxygen concentrations. The method can be combined with epi-fluorescence microscopy. The molecular...... states of the fluorophore can be expressed by a three-state energy model. This leads to a set of differential equations which describe the photobleaching behavior of fluorescein. The numerical solution of these equations shows that in a conventional wide-field fluorescence microscope, the fluorescence...... recorded by a charge-coupled-device (CCD) camera mounted on a fluorescence microscope allowed a pixelwise estimation of the ratio function in a microscopic image. Use of a microsensor and oxygen-consuming bacteria in a sample chamber enabled the calibration of the system for quantification of absolute oxygen......

  11. Graph-Based Cooperative Localization Using Symmetric Measurement Equations.

    Science.gov (United States)

    Gulati, Dhiraj; Zhang, Feihu; Clarke, Daniel; Knoll, Alois

    2017-06-17

    Precise localization is a key requirement for the success of highly assisted or autonomous vehicles. The diminishing cost of hardware has resulted in a proliferation of the number of sensors in the environment. Cooperative localization (CL) presents itself as a feasible and effective solution for localizing the ego-vehicle and its neighboring vehicles. However, one of the major challenges to fully realize the effective use of infrastructure sensors for jointly estimating the state of a vehicle in cooperative vehicle-infrastructure localization is an effective data association. In this paper, we propose a method which implements symmetric measurement equations within factor graphs in order to overcome the data association challenge with a reduced bandwidth overhead. Simulated results demonstrate the benefits of the proposed approach in comparison with our previously proposed approach of topology factors.

  12. Multiple kernel boosting framework based on information measure for classification

    International Nuclear Information System (INIS)

    Qi, Chengming; Wang, Yuping; Tian, Wenjie; Wang, Qun

    2016-01-01

    The performance of kernel-based methods, such as the support vector machine (SVM), is greatly affected by the choice of kernel function. Multiple kernel learning (MKL) is a promising family of machine learning algorithms and has attracted much attention in recent years. MKL combines multiple sub-kernels to seek better results compared to single kernel learning. In order to improve the efficiency of SVM and MKL, in this paper the Kullback–Leibler kernel function is derived to develop SVM. The proposed method employs an improved ensemble learning framework, named KLMKB, which applies Adaboost to learning multiple kernel-based classifiers. In the experiment on hyperspectral remote sensing image classification, we employ features selected through the Optimum Index Factor (OIF) to classify the satellite image. We extensively examine the performance of our approach in comparison to some relevant and state-of-the-art algorithms on a number of benchmark classification data sets and a hyperspectral remote sensing image data set. Experimental results show that our method has a stable behavior and a noticeable accuracy on different data sets.

  13. Integration of a silicon-based microprobe into a gear measuring instrument for accurate measurement of micro gears

    International Nuclear Information System (INIS)

    Ferreira, N; Krah, T; Jeong, D C; Kniel, K; Härtig, F; Metz, D; Dietzel, A; Büttgenbach, S

    2014-01-01

    The integration of silicon micro probing systems into conventional gear measuring instruments (GMIs) allows fully automated measurements of external involute micro spur gears of normal modules smaller than 1 mm. This system, based on a silicon microprobe, has been developed and manufactured at the Institute for Microtechnology of the Technische Universität Braunschweig. The microprobe consists of a silicon sensor element and a stylus which is oriented perpendicularly to the sensor. The sensor is fabricated by means of silicon bulk micromachining. Its small dimensions of 6.5 mm × 6.5 mm allow compact mounting in a cartridge to facilitate the integration into a GMI. In this way, tactile measurements of 3D microstructures can be realized. To enable three-dimensional measurements with marginal forces, four Wheatstone bridges are built with diffused piezoresistors on the membrane of the sensor. On the reverse of the membrane, the stylus is glued perpendicularly to the sensor on a boss to transmit the probing forces to the sensor element during measurements. Sphere diameters smaller than 300 µm and shaft lengths of 5 mm as well as measurement forces from 10 µN enable the measurements of 3D microstructures. Such micro probing systems can be integrated into universal coordinate measuring machines and also into GMIs to extend their field of application. Practical measurements were carried out at the Physikalisch-Technische Bundesanstalt by qualifying the microprobes on a calibrated reference sphere to determine their sensitivity and their physical dimensions in volume. Following that, profile and helix measurements were carried out on a gear measurement standard with a module of 1 mm. The comparison of the measurements shows good agreement between the measurement values and the calibrated values. This result is a promising basis for the realization of smaller probe diameters for the tactile measurement of micro gears with smaller modules. (paper)

  14. Adaptive Voltage Stability Protection Based on Load Identification Using Phasor Measurement Units

    DEFF Research Database (Denmark)

    Liu, Leo; Bak, Claus Leth; Chen, Zhe

    2011-01-01

    collapse. In this paper, online load identification using a measurement-based approach based on Phasor Measurement Units (PMU) was proposed to evaluate the proximity to voltage instability in order to prevent voltage collapse. In the scenarios of disturbances, the proximity to voltage collapse...... scheme based on PMUs is promising, as it prevented the voltage collapse and minimized the load shedding area....

  15. Ranking the Online Documents Based on Relative Credibility Measures

    Directory of Open Access Journals (Sweden)

    Ahmad Dahlan

    2013-09-01

    Full Text Available Information searching is the most popular activity on the Internet. Usually the search engine provides the search results ranked by relevance. However, for a certain purpose that concerns information credibility, particularly citing information for scientific works, another approach to ranking the search engine results is required. This paper presents a study on developing a new ranking method based on the credibility of information. The method is built upon two well-known algorithms, PageRank and Citation Analysis. The result of the experiment, which used the Spearman Rank Correlation Coefficient to compare the proposed rank (generated by the method) with the standard rank (generated manually by a group of experts), showed that the average Spearman coefficient satisfied 0 < rS < critical value. It means that the correlation was present but not significant. Hence the proposed rank does not satisfy the standard, but the performance could be improved.
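
    The evaluation step described above compares the credibility-based ranking with the expert ranking through the Spearman rank correlation coefficient. A minimal sketch with hypothetical rank lists (the document rankings themselves are not reproduced here):

        from scipy.stats import spearmanr

        # Hypothetical positions of ten documents in the proposed (credibility-based)
        # ranking and in the reference ranking produced by the experts.
        proposed = [1, 2, 3, 5, 4, 7, 6, 9, 10, 8]
        expert = [2, 1, 4, 3, 6, 5, 8, 7, 9, 10]

        rs, p_value = spearmanr(proposed, expert)
        print(f"Spearman rs = {rs:.3f}, p = {p_value:.3f}")
        # Following the abstract's criterion, the correlation is only accepted as significant
        # if rs exceeds the critical value for the given number of documents.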

  16. Ranking the Online Documents Based on Relative Credibility Measures

    Directory of Open Access Journals (Sweden)

    Ahmad Dahlan

    2009-05-01

    Full Text Available Information searching is the most popular activity on the Internet. Usually the search engine provides the search results ranked by relevance. However, for a certain purpose that concerns information credibility, particularly citing information for scientific works, another approach to ranking the search engine results is required. This paper presents a study on developing a new ranking method based on the credibility of information. The method is built upon two well-known algorithms, PageRank and Citation Analysis. The result of the experiment, which used the Spearman Rank Correlation Coefficient to compare the proposed rank (generated by the method) with the standard rank (generated manually by a group of experts), showed that the average Spearman coefficient satisfied 0 < rS < critical value. It means that the correlation was present but not significant. Hence the proposed rank does not satisfy the standard, but the performance could be improved.

  17. Transformer Temperature Measurement Using Optical Fiber Based Microbend Sensor

    Directory of Open Access Journals (Sweden)

    Deepika YADAV

    2007-10-01

    Full Text Available Breakdown of transformers proves to be very expensive and inconvenient because their replacement takes a lot of time. During a breakdown the industry also incurs heavy losses because of the stoppage of the production line. A system for monitoring the temperature of transformers is therefore required. Existing sensors cannot be used for monitoring the temperature of transformers because they are sensitive to electrical signals and can cause sparking, which can trigger fire since there is oil in the transformer's cooling coils. Optical fibers are electrically inert, so such a system is ideal for this application. This manuscript communicates the results of investigations carried out by simulating a configuration of an optical fiber temperature sensor for transformers based on microbending, using Matlab as a simulation tool to evaluate the effectiveness of this sensor. The results are in the form of graphs of intensity modulation vs. temperature.

  18. Natural texture retrieval based on perceptual similarity measurement

    Science.gov (United States)

    Gao, Ying; Dong, Junyu; Lou, Jianwen; Qi, Lin; Liu, Jun

    2018-04-01

    A typical texture retrieval system performs feature comparison and might not be able to make human-like judgments of image similarity. Meanwhile, it is commonly known that perceptual texture similarity is difficult to describe with traditional image features. In this paper, we propose a new texture retrieval scheme based on perceptual texture similarity. The key of the proposed scheme is that the prediction of perceptual similarity is performed by learning a non-linear mapping from the image feature space to the perceptual texture space using Random Forest. We test the method on a natural texture dataset and apply it to a new wallpapers dataset. Experimental results demonstrate that the proposed texture retrieval scheme with perceptual similarity improves the retrieval performance over traditional image features.
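
    The core step above is to learn a non-linear mapping from computational feature space to perceptual similarity with a Random Forest. A minimal sketch on made-up feature/score data (the paper's actual features, dataset and parameters are not reproduced):

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        # Hypothetical data: 200 texture pairs, 16 feature-difference values each,
        # and a human-rated perceptual similarity score in [0, 1].
        feature_diffs = rng.random((200, 16))
        perceptual_scores = rng.random(200)

        model = RandomForestRegressor(n_estimators=100, random_state=0)
        model.fit(feature_diffs[:150], perceptual_scores[:150])     # learn the mapping
        print(model.predict(feature_diffs[150:])[:5])               # predicted similarity for unseen pairs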

  19. Glucose Monitoring System Based on Osmotic Pressure Measurements

    Directory of Open Access Journals (Sweden)

    Alexandra LEAL

    2011-02-01

    Full Text Available This paper presents the design and development of a prototype sensor unit for implementation in a long-term glucose monitoring system suitable for estimating glucose levels in people suffering from diabetes mellitus. The system utilizes osmotic pressure as the sensing mechanism and consists of a sensor prototype that is integrated with a pre-amplifier and data acquisition unit for both data recording and processing. The sensor prototype is based on an embedded silicon absolute pressure transducer and a semipermeable nanoporous membrane that is enclosed in the sensor housing. The glucose monitoring system facilitates the integration of a low-power microcontroller combined with a wireless inductively powered communication link. Experimental verification has proven that the system is capable of tracking osmotic pressure changes using albumin as a model compound, thereby showing a proof of concept for novel long-term tracking of blood glucose from remote sensor nodes.

  20. Estimating spacecraft attitude based on in-orbit sensor measurements

    DEFF Research Database (Denmark)

    Jakobsen, Britt; Lyn-Knudsen, Kevin; Mølgaard, Mathias

    2014-01-01

    of 2014/15. To better evaluate the performance of the payload, it is desirable to couple the payload data with the satellite's orientation. With AAUSAT3 already in orbit it is possible to collect data directly from space in order to evaluate the performance of the attitude estimation. An extended Kalman...... filter (EKF) is used for quaternion-based attitude estimation. A Simulink simulation environment developed for AAUSAT3, containing a "truth model" of the satellite and the orbit environment, is used to test the performance. The performance is tested using different sensor noise parameters obtained both...... from a controlled environment on Earth as well as in orbit. By using sensor noise parameters obtained on Earth as the expected parameters in the attitude estimation, and simulating the environment using the sensor noise parameters from space, it is possible to assess whether the EKF can be designed

  1. PRETTY: Grazing altimetry measurements based on the interferometric method

    DEFF Research Database (Denmark)

    Høeg, Per; Fragner, Heinrich; Dielacher, Andreas

    2017-01-01

    The exploitation of signals stemming from global navigation systems for passive bistatic radar applications has been proposed and implemented within numerous studies. The fact that such missions do not rely on high power amplifiers and that the need of high gain antennas with large geometrical...... dimensions can be avoided, makes them suitable for small satellite missions. Applications where a continuous high coverage is needed, as for example disaster warning, have the demand for a large number of satellites in orbit, which in turn requires small and relatively low cost satellites. The proposed PRETTY...... (Passive Reflectometry and Dosimetry) mission includes a demonstrator payload for passive reflectometry and scatterometry focusing on very low incidence angles whereby the direct and reflected signal will be received via the same antenna. The correlation of both signals will be done by a specific FPGA based...

  2. Microrheometric upconversion-based techniques for intracellular viscosity measurements

    Science.gov (United States)

    Rodríguez-Sevilla, Paloma; Zhang, Yuhai; de Sousa, Nuno; Marqués, Manuel I.; Sanz-Rodríguez, Francisco; Jaque, Daniel; Liu, Xiaogang; Haro-González, Patricia

    2017-08-01

    Rheological parameters (viscosity, creep compliance and elasticity) play an important role in cell function and viability. For this reason different strategies have been developed for their study. In this work, two new microrheometric techniques are presented. Both methods take advantage of the analysis of the polarized emission of an upconverting particle to determine its orientation inside the optical trap. Upconverting particles are optical materials that are able to convert infrared radiation into visible light. Their usefulness has been further boosted by the recent demonstration of their three-dimensional control and tracking by single-beam infrared optical traps. In this work it is demonstrated that optical torques are responsible for the stable orientation of the upconverting particle inside the trap. Moreover, numerical calculations and experimental data allowed the rotation dynamics of the optically trapped upconverting particle to be used for environmental sensing. In particular, the cytoplasm viscosity could be measured by using the rotation time and thermal fluctuations of an intracellular optically trapped upconverting particle, by means of the two previously mentioned microrheometric techniques.

  3. Efficient iris texture analysis method based on Gabor ordinal measures

    Science.gov (United States)

    Tajouri, Imen; Aydi, Walid; Ghorbel, Ahmed; Masmoudi, Nouri

    2017-07-01

    With the remarkably increasing interest directed to the security dimension, the iris recognition process is considered to stand as one of the most versatile techniques critically useful for biometric identification and authentication. This is mainly due to every individual's unique iris texture. A modestly conceived, efficient approach to the feature extraction process is proposed. In the first place, the iris zigzag "collarette" is extracted from the rest of the image by means of the circular Hough transform, as it includes the most significant regions lying in the iris texture. In the second place, the linear Hough transform is used for eyelid detection while the median filter is applied for eyelash removal. Then, a special technique combining the richness of Gabor features and the compactness of ordinal measures is implemented for the feature extraction process, so that a discriminative feature representation for every individual can be achieved. Subsequently, the modified Hamming distance is used for the matching process. Indeed, the proposed procedure turns out to be reliable, as compared to some of the state-of-the-art approaches, with a recognition rate of 99.98%, 98.12%, and 95.02% on the CASIAV1.0, CASIAV3.0, and IIT Delhi V1 iris databases, respectively.
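
    Matching in the scheme above relies on a modified Hamming distance between binary iris codes. The sketch below shows the standard masked, fractional Hamming distance on made-up codes; the specific modification used in the paper is not reproduced:

        import numpy as np

        def hamming_distance(code_a, code_b, mask_a, mask_b):
            """Fraction of disagreeing bits, counted only where both codes are valid (unmasked)."""
            valid = mask_a & mask_b
            if valid.sum() == 0:
                return 1.0
            return np.count_nonzero((code_a ^ code_b) & valid) / valid.sum()

        rng = np.random.default_rng(1)
        a = rng.integers(0, 2, 2048, dtype=np.uint8)    # hypothetical iris code A
        b = rng.integers(0, 2, 2048, dtype=np.uint8)    # hypothetical iris code B
        mask = np.ones(2048, dtype=np.uint8)            # all bits valid in this toy case
        print(hamming_distance(a, b, mask, mask))       # ~0.5 for unrelated codes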

  4. UAV BASED BRDF-MEASUREMENTS OF AGRICULTURAL SURFACES WITH PFIFFIKUS

    Directory of Open Access Journals (Sweden)

    G. J. Grenzdörffer

    2012-09-01

    Full Text Available BRDF is a common problem in remote sensing and also in oblique photogrammetry. Common approaches to BRDF measurement with a field goniometer are costly and rather cumbersome. UAVs may offer an interesting alternative by using a special flight pattern of oblique and converging images. The main part of this paper is the description of a photogrammetric workflow to determine the anisotropic reflection properties of a given surface. Due to the relatively low flying heights, standard procedures from close range photogrammetry were adopted for outdoor usage. The photogrammetric processing delivered automatic and highly accurate orientation information with the aid of coded targets. The interior orientation of the consumer grade camera is more or less stable. The radiometrically corrected oblique images are converted into ortho photos. The azimuth and elevation angle of every point may then be computed. The calculated anisotropy of a winter wheat plot is shown. A system of four diagonally-looking cameras (Four Vision) and an additional nadir-looking camera is under development. The multi-camera system is especially designed for a micro-UAV with a payload of min. 1 kg. The system is composed of five industrial digital frame cameras (1.3 Mpix CCD chips, 15 fp/s) with fixed lenses. Special problems with the construction of a lightweight housing for the multi-camera solution are also covered in the paper.

  5. Evaluation of the Relationship between Literacy and Mathematics Skills as Assessed by Curriculum-Based Measures

    Science.gov (United States)

    Rutherford-Becker, Kristy J.; Vanderwood, Michael L.

    2009-01-01

    The purpose of this study was to evaluate the extent that reading performance (as measured by curriculum-based measures [CBM] of oral reading fluency [ORF] and Maze reading comprehension), is related to math performance (as measured by CBM math computation and applied math). Additionally, this study examined which of the two reading measures was a…

  6. Potentiometric measurement of polymer-membrane electrodes based on lanthanum

    Energy Technology Data Exchange (ETDEWEB)

    Saefurohman, Asep, E-mail: saefurohman.asep78@Gmail.com; Buchari, E-mail: saefurohman.asep78@Gmail.com; Noviandri, Indra, E-mail: saefurohman.asep78@Gmail.com [Department of Chemistry, Bandung Institute of Technology (Indonesia); Syoni [Department of Metallurgy Engineering, Bandung Institute of Technology (Indonesia)

    2014-03-24

    Inductively coupled plasma atomic emission spectroscopy (ICP-AES) is considered the standard method for the quantitative analysis of rare earth elements, offering high accuracy and detection limits on the order of ppm, but the instrumentation is expensive and the cost of analysis is high. In this study, a selective electrode for the potentiometric determination of rare earth ions was made and characterized. The membrane manufacturing technique studied is based on immersion (liquid impregnated membrane) of PTFE with 0.5 µm pore size. Tri-butyl phosphate (TBP) and bis(2-ethylhexyl) hydrogen phosphate were used as ionophores. There is no previous report of TBP being used as an ionophore in a lanthanum-based polymeric membrane. Parameters that affect the performance of the membrane electrode, such as membrane composition, membrane thickness, and type of membrane material, were studied in this research. A lanthanum (La) ion-selective electrode (ISE) was manufactured by impregnating the membrane with TBP in kerosene solution and showed good performance as an ISE-La. FTIR spectra of TBP-impregnated PTFE (0.5 µm pore size) and of blank PTFE showed differences at the peaks 1257 cm⁻¹, 1031 cm⁻¹ and 794.7 cm⁻¹, corresponding to P=O stretching and P–O–C stretching of the −O–P=O group. The shift of the P=O stretching wave number of the −O–P=O group in the PTFE-TBP mixture to a peak at 1230 cm⁻¹ indicated that there is no bonding interaction between hydroxyl groups of the molecules and the phosphoryl group of TBP (R3P=O). The membrane had stable responses in the pH range between 1 and 9. Good responses were obtained using a 10⁻³ M La(III) internal solution, which produced a relatively high potential. The ISE-La showed relatively good performance: the electrode had a response time of 29±4.5 seconds and could be used for 50 days. The linear range was between 10⁻⁵ and 10⁻¹ M.

  7. Detecting concealed information in less than a second: response latency-based measures

    NARCIS (Netherlands)

    Verschuere, B.; de Houwer, J.; Verschuere, B.; Ben-Shakhar, G.; Meijer, E.

    2011-01-01

    Concealed information can be accurately assessed with physiological measures. To overcome the practical limitations of physiological measures, an assessment using response latencies has been proposed. At first sight, research findings on response latency based concealed information tests seem

  8. Statistical shape modeling based renal volume measurement using tracked ultrasound

    Science.gov (United States)

    Pai Raikar, Vipul; Kwartowitz, David M.

    2017-03-01

    Autosomal dominant polycystic kidney disease (ADPKD) is the fourth most common cause of kidney transplant worldwide, accounting for 7-10% of all cases. Although ADPKD usually progresses over many decades, accurate risk prediction is an important task.1 Identifying patients with progressive disease is vital to providing new treatments being developed and enabling them to enter clinical trials for new therapy. Among other factors, total kidney volume (TKV) is a major biomarker predicting the progression of ADPKD. The Consortium for Radiologic Imaging Studies in Polycystic Kidney Disease (CRISP)2 has shown that TKV is an early and accurate measure of cystic burden and likely growth rate. It is strongly associated with loss of renal function.3 While ultrasound (US) has proven to be an excellent tool for diagnosing the disease, monitoring short-term changes using ultrasound has been shown not to be accurate. This is attributed to high operator variability and poor reproducibility as compared to tomographic modalities such as CT and MR (the gold standard). Ultrasound has emerged as one of the standout modalities for intra-procedural imaging, and methods for spatial localization have afforded us the ability to track 2D ultrasound in the physical space in which it is being used. In addition to this, the vast amount of recorded tomographic data can be used to generate statistical shape models that allow us to extract clinical value from archived image sets. In this work, we aim at improving the prognostic value of US in managing ADPKD by assessing the accuracy of using statistical shape model augmented US data to predict TKV, with the end goal of monitoring short-term changes.

  9. Customized DSP-based vibration measurement for wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    LaWhite, N.E.; Cohn, K.E. [Second Wind Inc., Somerville, MA (United States)

    1996-12-31

    As part of its Advanced Distributed Monitoring System (ADMS) project funded by NREL, Second Wind Inc. is developing a new vibration measurement system for use with wind turbines. The system uses low-cost accelerometers originally designed for automobile airbag crash-detection coupled with new software executed on a Digital Signal Processor (DSP) device. The system is envisioned as a means to monitor the mechanical "health" of the wind turbine over its lifetime. In addition the system holds promise as a customized emergency vibration detector. The two goals are very different and it is expected that different software programs will be executed for each function. While a fast Fourier transform (FFT) signature under given operating conditions can yield much information regarding turbine condition, the sampling period and processing requirements make it inappropriate for emergency condition monitoring. This paper briefly reviews the development of prototype DSP and accelerometer hardware. More importantly, it reviews our work to design prototype vibration alarm filters. Two-axis accelerometer test data from the experimental FloWind vertical axis wind turbine is analyzed and used as a development guide. Two levels of signal processing are considered. The first uses narrow band pre-processing filters at key fundamental frequencies such as the 1P, 2P and 3P. The total vibration energy in each frequency band is calculated and evaluated as a possible alarm trigger. In the second level of signal processing, the total vibration energy in each frequency band is further decomposed using the two-axis directional information. Directional statistics are calculated to differentiate between linear translations and circular translations. After analyzing the acceleration statistics for normal and unusual operating conditions, the acceleration processing system described could be used in automatic early detection of fault conditions. 9 figs.
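
    The first processing level described above estimates the vibration energy in narrow bands around key fundamentals such as 1P, 2P and 3P. A minimal sketch on a synthetic accelerometer signal, with an assumed sample rate and rotor frequency (the FloWind data and the prototype's actual band definitions are not reproduced):

        import numpy as np

        fs, rotor_hz = 200.0, 0.5                 # assumed sample rate (Hz) and 1P rotor frequency (Hz)
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(0)
        signal = (0.5 * np.sin(2 * np.pi * rotor_hz * t)          # 1P component
                  + 0.2 * np.sin(2 * np.pi * 2 * rotor_hz * t)    # 2P component
                  + 0.1 * np.sin(2 * np.pi * 3 * rotor_hz * t)    # 3P component
                  + 0.05 * rng.standard_normal(t.size))           # broadband noise

        spectrum = np.abs(np.fft.rfft(signal)) ** 2
        freqs = np.fft.rfftfreq(t.size, 1 / fs)

        def band_energy(centre, width=0.1):
            """Total spectral energy within +/- width Hz of a centre frequency."""
            sel = (freqs >= centre - width) & (freqs <= centre + width)
            return spectrum[sel].sum()

        for k in (1, 2, 3):
            print(f"{k}P band energy: {band_energy(k * rotor_hz):.1f}")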

  10. An explicit semantic relatedness measure based on random walk

    Directory of Open Access Journals (Sweden)

    HU Sihui

    2016-10-01

    Full Text Available The calculation of semantic relatedness over an open-domain knowledge network is a significant issue. In this paper, a pheromone strategy drawn from the ant colony algorithm is integrated into a random walk, which is taken as the basic framework for calculating the degree of semantic relatedness. The pheromone distribution is taken as a criterion for determining the tightness of semantic relatedness. A method of calculating the semantic relatedness degree based on random walk is proposed, and the exploration process of calculating the semantic relatedness degree is presented explicitly. The method mainly contains a Path Select Model (PSM) and a Semantic Relatedness Computing Model (SRCM). PSM is used to simulate the path selection of ants and pheromone release. SRCM is used to calculate the semantic relatedness by utilizing the information returned by ants. The result indicates that the method can complete the semantic relatedness calculation in linear complexity and extends the feasible strategies for semantic relatedness calculation.
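
    A minimal sketch of the random-walk backbone of the method above, on a hypothetical toy knowledge network; the pheromone-based path selection of the PSM is omitted, so this only illustrates the basic closeness-by-walk idea:

        import numpy as np

        # Hypothetical adjacency matrix over five concepts in a toy knowledge network.
        A = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 1, 0],
                      [1, 1, 0, 0, 1],
                      [0, 1, 0, 0, 1],
                      [0, 0, 1, 1, 0]], dtype=float)
        P = A / A.sum(axis=1, keepdims=True)      # row-normalised transition probabilities

        def relatedness(start, target, steps=10):
            """Average probability of the walker visiting `target` over a short walk from `start`."""
            dist = np.zeros(len(A))
            dist[start] = 1.0
            score = 0.0
            for _ in range(steps):
                dist = dist @ P
                score += dist[target]
            return score / steps

        print(relatedness(0, 1), relatedness(0, 4))   # direct neighbour vs. more distant concept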

  11. A Clustering-Oriented Closeness Measure Based on Neighborhood Chain and Its Application in the Clustering Ensemble Framework Based on the Fusion of Different Closeness Measures

    Directory of Open Access Journals (Sweden)

    Shaoyi Liang

    2017-09-01

    Full Text Available Closeness measures are crucial to clustering methods. In most traditional clustering methods, the closeness between data points or clusters is measured by the geometric distance alone. These metrics quantify the closeness only based on the concerned data points’ positions in the feature space, and they might cause problems when dealing with clustering tasks having arbitrary cluster shapes and different cluster densities. In this paper, we first propose a novel Closeness Measure between data points based on the Neighborhood Chain (CMNC). Instead of using geometric distances alone, CMNC measures the closeness between data points by quantifying the difficulty for one data point to reach another through a chain of neighbors. Furthermore, based on CMNC, we also propose a clustering ensemble framework that combines CMNC and geometric-distance-based closeness measures together in order to utilize both of their advantages. In this framework, the “bad data points” that are hard to cluster correctly are identified; then different closeness measures are applied to different types of data points to get the unified clustering results. With the fusion of different closeness measures, the framework can get not only better clustering results in complicated clustering tasks, but also higher efficiency.

  12. THE MEASUREMENT BASES AND THE ANALYSIS OF THOSE FOR QUALITATIVE CHARACTERISTICS OF FINANCIAL STATEMENTS

    OpenAIRE

    Hikmet Ulusan

    2008-01-01

    The measurement bases of assets and liabilities for financial reporting basically include: historical cost, replacement cost, net realizable value, value in use, deprival value and fair value. The first part of this study deals with the measurement bases of assets and liabilities for financial reporting. In the second part, the measurement bases are analyzed in terms of the qualitative characteristics that determine the usefulness of information provided in financial statements.

  13. A Laser-Based Measuring System for Online Quality Control of Car Engine Block

    Directory of Open Access Journals (Sweden)

    Xing-Qiang Li

    2016-11-01

    Full Text Available For online quality control of car engine production, the pneumatic measurement instrument plays an unshakeable role in measuring diameters inside the engine block because of its portability and high accuracy. Owing to the limitation of its measuring principle, however, the working space between the pneumatic device and the measured surface is so small that manual operation is required. This lowers the measuring efficiency and is an obstacle to automatic measurement. In this article, a high-speed, automatic measuring system is proposed to take the place of pneumatic devices by using a laser-based measuring unit. The measuring unit is considered as a set of several measuring modules, where each of them acts like a single bore gauge and is made of four laser triangulation sensors (LTSs), which are installed in different positions and in opposite directions. The spatial relationship among these LTSs was calibrated before measurements. Sampling points from measured shaft holes can be collected by the measuring unit. A unified mathematical model was established for both calibration and measurement. Based on the established model, the relative pose between the measuring unit and the measured workpiece does not impact the measuring accuracy. This frees the measuring unit from accurate positioning or adjustment, and makes it possible to realize fast and automatic measurement. The proposed system and method were finally validated by experiments.

  14. A comparison of manual anthropometric measurements with Kinect-based scanned measurements in terms of precision and reliability.

    Science.gov (United States)

    Bragança, Sara; Arezes, Pedro; Carvalho, Miguel; Ashdown, Susan P; Castellucci, Ignacio; Leão, Celina

    2018-01-01

    Collecting anthropometric data for real-life applications demands a high degree of precision and reliability. It is important to test new equipment that will be used for data collection. OBJECTIVE: To compare two anthropometric data gathering techniques - manual methods and a Kinect-based 3D body scanner - to understand which of them gives more precise and reliable results. The data was collected using a measuring tape and a Kinect-based 3D body scanner. It was evaluated in terms of precision by considering the regular and relative Technical Error of Measurement, and in terms of reliability by using the Intraclass Correlation Coefficient, Reliability Coefficient, Standard Error of Measurement and Coefficient of Variation. The results obtained showed that both methods presented better results for reliability than for precision. Both methods showed relatively good results for these two variables; however, manual methods had better results for some body measurements. Despite being considered sufficiently precise and reliable for certain applications (e.g. the apparel industry), the 3D scanner tested showed, for almost every anthropometric measurement, a different result than the manual technique. Many companies design their products based on data obtained from 3D scanners; hence, understanding the precision and reliability of the equipment used is essential to obtain feasible results.
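
    The precision and reliability statistics named above can be computed from paired repeated measurements. A minimal sketch for the Technical Error of Measurement (TEM), relative TEM and the reliability coefficient, on made-up repeated measurements (the study's own data are not reproduced):

        import numpy as np

        # Hypothetical waist circumference (cm) of ten subjects, each measured twice with one method.
        trial1 = np.array([72.1, 80.4, 65.3, 90.2, 77.8, 83.0, 69.5, 74.2, 88.1, 81.6])
        trial2 = np.array([72.5, 80.0, 65.9, 89.8, 78.1, 82.4, 69.9, 74.6, 87.5, 81.9])

        d = trial1 - trial2
        tem = np.sqrt(np.sum(d ** 2) / (2 * len(d)))                    # Technical Error of Measurement
        rel_tem = 100 * tem / np.concatenate([trial1, trial2]).mean()   # relative TEM (%)
        sd_total = np.concatenate([trial1, trial2]).std(ddof=1)
        reliability = 1 - tem ** 2 / sd_total ** 2                      # reliability coefficient R

        print(f"TEM = {tem:.2f} cm, %TEM = {rel_tem:.2f}, R = {reliability:.3f}")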

  15. Automatic calibration system of the temperature instrument display based on computer vision measuring

    Science.gov (United States)

    Li, Zhihong; Li, Jinze; Bao, Changchun; Hou, Guifeng; Liu, Chunxia; Cheng, Fang; Xiao, Nianxin

    2010-07-01

    With the development of computers and of techniques for image processing and computer-based optical measurement, various measuring techniques based on optical image processing are gradually maturing and coming into practical use. On this basis, we draw on many years of experience and on practical needs in temperature measurement and computer vision measurement to propose a fully automatic calibration method for temperature instrument displays that integrates computer vision measurement. It realizes synchronized acquisition of the displayed value and the reference temperature value, and improves calibration efficiency. Based on the least-squares fitting principle, and integrating data processing with optimization theory, it rapidly and accurately realizes automatic acquisition and calibration of temperature.
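
    The calibration described above ultimately rests on a least-squares fit between the automatically read display values and the reference temperatures. A minimal sketch with hypothetical calibration points:

        import numpy as np

        # Hypothetical calibration points: reference temperature (degC) versus the value
        # read from the instrument display by the vision system.
        reference = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
        displayed = np.array([0.4, 25.1, 49.6, 75.5, 99.8])

        slope, intercept = np.polyfit(displayed, reference, deg=1)   # least-squares straight line
        corrected = slope * displayed + intercept
        print(f"correction: T = {slope:.4f} * reading + {intercept:.3f}")
        print("max residual (degC):", np.max(np.abs(corrected - reference)))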

  16. Utility of the Canadian Occupational Performance Measure as an admission and outcome measure in interdisciplinary community-based geriatric rehabilitation

    DEFF Research Database (Denmark)

    Larsen, Anette Enemark; Carlsson, Gunilla

    2012-01-01

    In a community-based geriatric rehabilitation project, the Canadian Occupational Performance Measure (COPM) was used to develop a coordinated, interdisciplinary, and client-centred approach focusing on occupational performance. The purpose of this study was to evaluate the utility of the COPM as ...... physician, home care, occupational therapy, physiotherapy...

  17. Testing for Distortions in Performance Measures: An Application to Residual Income Based Measures like Economic Value Added

    NARCIS (Netherlands)

    Sloof, R.; van Praag, M.

    2015-01-01

    Distorted performance measures in compensation contracts elicit suboptimal behavioral responses that may even prove to be dysfunctional (gaming). This paper applies the empirical test developed by Courty and Marschke (2008) to detect whether the widely used class of Residual Income based performance

  18. An efficient binomial model-based measure for sequence comparison and its application.

    Science.gov (United States)

    Liu, Xiaoqing; Dai, Qi; Li, Lihua; He, Zerong

    2011-04-01

    Sequence comparison is one of the major tasks in bioinformatics; it can serve as evidence of structural and functional conservation, as well as of evolutionary relations. There are several similarity/dissimilarity measures for sequence comparison, but challenges remain. This paper presents a binomial model-based measure for analyzing biological sequences. With the help of a random indicator, the occurrence of a word at any position of a sequence can be regarded as a Bernoulli random variable, and the distribution of the sum of word occurrences is well known to be binomial. Using a recursive formula, we compute the binomial probability of the word count and propose a binomial model-based measure built on the relative entropy. The proposed measure was tested in extensive experiments, including classification of HEV genotypes and phylogenetic analysis, and was further compared with alignment-based and alignment-free measures. The results demonstrate that the proposed binomial model-based measure is more efficient.
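
    The measure sketched in this abstract treats each word occurrence as a Bernoulli trial, turns the word counts into binomial probabilities, and compares sequences through a relative-entropy distance. The sketch below illustrates that pipeline under simplifying assumptions: uniform base composition, a symmetrised Kullback-Leibler divergence, and a direct binomial formula instead of the paper's recursive computation. All names and sequences are invented for the example.

        import math
        from itertools import product

        def word_counts(seq, k):
            """Count occurrences of every k-word (overlapping) in a DNA sequence."""
            counts = {"".join(w): 0 for w in product("ACGT", repeat=k)}
            for i in range(len(seq) - k + 1):
                w = seq[i:i + k]
                if w in counts:
                    counts[w] += 1
            return counts

        def binomial_profile(seq, k):
            """Binomial probability of the observed count of each k-word.

            Under the Bernoulli model the per-position probability of a word is
            (1/4)**k (uniform bases assumed); the count over n positions is Binomial(n, p).
            """
            n = len(seq) - k + 1
            p = 0.25 ** k
            counts = word_counts(seq, k)
            profile = {w: math.comb(n, c) * p ** c * (1 - p) ** (n - c) for w, c in counts.items()}
            total = sum(profile.values())
            # normalise so the profile behaves like a probability vector
            return {w: v / total for w, v in profile.items()}

        def symmetric_relative_entropy(p, q, eps=1e-12):
            """Symmetrised Kullback-Leibler divergence between two word profiles."""
            kl_pq = sum(p[w] * math.log((p[w] + eps) / (q[w] + eps)) for w in p)
            kl_qp = sum(q[w] * math.log((q[w] + eps) / (p[w] + eps)) for w in q)
            return 0.5 * (kl_pq + kl_qp)

        s1, s2 = "ACGTACGTGGCCTTAA", "ACGTTTGGCCAATACG"
        print(symmetric_relative_entropy(binomial_profile(s1, 2), binomial_profile(s2, 2)))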

  19. Electron beam based transversal profile measurements of intense ion beams

    International Nuclear Information System (INIS)

    El Moussati, Said

    2014-01-01

    A non-invasive diagnostic method for the experimental determination of the transverse profile of an intense ion beam has been developed and investigated both theoretically and experimentally within the framework of the present work. The method is based on the deflection of electrons passing through the electromagnetic field of an ion beam. To achieve this, an electron beam with a specifically prepared transverse profile is employed. This distinguishes the method from similar ones that use thin electron beams to scan the electromagnetic field [Roy et al. 2005; Blockland10]. The diagnostic method presented in this work is subsequently called ''Electron-Beam-Imaging'' (EBI). First, the influence of the electromagnetic field of the ion beam on the electrons was analyzed theoretically. It was found that the magnetic field causes only a shift of the electrons along the ion beam axis, while the electric field causes a shift only in a plane transverse to the ion beam. Moreover, in the non-relativistic case the magnetic force is significantly smaller than the Coulomb force, so the electrons merely undergo a shift due to the magnetic field and continue to move parallel to their initial trajectory. Under the influence of the electric field, the electrons move away from the ion beam axis, and their resulting trajectory forms a specific angle with the original direction. This deflection angle depends practically only on the electric field of the ion beam; the magnetic field was therefore neglected when analysing the experimental data. The theoretical model provides a relationship between the deflection angle of the electrons and the charge distribution in the cross section of the ion beam. The model, however, can only be applied for small deflection angles. This implies a relationship between the line-charge density of the ion beam and the initial kinetic energy of the electrons. Numerical investigations have been carried out to clarify the

  20. How Do Undergraduate Students Conceptualize Acid-Base Chemistry? Measurement of a Concept Progression

    Science.gov (United States)

    Romine, William L.; Todd, Amber N.; Clark, Travis B.

    2016-01-01

    We developed and validated a new instrument, called "Measuring Concept progressions in Acid-Base chemistry" (MCAB) and used it to better understand the progression of undergraduate students' understandings about acid-base chemistry. Items were developed based on an existing learning progression for acid-base chemistry. We used the Rasch…

  1. α-Cut method based importance measure for criticality analysis in fuzzy probability – Based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and an area defuzzification technique. •Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability-based fault tree analysis (FPFTA) has recently been developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applicable in nuclear power plant probabilistic safety assessment, FPFTA needs its own corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked against the results generated by four well-known importance measures in conventional fault tree analysis. The results
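
    The abstract names α-cut multiplication, α-cut subtraction and area defuzzification as the building blocks of the proposed importance measure. The sketch below only shows those operations on triangular fuzzy probabilities; the final "importance score" is an illustration of how the pieces can be combined and is not the paper's actual measure. All values are invented.

        import numpy as np

        def alpha_cut_triangular(tfn, alpha):
            """Alpha-cut interval [lower, upper] of a triangular fuzzy number (a, b, c)."""
            a, b, c = tfn
            return np.array([a + alpha * (b - a), c - alpha * (c - b)])

        def interval_mul(x, y):
            products = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
            return np.array([min(products), max(products)])

        def interval_sub(x, y):
            return np.array([x[0] - y[1], x[1] - y[0]])

        def defuzzify_area(cuts, alphas):
            """Crude area-style defuzzification: interval midpoints averaged over alpha levels."""
            mids = [(lo + hi) / 2.0 for lo, hi in cuts]
            return np.trapz(mids, alphas) / (alphas[-1] - alphas[0])

        # Fuzzy failure probabilities of two basic events (triangular, illustrative)
        p1, p2 = (0.01, 0.02, 0.03), (0.001, 0.002, 0.004)
        alphas = np.linspace(0.0, 1.0, 11)

        # AND-gate top-event probability at every alpha level (fuzzy multiplication rule)
        top_cuts = [interval_mul(alpha_cut_triangular(p1, a), alpha_cut_triangular(p2, a)) for a in alphas]

        # Illustrative importance of event 1: change in the defuzzified top-event
        # probability when event 1 is assumed certain (probability 1).
        top_base = defuzzify_area(top_cuts, alphas)
        top_e1_sure = defuzzify_area([alpha_cut_triangular(p2, a) for a in alphas], alphas)
        print("importance of event 1 (illustrative):", top_e1_sure - top_base)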

  2. Quantum authentication based on the randomness of measurement bases in BB84

    International Nuclear Information System (INIS)

    Dang Minh Dung; Bellot, P.; Alleaume, R.

    2005-01-01

    Full text: The establishment of a secret key between two legitimate end points of a communication link, let us name them Alice and Bob, using Quantum Key Distribution (QKD) is unconditionally secure thanks to the laws of quantum physics. However, the various QKD protocols do not intend to provide authentication of the end points: Alice cannot be sure that she is communicating with Bob, and vice versa. Therefore, these protocols are subject to various attacks. The most obvious attack is the man-in-the-middle attack, in which an eavesdropper, let us name her Eve, stands in the middle of the communication link. Alice communicates with Eve while she thinks she is communicating with Bob, and Bob communicates with Eve while he thinks he is communicating with Alice. Eve, acting as a relay, can read all the communications between Alice and Bob and retransmit them. To prevent this kind of attack, the solution is to authenticate the two end points of the communication link. One solution is for Alice and Bob to share an authentication key prior to the communication. In order to improve the security, Alice and Bob must share a set of one-time authentication keys. One-time means that each key has to be used only once, because each time a key is used the eavesdropper Eve can gain a little information about it; re-using the same key many times would finally reveal the key to Eve. However, Eve can simulate the authentication process with Alice many times. Each time Eve simulates the authentication process, one of the pre-positioned keys is depleted, leading to the exhaustion of the set of pre-positioned keys. This type of attack is named a Denial of Service attack. In this work, we propose to use the randomness of the measurement bases in BB84 to build an authentication scheme based on the existence of a pre-positioned authentication key. This authentication scheme can be used with BB84 but also with any other Quantum Key Distribution protocol. It is protected against the Denial of

  3. Research on Water Velocity Measurement of Reservoir Based on Pressure Sensor

    Directory of Open Access Journals (Sweden)

    Xiaoqiang Zhao

    2014-11-01

    Full Text Available To address the problem that a pressure sensor can only measure the liquid level in a reservoir, we designed a current velocity measurement system for reservoirs based on a pressure sensor, analyzed the errors of the velocity measurement system, and proposed an error processing method and the corresponding program. Several tests and experimental results show that in this measurement system the standard deviation of the liquid level measurement is no more than 0.01 cm, and the standard deviation of the current velocity measurement is no more than 0.35 mL/s, which proves that the pressure sensor can measure both liquid level and current velocity synchronously.

  4. Model-based wear measurements in total knee arthroplasty : development and validation of novel radiographic techniques

    NARCIS (Netherlands)

    IJsseldijk, van E.A.

    2016-01-01

    The primary aim of this work was to develop novel model-based mJSW measurement methods using a 3D reconstruction and compare the accuracy and precision of these methods to conventional mJSW measurement. This thesis contributed to the development, validation and clinical application of model-based

  5. Measuring individual significant change on the Beck Depression Inventory-II through IRT-based statistics.

    NARCIS (Netherlands)

    Brouwer, D.; Meijer, R.R.; Zevalkink, D.J.

    2013-01-01

    Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistically significant individual

  6. NO2 DOAS measurements from ground and space: comparison of ground based measurements and OMI data in Mexico City

    Science.gov (United States)

    Rivera, C.; Stremme, W.; Grutter, M.

    2012-04-01

    The combination of satellite data and ground based measurements can provide valuable information about atmospheric chemistry and air quality. In this work we present a comparison between measured ground based NO2 differential columns at the Universidad Nacional Autónoma de México (UNAM) in Mexico City, using the Differential Optical Absorption Spectroscopy (DOAS) technique and NO2 total columns measured by the Ozone Monitoring Instrument (OMI) onboard the Aura satellite using the same measurement technique. From these data, distribution maps of average NO2 above the Mexico basin were constructed and hot spots inside the city could be identified. In addition, a clear footprint was detected from the Tula industrial area, ~50 km northwest of Mexico City, where a refinery, a power plant and other industries are located. A less defined footprint was identified in the Cuernavaca basin, South of Mexico City, and the nearby cities of Toluca and Puebla do not present strong enhancements in the NO2 total columns. With this study we expect to cross-validate space and ground measurements and provide useful information for future studies.

  7. The Relationship between Video Game Use and a Performance-Based Measure of Persistence

    Science.gov (United States)

    Ventura, Matthew; Shute, Valerie; Zhao, Weinan

    2013-01-01

    An online performance-based measure of persistence was developed using anagrams and riddles. Persistence was measured by recording the time spent on unsolved anagrams and riddles. Time spent on unsolved problems was correlated to a self-report measure of persistence. Additionally, frequent video game players spent longer times on unsolved problems…

  8. The Beast of Aggregating Cognitive Load Measures in Technology-Based Learning

    Science.gov (United States)

    Leppink, Jimmie; van Merriënboer, Jeroen J. G.

    2015-01-01

    An increasing part of cognitive load research in technology-based learning includes a component of repeated measurements, that is: participants are measured two or more times on the same performance, mental effort or other variable of interest. In many cases, researchers aggregate scores obtained from repeated measurements to one single sum or…

  9. Coupon Test of an Elbow Component by Using Vision-based Measurement System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Wan; Jeon, Bub Gyu; Choi, Hyoung Suk; Kim, Nam Sik [Pusan National University, Busan (Korea, Republic of)

    2016-05-15

    Among the various methods to overcome this shortcoming, vision-based methods for measuring the strain of a structure are being proposed and many studies are being conducted on them. The vision-based measurement method is a noncontact method for measuring the displacement and strain of objects by comparing images taken before and after deformation. This method offers advantages such as no limitations on the surface condition, temperature, and shape of objects, the possibility of full-field measurement, and the possibility of mapping the distribution of stress or defects in structures based on the measured displacement and strain. In the coupon test, the strains were measured with various image-based methods and the measurements were compared. In the future, the validity of the algorithm will be verified using a strain gauge and a clip gauge, and based on the results, the physical properties of materials will be measured using a vision-based measurement system. This will contribute to the evaluation of the reliability and effectiveness required for investigating local damage.

  10. Coupon Test of an Elbow Component by Using Vision-based Measurement System

    International Nuclear Information System (INIS)

    Kim, Sung Wan; Jeon, Bub Gyu; Choi, Hyoung Suk; Kim, Nam Sik

    2016-01-01

    Among the various methods to overcome this shortcoming, vision-based methods for measuring the strain of a structure are being proposed and many studies are being conducted on them. The vision-based measurement method is a noncontact method for measuring the displacement and strain of objects by comparing images taken before and after deformation. This method offers advantages such as no limitations on the surface condition, temperature, and shape of objects, the possibility of full-field measurement, and the possibility of mapping the distribution of stress or defects in structures based on the measured displacement and strain. In the coupon test, the strains were measured with various image-based methods and the measurements were compared. In the future, the validity of the algorithm will be verified using a strain gauge and a clip gauge, and based on the results, the physical properties of materials will be measured using a vision-based measurement system. This will contribute to the evaluation of the reliability and effectiveness required for investigating local damage

  11. Measuring sustainability by Energy Efficiency Analysis for Korean Power Companies: A Sequential Slacks-Based Efficiency Measure

    Directory of Open Access Journals (Sweden)

    Ning Zhang

    2014-03-01

    Full Text Available Improving energy efficiency has been widely regarded as one of the most cost-effective ways to improve sustainability and mitigate climate change. This paper presents a sequential slack-based efficiency measure (SSBM application to model total-factor energy efficiency with undesirable outputs. This approach simultaneously takes into account the sequential environmental technology, total input slacks, and undesirable outputs for energy efficiency analysis. We conduct an empirical analysis of energy efficiency incorporating greenhouse gas emissions of Korean power companies during 2007–2011. The results indicate that most of the power companies are not performing at high energy efficiency. Sequential technology has a significant effect on the energy efficiency measurements. Some policy suggestions based on the empirical results are also presented.

  12. Advanced measurement systems based on digital processing techniques for superconducting LHC magnets

    CERN Document Server

    Masi, Alessandro; Cennamo, Felice

    The Large Hadron Collider (LHC), a particle accelerator aimed at exploring deeper into matter than ever before, is currently being constructed at CERN. Beam optics of the LHC requires stringent control of the field quality of about 8400 superconducting magnets, including 1232 main dipoles and 360 main quadrupoles, to assure correct machine operation. The measurement challenges are various: accuracy on the field strength measurement up to 50 ppm, harmonics in the ppm range, measurement equipment robustness, and low measurement times to characterize fast field phenomena. New magnetic measurement systems, principally based on analog solutions, have been developed at CERN to achieve these goals. This work proposes the introduction of digital technologies to improve the measurement performance of three systems, aimed at different measurement targets and characterized by different accuracy levels. The high accuracy measurement systems, based on rotating coils, exhibit high performance in static magnetic fields. With vary...

  13. Study on Method of Ultrasonic Gas Temperature Measure Based on FPGA

    Energy Technology Data Exchange (ETDEWEB)

    Wen, S H; Xu, F R [Institute of Electrical Engineering, Yanshan University, Qinhuangdao, 066004 (China)

    2006-10-15

    It has always been a problem to measure the instantaneous temperature of high-temperature, high-pressure gas. Conventional temperature measurement methods have difficulty measuring quickly and exactly; their precision is low and their anti-jamming ability is poor. This article therefore introduces a method of measuring burning gas temperature using ultrasound, based on a Field-Programmable Gate Array (FPGA). The mathematical model for the temperature measurement is built from the relation between the ultrasonic propagation velocity and the gas temperature in Kelvin for an ideal gas. The temperature can be worked out by measuring the ultrasonic frequency difference Δf. An FPGA is introduced and a high-precision data acquisition system based on digital phase-shift technology is designed. The feasibility of the proposed approach is further confirmed by measuring the pressure of the burning gas in real time. Experimental results demonstrate that the error is less than 12 and the precision is improved to 0.8%.
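
    The temperature model quoted above follows from the ideal-gas sound-speed relation c = sqrt(γRT/M), so T = c²M/(γR). The following is a minimal sketch of that conversion, using a transit-time measurement in place of the paper's frequency-difference scheme and dry-air constants as placeholder values.

        GAS_CONSTANT = 8.314  # J/(mol*K)

        def gas_temperature_from_transit(path_length_m, transit_time_s, gamma=1.4, molar_mass_kg=0.029):
            """Kelvin temperature from a measured ultrasonic transit time.

            Uses c = sqrt(gamma * R * T / M)  =>  T = c^2 * M / (gamma * R).
            gamma and molar mass default to dry-air values; combustion gas needs its own values.
            """
            c = path_length_m / transit_time_s  # speed of sound along the path
            return c ** 2 * molar_mass_kg / (gamma * GAS_CONSTANT)

        # Example: a 0.5 m path crossed in 1.203 ms corresponds to roughly 430 K
        print(gas_temperature_from_transit(0.5, 1.203e-3))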

  14. A nuclear radiation multi-parameter measurement system based on pulse-shape sampling

    International Nuclear Information System (INIS)

    Qiu Xiaolin; Fang Guoming; Xu Peng; Di Yuming

    2007-01-01

    In this paper, a nuclear radiation multi-parameter measurement system based on pulse-shape sampling is introduced, including the system's characteristics, composition, operating principle, experimental data and analysis. Compared with conventional nuclear measuring apparatus, it has some remarkable advantages, such as synchronous detection using multi-parameter measurement on the same measurement platform and general analysis of signal data by user-defined programs. (authors)

  15. The robustness and accuracy of in vivo linear wear measurements for knee prostheses based on model-based RSA.

    Science.gov (United States)

    van Ijsseldijk, E A; Valstar, E R; Stoel, B C; Nelissen, R G H H; Reiber, J H C; Kaptein, B L

    2011-10-13

    Accurate in vivo measurement methods of wear in total knee arthroplasty are required for timely detection of excessive wear and to assess new implant designs. Component separation measurements based on model-based Roentgen stereophotogrammetric analysis (RSA), in which 3-dimensional reconstruction methods are used, have shown promising results, yet the robustness of these measurements is unknown. In this study, the accuracy and robustness of this measurement for clinical usage was assessed. The validation experiments were conducted in an RSA setup with a phantom of a knee in a vertical orientation. 72 RSA images were created using different variables for knee orientation, two prosthesis types (fixed-bearing Duracon knee and fixed-bearing Triathlon knee) and accuracies of the reconstruction models. The measurement error was determined for absolute and relative measurements, and the effect of knee positioning and true separation distance was determined. The measurement method overestimated the separation distance by 0.1 mm on average. The precision of the method was 0.10 mm (2*SD) for the Duracon prosthesis and 0.20 mm for the Triathlon prosthesis. A slight difference in error was found between the measurements with 0° and 10° anterior tilt (difference = 0.08 mm, p = 0.04). An accuracy of 0.1 mm and precision of 0.2 mm can be achieved for linear wear measurements based on model-based RSA, which is more than adequate for clinical applications. The measurement is robust in clinical settings. Although anterior tilt seems to influence the measurement, the size of this influence is small and clinically irrelevant. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Note: A resonating reflector-based optical system for motion measurement in micro-cantilever arrays

    International Nuclear Information System (INIS)

    Sathishkumar, P.; Punyabrahma, P.; Sri Muthu Mrinalini, R.; Jayanth, G. R.

    2015-01-01

    A robust, compact optical measurement unit for motion measurement in micro-cantilever arrays enables development of portable micro-cantilever sensors. This paper reports on an optical beam deflection-based system to measure the deflection of micro-cantilevers in an array that employs a single laser source, a single detector, and a resonating reflector to scan the measurement laser across the array. A strategy is also proposed to extract the deflection of individual cantilevers from the acquired data. The proposed system and measurement strategy are experimentally evaluated and demonstrated to measure motion of multiple cantilevers in an array

  17. Novel Agent Based-approach for Industrial Diagnosis: A Combined use Between Case-based Reasoning and Similarity Measure

    Directory of Open Access Journals (Sweden)

    Fatima Zohra Benkaddour

    2016-12-01

    Full Text Available In the spunlace nonwovens industry, the maintenance task is very complex and requires collaboration between experts and operators. In this paper, we propose a new approach integrating agent-based modelling with case-based reasoning that utilizes similarity measures and a preferences module. The main purpose of our study is to compare and evaluate the most suitable similarity measure for our case. Furthermore, operators, who are usually geographically dispersed, have to collaborate and negotiate to achieve mutual agreements, especially when their proposals (diagnoses) lead to a conflicting situation. The experimentation shows that the suggested agent-based approach is very interesting and efficient for operators and experts who collaborate in the INOTIS enterprise.

  18. A chiral sensor based on weak measurement for the determination of Proline enantiomers in diverse measuring circumstances.

    Science.gov (United States)

    Li, Dongmei; Guan, Tian; He, Yonghong; Liu, Fang; Yang, Anping; He, Qinghua; Shen, Zhiyuan; Xin, Meiguo

    2018-07-01

    A new chiral sensor based on weak measurement for accurately measuring the optical rotation (OR) has been developed for the estimation of trace amounts of chiral molecules. With the principle of optical weak measurement in the frequency domain, the central wavelength shift of the output spectra is quantitatively related to the angle of the preselected polarization. Hence, a chiral molecule (e.g., an L-amino acid or a D-amino acid) can be enantioselectively determined by modifying the preselection angle with the OR, which causes a rotation of the polarization plane. The concentration of the chiral sample, corresponding to its optical activity, is quantitatively analyzed through the central wavelength shift of the output spectra, which can be collected in real time. Immune to refractive index changes, the proposed chiral sensor is valid in complicated measuring circumstances. Detection of Proline enantiomer concentrations in different solvents was implemented. The results demonstrated that weak measurement is a reliable method for chiral recognition of Proline enantiomers in diverse circumstances, with the merits of high precision and good robustness. In addition, this real-time monitoring approach can play a crucial part in asymmetric synthesis and biological systems. Copyright © 2018. Published by Elsevier B.V.

  19. Unobtrusive measurement of indoor energy expenditure using an infrared sensor-based activity monitoring system.

    Science.gov (United States)

    Hwang, Bosun; Han, Jonghee; Choi, Jong Min; Park, Kwang Suk

    2008-11-01

    The purpose of this study was to develop an unobtrusive energy expenditure (EE) measurement system using an infrared (IR) sensor-based activity monitoring system to measure indoor activities and to estimate individual quantitative EE. IR-sensor activation counts were measured with a Bluetooth-based monitoring system and the standard EE was calculated using an established regression equation. Ten male subjects participated in the experiment and three different EE measurement systems (gas analyzer, accelerometer, IR sensor) were used simultaneously in order to determine the regression equation and evaluate the performance. As a standard measurement, oxygen consumption was simultaneously measured by a portable metabolic system (Metamax 3X, Cortex, Germany). A single-room experiment was performed to develop a regression model of the standard EE measurement from the proposed IR sensor-based measurement system. In addition, correlation and regression analyses were done to compare the performance of the IR system with that of the Actigraph system. We determined that our proposed IR-based EE measurement system correlates with the standard measurement system similarly to the Actigraph system.
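
    The EE estimate described above comes from a regression of reference energy expenditure on IR activation counts. The following is a minimal sketch of such a regression; the counts and EE values are invented for the example.

        import numpy as np

        # Per-minute IR-sensor activation counts and reference energy expenditure
        # (kcal/min) from a portable metabolic system -- illustrative values only.
        ir_counts = np.array([3, 8, 15, 22, 30, 41, 55])
        ee_kcal_min = np.array([1.1, 1.4, 1.9, 2.4, 2.9, 3.6, 4.5])

        # Ordinary least-squares regression used to predict EE from the IR counts
        slope, intercept = np.polyfit(ir_counts, ee_kcal_min, deg=1)
        predicted = slope * ir_counts + intercept
        r = np.corrcoef(ee_kcal_min, predicted)[0, 1]

        print(f"EE ~ {slope:.3f} * counts + {intercept:.3f}  (r = {r:.3f})")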

  20. Measurement

    NARCIS (Netherlands)

    Boumans, M.; Durlauf, S.N.; Blume, L.E.

    2008-01-01

    Measurement theory takes measurement as the assignment of numbers to properties of an empirical system so that a homomorphism between the system and a numerical system is established. To avoid operationalism, two approaches can be distinguished. In the axiomatic approach it is asserted that if the

  1. Fault-tolerant measurement-based quantum computing with continuous-variable cluster states.

    Science.gov (United States)

    Menicucci, Nicolas C

    2014-03-28

    A long-standing open question about Gaussian continuous-variable cluster states is whether they enable fault-tolerant measurement-based quantum computation. The answer is yes. Initial squeezing in the cluster above a threshold value of 20.5 dB ensures that errors from finite squeezing acting on encoded qubits are below the fault-tolerance threshold of known qubit-based error-correcting codes. By concatenating with one of these codes and using ancilla-based error correction, fault-tolerant measurement-based quantum computation of theoretically indefinite length is possible with finitely squeezed cluster states.

  2. A dental implant-based registration method for measuring mandibular kinematics using cone beam computed tomography-based fluoroscopy.

    Science.gov (United States)

    Lin, Cheng-Chung; Chen, Chien-Chih; Chen, Yunn-Jy; Lu, Tung-Wu; Hong, Shih-Wun

    2014-01-01

    This study aimed to develop and evaluate experimentally an implant-based registration method for measuring three-dimensional (3D) kinematics of the mandible and dental implants in the mandible based on dental cone beam computed tomography (CBCT), modified to include fluoroscopic function. The proposed implant-based registration method was based on the registration of CBCT data of implants/bones with single-plane fluoroscopy images. Seven registration conditions that included one to three implants were evaluated experimentally for their performance in a cadaveric porcine head model. The implant-based registration method was shown to have measurement errors (SD) of less than -0.2 (0.3) mm, 1.1 (2.2) mm, and 0.7 degrees (1.3 degrees) for the in-plane translation, out-of-plane translation, and all angular components, respectively, regardless of the number of implants used. The corresponding errors were reduced to less than -0.1 (0.1) mm, -0.3 (1.7) mm, and 0.5 degree (0.4 degree) when three implants were used. An implant-based registration method was developed to measure the 3D kinematics of the mandible/implants. With its high accuracy and reliability, the new method will be useful for measuring the 3D motion of the bones/implants for relevant applications.

  3. Attention-based image similarity measure with application to content-based information retrieval

    Science.gov (United States)

    Stentiford, Fred W. M.

    2003-01-01

    Whilst storage and capture technologies are able to cope with huge numbers of images, image retrieval is in danger of rendering many repositories valueless because of the difficulty of access. This paper proposes a similarity measure that imposes only very weak assumptions on the nature of the features used in the recognition process. This approach does not make use of a pre-defined set of feature measurements which are extracted from a query image and used to match those from database images, but instead generates features on a trial and error basis during the calculation of the similarity measure. This has the significant advantage that features that determine similarity can match whatever image property is important in a particular region whether it be a shape, a texture, a colour or a combination of all three. It means that effort is expended searching for the best feature for the region rather than expecting that a fixed feature set will perform optimally over the whole area of an image and over every image in a database. The similarity measure is evaluated on a problem of distinguishing similar shapes in sets of black and white symbols.

  4. Teleoperation environment based on virtual reality. Application of two-planes method for position measurement

    International Nuclear Information System (INIS)

    Yoshikawa, Hidekazu; Tezuka, Tetsuo; Inoue, Ryuji

    1998-01-01

    A teleoperation system based on a virtual environment (VE) is an emerging technology for operating a robot in a remote or hazardous environment. We have developed a VE-based teleoperation system for robot-arm manipulation in a simplified real world. The VE for manipulating the robot arm is constructed by measuring the 3D positions of the objects around the robot arm with the motion-stereo method. The 3D position is estimated using the two-(calibration-)planes method based on images captured by the CCD camera on the robot arm, since the two-planes method does not require a pin-hole-model assumption for the camera system. The precision of this 3D measurement is evaluated through experiments, and a theoretical model of the measurement error is then derived. This measurement system is applied to a VE-based teleoperation experiment for peg-in-hole practice with the robot arm. (author)

  5. Hunter versus CIE color measurement systems for analysis of milk-based beverages.

    Science.gov (United States)

    Cheng, Ni; Barbano, David M; Drake, Mary Anne

    2018-06-01

    The objective of our work was to determine the differences in sensitivity of Hunter and International Commission on Illumination (CIE) methods at 2 different viewer angles (2 and 10°) for measurement of whiteness, red/green, and blue/yellow color of milk-based beverages over a range of composition. Sixty combinations of milk-based beverages were formulated (2 replicates) with a range of fat level from 0.2 to 2%, true protein level from 3 to 5%, and casein as a percent of true protein from 5 to 80% to provide a wide range of milk-based beverage color. In addition, commercial skim, 1 and 2% fat high-temperature, short-time pasteurized fluid milks were analyzed. All beverage formulations were HTST pasteurized and cooled to 4°C before analysis. Color measurement viewer angle (2 vs. 10°) had very little effect on objective color measures of milk-based beverages with a wide range of composition for either the Hunter or CIE color measurement system. Temperature (4, 20, and 50°C) of color measurement had a large effect on the results of color measurement in both the Hunter and CIE measurement systems. The effect of milk beverage temperature on color measurement results was the largest for skim milk and the least for 2% fat milk. This highlights the need for proper control of beverage serving temperature for sensory panel analysis of milk-based beverages with very low fat content and for control of milk temperature when doing objective color analysis for quality control in manufacture of milk-based beverages. The Hunter system of color measurement was more sensitive to differences in whiteness among milk-based beverages than the CIE system, whereas the CIE system was much more sensitive to differences in yellowness among milk-based beverages. There was little difference between the Hunter and CIE system in sensitivity to green/red color of milk-based beverages. In defining milk-based beverage product specifications for objective color measures for dairy product

  6. The thin-walled abnormity measurement technology research based on CCD

    International Nuclear Information System (INIS)

    Wang Bin

    2014-01-01

    Thin-walled irregular parts have the following characteristics: the measured parameters are spatial structural dimensions, so special positioning fixtures need to be designed to complete the detection; the wall is very thin; the machining is composite; and the size is small while the shape is complex, which makes it difficult to extract image edges with optical measurement methods. In this paper, a dedicated CCD-based measurement method built on image measurement techniques is put forward; this kind of part can be measured quickly, accurately and automatically by designing a high-precision positioning fixture and an image acquisition scheme. At the same time, a comprehensive evaluation standard is given to assess the accuracy of the measurement method, ensuring the reliability of the measurement method. (author)

  7. Windows pollution problems of the dust concentration measurement based on scattering method

    International Nuclear Information System (INIS)

    Zhao Yanjun; Zhang Yongtao; Shi Xinyue; Xu Chuanlong; Wang Shimin

    2009-01-01

    The windows separate the measurement system from the dust-laden space in a light-scattering dust concentration measurement system. The windows are unavoidably polluted by the dust, and a measurement error is produced. Based on Mie scattering theory, this measurement error is investigated in this paper. The numerical simulation results show that the measurement error is related to the particle diameter distribution and the refractive index, but is independent of the average particle diameter. A novel photoelectric sensor is developed in this paper in order to address the measurement error caused by window pollution. A calculation method is put forward that can correct the measurement errors caused by window pollution and improve the measurement accuracy.

  8. Improving the psychometric properties of dot-probe attention measures using response-based computation.

    Science.gov (United States)

    Evans, Travis C; Britton, Jennifer C

    2018-09-01

    Abnormal threat-related attention in anxiety disorders is most commonly assessed and modified using the dot-probe paradigm; however, poor psychometric properties of reaction-time measures may contribute to inconsistencies across studies. Typically, standard attention measures are derived using average reaction-times obtained in experimentally-defined conditions. However, current approaches based on experimentally-defined conditions are limited. In this study, the psychometric properties of a novel response-based computation approach to analyze dot-probe data are compared to standard measures of attention. 148 adults (19.19 ± 1.42 years, 84 women) completed a standardized dot-probe task including threatening and neutral faces. We generated both standard and response-based measures of attention bias, attentional orientation, and attentional disengagement. We compared overall internal consistency, number of trials necessary to reach internal consistency, test-retest reliability (n = 72), and criterion validity obtained using each approach. Compared to standard attention measures, response-based measures demonstrated uniformly high levels of internal consistency with relatively few trials and varying improvements in test-retest reliability. Additionally, response-based measures demonstrated specific evidence of anxiety-related associations above and beyond both standard attention measures and other confounds. Future studies are necessary to validate this approach in clinical samples. Response-based attention measures demonstrate superior psychometric properties compared to standard attention measures, which may improve the detection of anxiety-related associations and treatment-related changes in clinical samples. Copyright © 2018 Elsevier Ltd. All rights reserved.
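
    For contrast with the response-based computation proposed in this record, the conventional, experimentally defined dot-probe bias score is simply the mean reaction-time difference between incongruent and congruent trials. The following is a small sketch with invented reaction times; it illustrates the standard measure, not the authors' response-based approach.

        import numpy as np

        def attention_bias_score(rt_incongruent_ms, rt_congruent_ms):
            """Standard (experimentally defined) dot-probe bias score.

            Positive values indicate attention toward threat: probes replacing threat
            faces (congruent trials) are answered faster than probes replacing neutral
            faces (incongruent trials).
            """
            return np.mean(rt_incongruent_ms) - np.mean(rt_congruent_ms)

        congruent = [512, 498, 530, 505, 521]    # RTs (ms), probe at threat location
        incongruent = [540, 528, 552, 533, 545]  # RTs (ms), probe at neutral location
        print(attention_bias_score(incongruent, congruent))  # ~28 ms toward threat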

  9. Radiation Measurement from Mobile Base Stations at a University Campus in Malaysia

    OpenAIRE

    Md. R. Islam; Othman O. Khalifa; Liakot Ali; Amir Azli; Mohd Zulkarnain

    2006-01-01

    The tremendous growth of the telecommunication industry means that the number of mobile phone users increases every day. In order to support the growing number of users, mobile base stations can be seen almost everywhere. This scenario has created unease among people who fear they may be affected by the radiation from the antennas. A measurement campaign was carried out at student hostels and office premises near base stations at the International Islamic University Malaysia, Gombak campus. Measured va...

  10. Flow-based vulnerability measures for network component importance: Experimentation with preparedness planning

    International Nuclear Information System (INIS)

    Nicholson, Charles D.; Barker, Kash; Ramirez-Marquez, Jose E.

    2016-01-01

    This work develops and compares several flow-based vulnerability measures to prioritize important network edges for the implementation of preparedness options. These network vulnerability measures quantify different characteristics and perspectives on enabling maximum flow, creating bottlenecks, and partitioning into cutsets, among others. The efficacy of these vulnerability measures to motivate preparedness options against experimental geographically located disruption simulations is measured. Results suggest that a weighted flow capacity rate, which accounts for both (i) the contribution of an edge to maximum network flow and (ii) the extent to which the edge is a bottleneck in the network, shows most promise across four instances of varying network sizes and densities. - Highlights: • We develop new flow-based measures of network vulnerability. • We apply these measures to determine the importance of edges after disruptions. • Networks of varying size and density are explored.
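
    One of the measures highlighted above, a weighted flow capacity rate, combines an edge's contribution to maximum flow with its tendency to act as a bottleneck. The sketch below is one plausible reading of such a score, built on networkx's max-flow routine with an invented toy network; it is not the authors' exact formulation.

        import networkx as nx

        def weighted_flow_capacity_rate(G, source, sink):
            """Illustrative edge-importance score combining (i) the share of maximum
            s-t flow an edge carries and (ii) how close that edge is to saturation."""
            flow_value, flow_dict = nx.maximum_flow(G, source, sink, capacity="capacity")
            scores = {}
            for u, v, data in G.edges(data=True):
                f = flow_dict[u].get(v, 0.0)
                contribution = f / flow_value if flow_value > 0 else 0.0  # share of total flow
                utilisation = f / data["capacity"]                        # bottleneck tendency
                scores[(u, v)] = contribution * utilisation
            return scores

        G = nx.DiGraph()
        G.add_edge("s", "a", capacity=10)
        G.add_edge("s", "b", capacity=5)
        G.add_edge("a", "t", capacity=7)
        G.add_edge("b", "t", capacity=9)
        G.add_edge("a", "b", capacity=4)

        for edge, score in sorted(weighted_flow_capacity_rate(G, "s", "t").items(), key=lambda x: -x[1]):
            print(edge, round(score, 3))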

  11. Quality assurance of in-situ measurements of land surface albedo: A model-based approach

    Science.gov (United States)

    Adams, Jennifer; Gobron, Nadine; Widlowski, Jean-Luc; Mio, Corrado

    2016-04-01

    This paper presents the development of a model-based framework for assessing the quality of in-situ measurements of albedo used to validate land surface albedo products. Using a 3D Monte Carlo Ray Tracing (MCRT) radiative transfer model, a quality assurance framework is built based on simulated field measurements of albedo within complex 3D canopies and under various illumination scenarios. This method provides an unbiased approach in assessing the quality of field measurements, and is also able to trace the contributions of two main sources of uncertainty in field-measurements of albedo; those resulting from 1) the field measurement protocol, such as height or placement of field measurement within the canopy, and 2) intrinsic factors of the 3D canopy under specific illumination characteristics considered, such as the canopy structure and landscape heterogeneity, tree heights, ecosystem type and season.

  12. Latency-Based and Psychophysiological Measures of Sexual Interest Show Convergent and Concurrent Validity.

    Science.gov (United States)

    Ó Ciardha, Caoilte; Attard-Johnson, Janice; Bindemann, Markus

    2018-04-01

    Latency-based measures of sexual interest require additional evidence of validity, as do newer pupil dilation approaches. A total of 102 community men completed six latency-based measures of sexual interest. Pupillary responses were recorded during three of these tasks and in an additional task where no participant response was required. For adult stimuli, there was a high degree of intercorrelation between measures, suggesting that tasks may be measuring the same underlying construct (convergent validity). In addition to being correlated with one another, measures also predicted participants' self-reported sexual interest, demonstrating concurrent validity (i.e., the ability of a task to predict a more validated, simultaneously recorded, measure). Latency-based and pupillometric approaches also showed preliminary evidence of concurrent validity in predicting both self-reported interest in child molestation and viewing pornographic material containing children. Taken together, the study findings build on the evidence base for the validity of latency-based and pupillometric measures of sexual interest.

  13. Electromagnetic fields from base stations for cellular mobile telephones. Measurements around base stations in the Oslo area

    International Nuclear Information System (INIS)

    Hannevik, Merete

    2000-01-01

    Measurements of radio-frequency radiation from base station antennas for cellular mobile telephony have been performed. Measurements were made inside buildings in the area just behind or below antennas mounted on walls or rooftops, and on the ground below tower-mounted antennas. Except for the area 2-3 meters just in front of the antennas, the electric field levels were well below the international guidelines. (Author)

  14. Computer based methods for measurement of joint space width: update of an ongoing OMERACT project

    NARCIS (Netherlands)

    Sharp, John T.; Angwin, Jane; Boers, Maarten; Duryea, Jeff; von Ingersleben, Gabriele; Hall, James R.; Kauffman, Joost A.; Landewé, Robert; Langs, Georg; Lukas, Cédric; Maillefert, Jean-Francis; Bernelot Moens, Hein J.; Peloschek, Philipp; Strand, Vibeke; van der Heijde, Désirée

    2007-01-01

    Computer-based methods of measuring joint space width (JSW) could potentially have advantages over scoring joint space narrowing, with regard to increased standardization, sensitivity, and reproducibility. In an early exercise, 4 different methods showed good agreement on measured change in JSW over

  15. Determinants of the use of value-based performance measures for managerial performance evaluation

    NARCIS (Netherlands)

    Dekker, H.C.; Groot, T.L.C.M.; Schoute, M.; Wiersma, E.

    2012-01-01

    As value-based (VB) performance measures include firms' cost of capital, they are considered more congruent than earnings measures. Prior studies, however, find that their use for managerial performance evaluation is less extensive than their presumed benefits would suggest. We examine how the

  16. Computer-based measurement and automation application research in nuclear technology fields

    International Nuclear Information System (INIS)

    Jiang Hongfei; Zhang Xiangyang

    2003-01-01

    This paper introduces research on the application of computer-based measurement and automation in nuclear technology fields. The emphasis is on the role of software in system development and on a network-based measurement and control software model with promising application prospects. Examples of research and development applications are presented. (authors)

  17. Density-independent algorithm for sensing moisture content of sawdust based on reflection measurements

    Science.gov (United States)

    A density-independent algorithm for moisture content determination in sawdust, based on a one-port reflection measurement technique, is proposed for the first time. Performance of this algorithm is demonstrated through measurement of the dielectric properties of sawdust with an open-ended half-mode s...

  18. Failing Tests: Commentary on "Adapting Educational Measurement to the Demands of Test-Based Accountability"

    Science.gov (United States)

    Thissen, David

    2015-01-01

    In "Adapting Educational Measurement to the Demands of Test-Based Accountability" Koretz takes the time-honored engineering approach to educational measurement, identifying specific problems with current practice and proposing minimal modifications of the system to alleviate those problems. In response to that article, David Thissen…

  19. Assessing the Reliability of Curriculum-Based Measurement: An Application of Latent Growth Modeling

    Science.gov (United States)

    Yeo, Seungsoo; Kim, Dong-Il; Branum-Martin, Lee; Wayman, Miya Miura; Espin, Christine A.

    2012-01-01

    The purpose of this study was to demonstrate the use of Latent Growth Modeling (LGM) as a method for estimating reliability of Curriculum-Based Measurement (CBM) progress-monitoring data. The LGM approach permits the error associated with each measure to differ at each time point, thus providing an alternative method for examining of the…

  20. Item-Level and Construct Evaluation of Early Numeracy Curriculum-Based Measures

    Science.gov (United States)

    Lee, Young-Sun; Lembke, Erica; Moore, Douglas; Ginsburg, Herbert P.; Pappas, Sandra

    2012-01-01

    The present study examined the technical adequacy of curriculum-based measures (CBMs) of early numeracy. Six 1-min early mathematics tasks were administered to 137 kindergarten and first-grade students, along with an omnibus test of early mathematics. The CBM measures included Count Out Loud, Quantity Discrimination, Number Identification, Missing…

  1. Using the Clinical Interview and Curriculum Based Measurement to Examine Risk Levels

    Science.gov (United States)

    Ginsburg, Herbert P.; Lee, Young-Sun; Pappas, Sandra

    2016-01-01

    This paper investigates the power of the computer guided clinical interview (CI) and new curriculum based measurement (CBM) measures to identify and help children at risk of low mathematics achievement. We use data from large numbers of children in Kindergarten through Grade 3 to investigate the construct validity of CBM risk categories. The basic…

  2. A Cartoon-Based Measure of PTSD Symptomatology in Children Exposed to a Disaster

    Science.gov (United States)

    Elklit, Ask; Nielsen, Louise Hjort; Lasgaard, Mathias; Duch, Christina

    2013-01-01

    Research on childhood posttraumatic stress disorder (PTSD) is sparse. This is partly due to the limited availability of empirically validated measures for children who are insecure readers. The present study examined the reliability and validity of a cartoon-based measure of PTSD symptoms in children exposed to a disaster. Cartoons were generated…

  3. Measurement with corrugated tubes of early-age autogenous shrinkage of cement-based material

    DEFF Research Database (Denmark)

    Tian, Qian; Jensen, Ole Mejlhede

    2009-01-01

    The use of a special corrugated mould enables transformation of volume strain into horizontal, linear strain measurement in the fluid stage. This allows continuous measurement of the autogenous shrinkage of cement-based materials since casting, and also effectively eliminates unwanted influence...

  4. The Development of Instruments to Measure Motivational Interviewing Skill Acquisition for School-Based Personnel

    Science.gov (United States)

    Small, Jason W.; Lee, Jon; Frey, Andy J.; Seeley, John R.; Walker, Hill M.

    2014-01-01

    As specialized instructional support personnel begin learning and using motivational interviewing (MI) techniques in school-based settings, there is growing need for context-specific measures to assess initial MI skill development. In this article, we describe the iterative development and preliminary evaluation of two measures of MI skill adapted…

  5. Adjustments of microwave-based measurements on coal moisture using natural radioactivity techniques

    Energy Technology Data Exchange (ETDEWEB)

    Prieto-Fernandez, I.; Luengo-Garcia, J.C.; Alonso-Hidalgo, M.; Folgueras-Diaz, B. [University of Oviedo, Gijon (Spain)

    2006-01-07

    The use of nonconventional on-line measurements of moisture and ash content in coal is presented. The background research is briefly reviewed. The possibilities of adjusting microwave-based moisture measurements using natural radioactive techniques, and vice versa, are proposed. The results obtained from the simultaneous analysis of moisture and ash content as well as the correlation improvements are shown.

  6. Scientific Opinion on the use of animal-based measures to assess welfare in pigs

    DEFF Research Database (Denmark)

    Broom, D.; Doherr, M.G.; Edwards, S.

    2013-01-01

    Animal-based measures, identified on the basis of scientific evidence, can be used effectively in the evaluation of the welfare of on-farm pigs in relation to laws, codes of practice, quality assurance schemes and management. Some of these measures are also appropriate for ante-mortem inspection ...

  7. Machine-Learning Identification of Airborne UAV-UEs Based on LTE Radio Measurements

    DEFF Research Database (Denmark)

    Amorim, Rafhael Medeiros de; Wigard, Jeroen; Nguyen, Huan Cong

    2017-01-01

    , which use standard LTE measurements from the UE as input, for detecting the presence of airborne users in the network. The algorithms are evaluated based on measurements done with mobile phones attached under a flying drone and on a car. Results are discussed showing the advantages and drawbacks...

  8. Measuring Education Inequalities: Concentration and Dispersion-Based Approach. Lessons from Kuznets Curve in MENA Region

    Science.gov (United States)

    Ibourk, Aomar; Amaghouss, Jabrane

    2012-01-01

    Although the quantity of education is widely used to measure the economical and social performances of educative systems, only a few works have addressed the issue of equity in education. In this work, we have calculated two measures of inequality in education based on Barro and Lee's (2010) data: the Gini index of education and the standard…

  9. Method for 3D profilometry measurement based on contouring moire fringe

    Science.gov (United States)

    Shi, Zhiwei; Lin, Juhua

    2007-12-01

    3D shape measurement is one of the most active branches of optical research in recent years. A method of 3D profilometry measurement combining the moire projection method and phase-shifting technology under SCM (Single Chip Microcomputer) control is presented in this paper. Automatic measurement of 3D surface profiles can be carried out by applying this method with high speed and high precision.
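
    Phase-shifting moire profilometry of the kind described above typically recovers a wrapped phase map from several fringe images shifted by a known amount. The following is a minimal sketch of the standard four-step algorithm on synthetic fringes; the height-conversion constant, which depends on the projection geometry, is deliberately left out, and all data are simulated.

        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            """Wrapped phase map from four fringe images shifted by 90 degrees each:
            phi = atan2(I4 - I2, I1 - I3)."""
            return np.arctan2(i4 - i2, i1 - i3)

        # Synthetic 64x64 fringe images of a simple surface (illustrative only)
        x = np.linspace(0, 4 * np.pi, 64)
        true_phase = np.outer(np.sin(np.linspace(0, np.pi, 64)), np.ones(64)) + x
        frames = [1 + np.cos(true_phase + k * np.pi / 2) for k in range(4)]

        wrapped = four_step_phase(*frames)
        unwrapped = np.unwrap(np.unwrap(wrapped, axis=0), axis=1)
        # Height is proportional to the unwrapped phase; the constant depends on the
        # projection geometry (projector-camera distance, grating period, etc.).
        print(unwrapped.shape, unwrapped.min(), unwrapped.max())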

  10. Measurement Bases for Acquisitions and Mergers in Financial Accounting and in Commercial Law

    OpenAIRE

    Vomáčková, Hana

    2011-01-01

    In association with transactions involving businesses, acquisitions and mergers, etc., commercial law stipulates the new measurement of business assets and thus also net business assets. Similarly, financial accounting stipulates the new measurement of assets, liabilities and net assets with an impact on the amount and structure of equity. It is a principal question as to whether the new measurement bases required by both commercial law and financial accounting are in principal identical. Pra...

  11. Radiologic total lung capacity measurement. Development and evaluation of a computer-based system

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, G.W.; Mazzeo, J.; Borgstrom, M.; Hunter, T.B.; Newell, J.D.; Bjelland, J.C.

    1986-11-01

    The development of a computer-based radiologic total lung capacity (TLC) measurement system designed to be used by non-physician personnel is detailed. Four operators tested the reliability and validity of the system by measuring inspiratory PA and lateral pediatric chest radiographs with a Graf spark pen interfaced to a DEC VAX 11/780 computer. First results suggest that the ultimate goal of developing an accurate and easy to use TLC measurement system for non-physician personnel is attainable.

  12. Enhancement of Edge-based Image Quality Measures Using Entropy for Histogram Equalization-based Contrast Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    H. T. R. Kurmasha

    2017-12-01

    Full Text Available An edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, which are the most commonly used IQMs to evaluate histogram equalization based techniques, as well as two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and the Information Fidelity Criterion-based (IFC) measure. The statistical evaluation results show that the edge-based IQM, which was designed for detecting noise artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86, while the others have poor or fair correlation to human opinion, considering Human Visual Perception (HVP). Based on HVP, this paper proposes an enhancement to the classic edge-based IQM by taking into account brightness saturation distortion, which is the most prominent distortion in HE-based contrast enhancement techniques. It is tested and found to have a significantly good correlation (PCC > 0.87, Spearman rank order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%.
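
    The baseline metrics this record compares against, AMBE and entropy, are straightforward to compute. The following is a small sketch of both on a synthetic greyscale image; the contrast "enhancement" here is a crude range stretch standing in for histogram equalization, and the proposed edge-based IQM itself is not reproduced.

        import numpy as np

        def ambe(original, enhanced):
            """Absolute Mean Brightness Error between two greyscale images."""
            return abs(float(np.mean(original)) - float(np.mean(enhanced)))

        def entropy(image, levels=256):
            """Shannon entropy (bits) of the grey-level histogram."""
            hist, _ = np.histogram(image, bins=levels, range=(0, levels))
            p = hist / hist.sum()
            p = p[p > 0]
            return float(-np.sum(p * np.log2(p)))

        rng = np.random.default_rng(0)
        original = rng.integers(60, 180, size=(128, 128))
        # A crude global "enhancement": stretch the grey levels to the full range
        enhanced = ((original - original.min()) * 255.0 / (original.max() - original.min())).astype(np.uint8)

        print("AMBE:", ambe(original, enhanced))
        print("entropy before/after:", entropy(original), entropy(enhanced))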

  13. 40 CFR 761.298 - Decisions based on PCB concentration measurements resulting from sampling.

    Science.gov (United States)

    2010-07-01

    40 CFR, Protection of Environment — Cleanup and On-Site Disposal of Bulk PCB Remediation Waste and Porous Surfaces in Accordance With § 761.61(a)(6). § 761.298 Decisions based on PCB concentration measurements resulting from sampling. (a) For...

  14. A laser interferometer for measuring straightness and its position based on heterodyne interferometry

    International Nuclear Information System (INIS)

    Chen Benyong; Zhang Enzheng; Yan Liping; Li Chaorong; Tang Wuhua; Feng Qibo

    2009-01-01

    Not only the magnitude but also the position of straightness errors are of concern to users. However, current laser interferometers used for measuring straightness seldom give the relative position of the straightness error. To solve this problem, a laser interferometer for measuring straightness and its position based on heterodyne interferometry is proposed. The optical configuration of the interferometer is designed and the measurement principle is analyzed theoretically. Two experiments were carried out. The first experiment verifies the validity and repeatability of the interferometer by measuring a linear stage. Also, the second one for measuring a flexure-hinge stage demonstrates that the interferometer is capable of nanometer measurement accuracy. These results show that this interferometer has advantages of simultaneously measuring straightness error and the relative position with high precision, and a compact structure.

  15. Multi-beam synchronous measurement based on PSD phase detection using frequency-domain multiplexing

    Science.gov (United States)

    Duan, Ying; Qin, Lan; Xue, Lian; Xi, Feng; Mao, Jiubing

    2013-10-01

    Operating on the principle of centroid measurement, position-sensitive detectors (PSDs) are commonly used for micro-displacement detection. However, single-beam detection cannot handle tasks such as multi-dimensional position measurement, three-dimensional vision reconstruction, and precision robot positioning, which require synchronous measurement of multiple light beams. Consequently, we designed a PSD phase detection method using frequency-domain multiplexing (FDM) for synchronous detection of multiple modulated light beams. Compared with the earlier PSD amplitude detection method, the FDM phase detection method offers a simpler measuring system, lower cost, stronger resistance to stray-light interference, and improved resolution. The feasibility of multi-beam synchronous measurement based on PSD phase detection using FDM was validated by multi-beam measurement experiments; the maximum non-linearity error of the multi-beam synchronous measurement is 6.62%.
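
    To illustrate the frequency-domain multiplexing idea, the sketch below separates several beams, each intensity-modulated at its own carrier frequency, from the summed electrode currents of a one-dimensional PSD by reading the FFT bins at the known carriers. The carrier frequencies, sampling rate, and two-electrode current model are illustrative assumptions, and the per-carrier readout here uses bin amplitudes, whereas the paper's method reads out phase:

```python
import numpy as np

fs = 100_000.0                               # sampling rate [Hz] (assumed)
t = np.arange(0, 0.1, 1.0 / fs)              # 0.1 s record
carriers = [5_000.0, 8_000.0, 12_000.0]      # one modulation frequency per beam (assumed)
positions = [0.2, -0.4, 0.1]                 # true normalized spot positions in [-1, 1]

# For a 1-D PSD the two electrode currents split with spot position x:
# i1 is proportional to (1 + x)/2 and i2 to (1 - x)/2; each beam carries its own modulation.
i1 = sum((1 + x) / 2 * (1 + np.cos(2 * np.pi * f * t)) for f, x in zip(carriers, positions))
i2 = sum((1 - x) / 2 * (1 + np.cos(2 * np.pi * f * t)) for f, x in zip(carriers, positions))

def amplitude_at(signal, f):
    """Magnitude of the FFT bin closest to carrier frequency f."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    return np.abs(spectrum[np.argmin(np.abs(freqs - f))])

for f, x_true in zip(carriers, positions):
    a1, a2 = amplitude_at(i1, f), amplitude_at(i2, f)
    x_est = (a1 - a2) / (a1 + a2)            # centroid formula, evaluated per carrier
    print(f"beam at {f / 1000:.0f} kHz: true x = {x_true:+.2f}, estimated x = {x_est:+.3f}")
```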

  16. Nonstructural urban stormwater quality measures: building a knowledge base to improve their use.

    Science.gov (United States)

    Taylor, André C; Fletcher, Tim D

    2007-05-01

    This article summarizes a research project that investigated the use, performance, cost, and evaluation of nonstructural measures to improve urban stormwater quality. A survey of urban stormwater managers from Australia, New Zealand, and the United States revealed a widespread trend of increasing use of nonstructural measures among leading stormwater management agencies, with at least 76% of 41 types of nonstructural measures being found to be increasing in use. Data gathered from the survey, an international literature review, and a multicriteria analysis highlighted four nonstructural measures of greatest potential value: mandatory town planning controls that promote the adoption of low-impact development principles and techniques; development of strategic urban stormwater management plans for a city, shire, or catchment; stormwater management measures and programs for construction/building sites; and stormwater management activities related to municipal maintenance operations such as maintenance of the stormwater drainage network and manual litter collections. Knowledge gained on the use and performance of nonstructural measures from the survey, literature review, and three trial evaluation projects was used to develop tailored monitoring and evaluation guidelines for these types of measure. These guidelines incorporate a new evaluation framework based on seven alternative styles of evaluation that range from simply monitoring whether a nonstructural measure has been fully implemented to monitoring its impact on waterway health. This research helps to build the stormwater management industry's knowledge base concerning nonstructural measures and provides a practical tool to address common impediments associated with monitoring and evaluating the performance and cost of these measures.

  17. Parallelism measurement for base plate of standard artifact with multiple tactile approaches

    Science.gov (United States)

    Ye, Xiuling; Zhao, Yan; Wang, Yiwen; Wang, Zhong; Fu, Luhua; Liu, Changjie

    2018-01-01

    As workpieces become more precise and more specialized, resulting in more sophisticated structures and higher-accuracy artifacts, the demands on measuring accuracy and measuring methods increase accordingly. The coordinate measuring machine (CMM), an important means of obtaining workpiece dimensions, is widely used in many industries. During the calibration of a self-developed CMM with a self-made high-precision standard artifact, the parallelism of the base plate used for fixing the artifact was found to be an important factor affecting measurement accuracy. To measure the parallelism of the base plate, three tactile methods were employed, using an existing high-precision CMM, gauge blocks, a dial gauge, and a marble platform, and their results were compared. The experiments show that all three methods reach micron-level accuracy and meet the measurement requirements. Moreover, the three approaches suit different measurement conditions, providing a basis for rapid, high-precision measurement under different equipment conditions.

  18. Measurements and modelling of base station power consumption under real traffic loads.

    Science.gov (United States)

    Lorincz, Josip; Garma, Tonko; Petrovic, Goran

    2012-01-01

    Base stations represent the main contributor to the energy consumption of a mobile cellular network. Since traffic load in mobile networks significantly varies during a working or weekend day, it is important to quantify the influence of these variations on the base station power consumption. Therefore, this paper investigates changes in the instantaneous power consumption of GSM (Global System for Mobile Communications) and UMTS (Universal Mobile Telecommunications System) base stations according to their respective traffic load. The real data in terms of the power consumption and traffic load have been obtained from continuous measurements performed on a fully operated base station site. Measurements show the existence of a direct relationship between base station traffic load and power consumption. According to this relationship, we develop a linear power consumption model for base stations of both technologies. This paper also gives an overview of the most important concepts which are being proposed to make cellular networks more energy-efficient.
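
    The direct relationship reported above can be captured by a least-squares fit of a linear model P(load) = P0 + k * load to the measured pairs of traffic load and instantaneous power. A minimal sketch with hypothetical values (the numbers are placeholders, not the measured data from the paper):

```python
import numpy as np

# Hypothetical measurement pairs: normalized traffic load and instantaneous power [W]
load = np.array([0.05, 0.15, 0.30, 0.45, 0.60, 0.75, 0.90])
power = np.array([812.0, 824.0, 841.0, 858.0, 874.0, 893.0, 908.0])

# Linear power consumption model P(load) = P0 + k * load, fitted by least squares
k, p0 = np.polyfit(load, power, deg=1)
print(f"idle (zero-load) power P0 = {p0:.1f} W, load-dependent slope k = {k:.1f} W")
print(f"predicted power at 50% load: {p0 + k * 0.5:.1f} W")
```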

  19. Measurements and Modelling of Base Station Power Consumption under Real Traffic Loads

    Directory of Open Access Journals (Sweden)

    Goran Petrovic

    2012-03-01

    Full Text Available Base stations represent the main contributor to the energy consumption of a mobile cellular network. Since traffic load in mobile networks significantly varies during a working or weekend day, it is important to quantify the influence of these variations on the base station power consumption. Therefore, this paper investigates changes in the instantaneous power consumption of GSM (Global System for Mobile Communications) and UMTS (Universal Mobile Telecommunications System) base stations according to their respective traffic load. The real data in terms of the power consumption and traffic load have been obtained from continuous measurements performed on a fully operated base station site. Measurements show the existence of a direct relationship between base station traffic load and power consumption. According to this relationship, we develop a linear power consumption model for base stations of both technologies. This paper also gives an overview of the most important concepts which are being proposed to make cellular networks more energy-efficient.

  20. Comparison of diffusion charging and mobility-based methods for measurement of aerosol agglomerate surface area.

    Science.gov (United States)

    Ku, Bon Ki; Kulkarni, Pramod

    2012-05-01

    We compare different approaches to measuring the surface area of aerosol agglomerates. The objective was to compare field methods, such as mobility- and diffusion-charging-based approaches, with a laboratory approach, the Brunauer-Emmett-Teller (BET) method used for bulk powder samples. To allow intercomparison of the various surface area measurements, we defined the 'geometric surface area' of agglomerates (assuming agglomerates are made up of ideal spheres) and compared the various surface area measurements to it. Four different approaches for measuring the surface area of agglomerate particles in the size range of 60-350 nm were compared: (i) diffusion charging-based sensors from three different manufacturers, (ii) the mobility diameter of an agglomerate, (iii) the mobility diameter of an agglomerate assuming a linear chain morphology with uniform primary particle size, and (iv) surface area estimation based on tandem mobility-mass measurement and microscopy. Our results indicate that the tandem mobility-mass measurement, which unlike the BET method can be applied directly to airborne particles, agrees well with the BET method. The three diffusion charging-based surface area measurements of silver agglomerates were similar within a factor of 2 and were lower than those obtained from the tandem mobility-mass and microscopy method by a factor of 3-10 in the size range studied. Surface area estimated using the mobility diameter depended on the structure or morphology of the agglomerate, with significant underestimation at high fractal dimensions approaching 3.
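
    As a small illustration of the 'geometric surface area' convention used above, the sketch below computes the surface area of an agglomerate treated as N ideal primary spheres and compares it with the surface area of a single sphere having the agglomerate's mobility diameter; the particle counts and diameters are hypothetical, not values from the study:

```python
import numpy as np

def geometric_surface_area(n_primary, d_primary_nm):
    """Surface area [nm^2] of an agglomerate modeled as n ideal spheres of diameter d."""
    return n_primary * np.pi * d_primary_nm ** 2

def mobility_sphere_area(d_mobility_nm):
    """Surface area [nm^2] of a single sphere with the agglomerate's mobility diameter."""
    return np.pi * d_mobility_nm ** 2

# Hypothetical agglomerate: 50 primary particles of 20 nm, mobility diameter 150 nm
sa_geom = geometric_surface_area(50, 20.0)
sa_mob = mobility_sphere_area(150.0)
print(f"geometric surface area: {sa_geom:.0f} nm^2")
print(f"mobility-equivalent sphere: {sa_mob:.0f} nm^2 ({sa_mob / sa_geom:.2f}x geometric)")
```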

  1. A circular feature-based pose measurement method for metal part grasping

    International Nuclear Information System (INIS)

    Wu, Chenrui; He, Zaixing; Zhang, Shuyou; Zhao, Xinyue

    2017-01-01

    The grasping of circular metal parts such as bearings and flanges is a common task in industry. Limited by low texture and repeated features, the point-feature-based method is not applicable in pose measurement of these parts. In this paper, we propose a novel pose measurement method for grasping circular metal parts. This method is based on cone degradation and involves a monocular camera. To achieve higher measurement accuracy, a position-based visual servoing method is presented to continuously control an eye-in-hand, six-degrees-of-freedom robot arm to grasp the part. The uncertainty of the part’s coordinate frame during the control process is solved by defining a fixed virtual coordinate frame. Experimental results are provided to illustrate the effectiveness of the proposed method and the factors that affect measurement accuracy are analyzed. (paper)

  2. The Economic Base of North Dakota: A Measure of the State’s Economy in 2012

    OpenAIRE

    Coon, Randal C.; Bangsund, Dean A.; Hodur, Nancy M.

    2014-01-01

    The growth and composition of the North Dakota economy can be measured using economic base analysis. Economic base is defined as the value of goods and services exported from an economic unit. Economic base also can be called a region’s export base because industries (or ‘basic’ sectors) earn income from outside the area. North Dakota’s economic base is comprised of those activities that produce a product or a service purchased by individuals, governments, and businesses located outside of th...

  3. The Research of Screw Thread Parameter Measurement Based on Position Sensitive Detector and Laser

    International Nuclear Information System (INIS)

    Tong, Q B; Ding, Z L; Chen, J C; Ai, L L; Yuan, F

    2006-01-01

    A technique and system for measuring screw thread parameters based on laser measurement is presented in this paper, enabling automated measurement of screw thread parameters. An inspection instrument was designed and built, comprising an external optical-path imaging system, a transverse displacement measurement system, an axial displacement measurement system, and a host-side module for data processing, control, and assessment. The screw thread contour curve is inspected and evaluated by using a position-sensitive device (PSD) as the photoelectric detector to measure the coordinates of the contour curve in the transverse section, and by using a precision grating to measure the axial displacement of the precision worktable according to the screw thread test criterion; the computer then derives the measured result from the thread coordinate data obtained by the PSD. The relation between the measured spot and its image is established, the optimized design of the system is introduced, including the focal length of the receiving-lens optical system and the choice of PSD, and the main factors affecting measuring precision are analyzed. The experimental results show that the measurement uncertainty of the screw thread minor diameter can reach 0.5 μm, which meets most requirements for screw thread parameter measurement

  4. Overview of Boundary Layer Clouds Using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Xi, B.; Dong, X.; Wu, P.; Qiu, S.

    2017-12-01

    A comprehensive summary of boundary layer cloud properties based on several of our recent studies will be presented. The analyses include global cloud fractions and cloud macro-/micro-physical properties from satellite measurements using both CERES-MODIS and CloudSat/CALIPSO data products; the annual/seasonal/diurnal variations of stratocumulus clouds over different climate regions (mid-latitude land, mid-latitude ocean, and the Arctic) using DOE ARM ground-based measurements at the Southern Great Plains (SGP), Azores (GRW), and North Slope of Alaska (NSA) sites; the impact of environmental conditions on the formation and dissipation of marine boundary layer clouds over the Azores site; and the characterization of Arctic mixed-phase cloud structure and of the environmental conditions favorable for the formation and maintenance of mixed-phase clouds over the NSA site. Although the presentation spans a wide range of topics, we will focus on the representativeness of the ground-based measurements over different climate regions, the evaluation of satellite-retrieved cloud properties using these ground-based measurements, and the understanding of the uncertainties of both satellite and ground-based retrievals and measurements.

  5. Determination of delayed neutrons source in the frequency domain based on in-pile oscillation measurements

    International Nuclear Information System (INIS)

    Yedvab, Y.; Reiss, I.; Bettan, M.; Harari, R.; Grober, A.; Ettedgui, H.; Caspi, E. N.

    2006-01-01

    A method for determining delayed neutrons source in the frequency domain based on measuring power oscillations in a non-critical reactor is presented. This method is unique in the sense that the delayed neutrons source is derived from the dynamic behavior of the reactor, which serves as the measurement system. An algorithm for analyzing power oscillation measurements was formulated, which avoids the need for a multi-parameter non-linear fit process used by other methods. Using this algorithm results of two sets of measurements performed in IRR-I and IRR-II (Israeli Research Reactors I and II) are presented. The agreement between measured values from both reactors and calculated values based on Keepin (and JENDL-3.3) group parameters is very good. (authors)

  6. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure utilizes a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one in which the cdf is given by a known analytical distribution and the other in which it is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases, and the results are compared with those of three existing methods. The present approach provides a useful cdf-based measure of uncertainty importance; it is simple, and uncertainty importance can be calculated without any complex process. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance
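
    As a rough illustration of a cdf-based importance measure of this kind, the sketch below scores an input by the area between the output cdf obtained with full input uncertainty and the output cdf obtained when that input is pinned at its median. This is a generic distributional-sensitivity sketch with an assumed toy model and input distributions, not the specific metric distance defined in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x1, x2):
    """Toy model whose output uncertainty is attributed to its inputs."""
    return x1 ** 2 + 0.3 * x2

def empirical_cdf(samples, grid):
    return np.searchsorted(np.sort(samples), grid, side="right") / len(samples)

def cdf_distance_importance(n=50_000):
    x1 = rng.normal(1.0, 0.5, n)
    x2 = rng.normal(0.0, 1.0, n)
    base = model(x1, x2)
    grid = np.linspace(base.min(), base.max(), 512)
    f_base = empirical_cdf(base, grid)

    importance = {}
    for name, pinned in [("x1", model(np.median(x1), x2)),
                         ("x2", model(x1, np.median(x2)))]:
        f_pinned = empirical_cdf(pinned, grid)
        # Area between the two cdfs: the larger the shift caused by pinning
        # the input, the more important that input is for output uncertainty.
        importance[name] = np.trapz(np.abs(f_base - f_pinned), grid)
    return importance

print(cdf_distance_importance())
```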

  7. Optimization of Power Consumption for Centrifugation Process Based on Attenuation Measurements

    Science.gov (United States)

    Salim, M. S.; Abd Malek, M. F.; Sabri, Naseer; Omar, M. Iqbal bin; Mohamed, Latifah; Juni, K. M.

    2013-04-01

    The main objective of this research is to produce a mathematical model that allows the electrical power consumption of the centrifugation process to be decreased based on attenuation measurements. The centrifugation time for a desired separation efficiency may be measured to determine the power consumed by a laboratory centrifuge. Power consumption is one of several parameters that affect system reliability and productivity. Attenuation measurements of the wave propagated through the blood sample during centrifugation were used to measure the power consumption of the device indirectly. A mathematical model for power consumption was derived and used to modify the speed profile of the centrifuge controller. The power consumption model derived from the attenuation measurements successfully reduced the power consumption of the centrifugation process while keeping the separation efficiency high; 18 kW·h per month was saved for 100 daily device operations using the proposed model.

  8. Optimization of Power Consumption for Centrifugation Process Based on Attenuation Measurements

    International Nuclear Information System (INIS)

    Salim, M S; Iqbal bin Omar, M; Malek, M F Abd; Mohamed, Latifah; Sabri, Naseer; Juni, K M

    2013-01-01

    The main objective of this research is to produce a mathematical model that allows the electrical power consumption of the centrifugation process to be decreased based on attenuation measurements. The centrifugation time for a desired separation efficiency may be measured to determine the power consumed by a laboratory centrifuge. Power consumption is one of several parameters that affect system reliability and productivity. Attenuation measurements of the wave propagated through the blood sample during centrifugation were used to measure the power consumption of the device indirectly. A mathematical model for power consumption was derived and used to modify the speed profile of the centrifuge controller. The power consumption model derived from the attenuation measurements successfully reduced the power consumption of the centrifugation process while keeping the separation efficiency high; 18 kW·h per month was saved for 100 daily device operations using the proposed model.

  9. Self-recalibration of a robot-assisted structured-light-based measurement system.

    Science.gov (United States)

    Xu, Jing; Chen, Rui; Liu, Shuntao; Guan, Yong

    2017-11-10

    The structured-light-based measurement method is widely employed in numerous fields. However, for industrial inspection, to achieve complete scanning of a work piece and overcome occlusion, the measurement system needs to be moved to different viewpoints. Moreover, frequent reconfiguration of the measurement system may be needed based on the size of the measured object, making the self-recalibration of extrinsic parameters indispensable. To this end, this paper proposes an automatic self-recalibration and reconstruction method, wherein a robot arm is employed to move the measurement system for complete scanning; the self-recalibration is achieved using fundamental matrix calculations and point cloud registration without the need for an accurate calibration gauge. Experimental results demonstrate the feasibility and accuracy of our method.

  10. Validation of a photography-based goniometry method for measuring joint range of motion.

    Science.gov (United States)

    Blonna, Davide; Zarkadas, Peter C; Fitzsimmons, James S; O'Driscoll, Shawn W

    2012-01-01

    A critical component of evaluating the outcomes after surgery to restore lost elbow motion is the range of motion (ROM) of the elbow. This study examined if digital photography-based goniometry is as accurate and reliable as clinical goniometry for measuring elbow ROM. Instrument validity and reliability for photography-based goniometry were evaluated for a consecutive series of 50 elbow contractures by 4 observers with different levels of elbow experience. Goniometric ROM measurements were taken with the elbows in full extension and full flexion directly in the clinic (once) and from digital photographs (twice in a blinded random manner). Instrument validity for photography-based goniometry was extremely high (intraclass correlation coefficient: extension = 0.98, flexion = 0.96). For extension and flexion measurements by the expert surgeon, systematic error was negligible (0° and 1°, respectively). Limits of agreement were 7° (95% confidence interval [CI], 5° to 9°) and -7° (95% CI, -5° to -9°) for extension and 8° (95% CI, 6° to 10°) and -7° (95% CI, -5° to -9°) for flexion. Interobserver reliability for photography-based goniometry was better than that for clinical goniometry. The least experienced observer's photographic goniometry measurements were closer to the reference measurements than the clinical goniometry measurements. Photography-based goniometry is accurate and reliable for measuring elbow ROM. The photography-based method relied less on observer expertise than clinical goniometry. This validates an objective measure of patient outcome without requiring doctor-patient contact at a tertiary care center, where most contracture surgeries are done. Copyright © 2012 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
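
    The limits of agreement quoted above are, under the usual Bland-Altman convention, the mean difference between the two methods plus or minus 1.96 standard deviations of the differences. A minimal sketch under that assumption, with hypothetical paired readings in degrees:

```python
import numpy as np

def limits_of_agreement(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement between two measurement methods."""
    diffs = np.asarray(method_a, dtype=float) - np.asarray(method_b, dtype=float)
    bias = diffs.mean()
    spread = 1.96 * diffs.std(ddof=1)
    return bias, bias - spread, bias + spread

# Hypothetical paired elbow extension readings: photographic vs clinical goniometry
photo = [32, 18, 45, 27, 5, 60, 38, 22]
clinic = [30, 20, 44, 29, 7, 57, 40, 21]
bias, lower, upper = limits_of_agreement(photo, clinic)
print(f"bias = {bias:+.1f} deg, limits of agreement = [{lower:+.1f}, {upper:+.1f}] deg")
```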

  11. Rapid bead-based immunoassay for measurement of mannose-binding lectin

    DEFF Research Database (Denmark)

    Bay, J T; Garred, P

    2009-01-01

    ... have been developed, more automated platforms for MBL analysis are urgently needed. To pursue this, we set out to develop a flexible bead-based MBL immunoassay. Serum was obtained from 98 healthy individuals and 50 patients investigated for possible immunodeficiencies. We used the Luminex xMAP bead array ... coefficients were found to be 7.88% and 5.70%, respectively. A close correlation between the new assay and a reference MBL measurement ELISA was found (rho 0.9381, P ...). The bead-based assay was less sensitive to interfering anti-murine antibodies in the blood samples than when the antibodies employed were ... used in the reference polystyrene-based ELISA. The new assay could be performed in 3 h with less than 25 microl of serum required for each sample. These results show that MBL can be measured readily using a bead-based platform, which may form an efficient basis for a multiplex approach to measure different...

  12. Exposure estimates based on broadband elf magnetic field measurements versus the ICNIRP multiple frequency rule

    International Nuclear Information System (INIS)

    Paniagua, Jesus M.; Rufo, Montana; Jimenez, Antonio; Pachon, Fernando T.; Carrero, Julian

    2015-01-01

    The evaluation of exposure to extremely low-frequency (ELF) magnetic fields using broadband measurement techniques gives satisfactory results when the field has essentially a single frequency. Nevertheless, magnetic fields are in most cases distorted by harmonic components. This work analyses the harmonic components of the ELF magnetic field in an outdoor urban context and compares the evaluation of the exposure based on broadband measurements with that based on spectral analysis. The multiple frequency rule of the International Commission on Non-ionizing Radiation Protection (ICNIRP) regulatory guidelines was applied. With the 1998 ICNIRP guideline, harmonics dominated the exposure with a 55 % contribution. With the 2010 ICNIRP guideline, however, the primary frequency dominated the exposure with a 78 % contribution. Values of the exposure based on spectral analysis were significantly higher than those based on broadband measurements. Hence, it is clearly necessary to determine the harmonic components of the ELF magnetic field to assess exposure in urban contexts. (authors)

  13. Towards the XML schema measurement based on mapping between XML and OO domain

    Science.gov (United States)

    Rakić, Gordana; Budimac, Zoran; Heričko, Marjan; Pušnik, Maja

    2017-07-01

    Measuring the quality of IT solutions is a priority in software engineering. Although numerous metrics for measuring object-oriented code already exist, measuring the quality of UML models or XML Schemas is still developing. One of the research questions in the overall research guided by the ideas described in this paper is whether already defined object-oriented design metrics can be applied to XML schemas on the basis of predefined mappings. In this paper, the basic ideas for such a mapping are presented. This mapping is a prerequisite for a future approach to measuring XML schema quality with object-oriented metrics.

  14. Comparison of co-expression measures: mutual information, correlation, and model based indices.

    Science.gov (United States)

    Song, Lin; Langfelder, Peter; Horvath, Steve

    2012-12-09

    Co-expression measures are often used to define networks among genes. Mutual information (MI) is often used as a generalized correlation measure. It is not clear how much MI adds beyond standard (robust) correlation measures or regression model based association measures. Further, it is important to assess what transformations of these and other co-expression measures lead to biologically meaningful modules (clusters of genes). We provide a comprehensive comparison between mutual information and several correlation measures in 8 empirical data sets and in simulations. We also study different approaches for transforming an adjacency matrix, e.g. using the topological overlap measure. Overall, we confirm close relationships between MI and correlation in all data sets which reflects the fact that most gene pairs satisfy linear or monotonic relationships. We discuss rare situations when the two measures disagree. We also compare correlation and MI based approaches when it comes to defining co-expression network modules. We show that a robust measure of correlation (the biweight midcorrelation transformed via the topological overlap transformation) leads to modules that are superior to MI based modules and maximal information coefficient (MIC) based modules in terms of gene ontology enrichment. We present a function that relates correlation to mutual information which can be used to approximate the mutual information from the corresponding correlation coefficient. We propose the use of polynomial or spline regression models as an alternative to MI for capturing non-linear relationships between quantitative variables. The biweight midcorrelation outperforms MI in terms of elucidating gene pairwise relationships. Coupled with the topological overlap matrix transformation, it often leads to more significantly enriched co-expression modules. Spline and polynomial networks form attractive alternatives to MI in case of non-linear relationships. Our results indicate that MI
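
    As a sketch of the adjacency transformations discussed above, the code below turns a correlation-based adjacency matrix into a topological overlap matrix (TOM). The unsigned TOM formula used here is my reading of the standard WGCNA definition, the soft-thresholding power is an arbitrary choice, and plain Pearson correlation stands in for the biweight midcorrelation for brevity:

```python
import numpy as np

def adjacency_from_correlation(expr, power=6):
    """Unsigned weighted adjacency: |correlation| raised to a soft-thresholding power."""
    corr = np.corrcoef(expr)            # genes x genes correlation (rows are genes)
    adj = np.abs(corr) ** power
    np.fill_diagonal(adj, 0.0)
    return adj

def topological_overlap(adj):
    """Unsigned topological overlap matrix for a weighted adjacency matrix."""
    shared = adj @ adj                  # l_ij = sum_k a_ik * a_kj
    k = adj.sum(axis=1)                 # node connectivities
    min_k = np.minimum.outer(k, k)
    tom = (shared + adj) / (min_k + 1.0 - adj)
    np.fill_diagonal(tom, 1.0)
    return tom

# Toy expression matrix: 5 genes measured in 20 samples (random placeholder data)
rng = np.random.default_rng(1)
expression = rng.normal(size=(5, 20))
print(np.round(topological_overlap(adjacency_from_correlation(expression)), 3))
```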

  15. Validity of Cognitive Load Measures in Simulation-Based Training: A Systematic Review.

    Science.gov (United States)

    Naismith, Laura M; Cavalcanti, Rodrigo B

    2015-11-01

    Cognitive load theory (CLT) provides a rich framework to inform instructional design. Despite the applicability of CLT to simulation-based medical training, findings from multimedia learning have not been consistently replicated in this context. This lack of transferability may be related to issues in measuring cognitive load (CL) during simulation. The authors conducted a review of CLT studies across simulation training contexts to assess the validity evidence for different CL measures. PRISMA standards were followed. For 48 studies selected from a search of MEDLINE, EMBASE, PsycInfo, CINAHL, and ERIC databases, information was extracted about study aims, methods, validity evidence of measures, and findings. Studies were categorized on the basis of findings and prevalence of validity evidence collected, and statistical comparisons between measurement types and research domains were pursued. CL during simulation training has been measured in diverse populations including medical trainees, pilots, and university students. Most studies (71%; 34) used self-report measures; others included secondary task performance, physiological indices, and observer ratings. Correlations between CL and learning varied from positive to negative. Overall validity evidence for CL measures was low (mean score 1.55/5). Studies reporting greater validity evidence were more likely to report that high CL impaired learning. The authors found evidence that inconsistent correlations between CL and learning may be related to issues of validity in CL measures. Further research would benefit from rigorous documentation of validity and from triangulating measures of CL. This can better inform CLT instructional design for simulation-based medical training.

  16. Consequences of Market-Based Measures CO2-emission Reduction Maritime Transport for the Netherlands; Gevolgen Market Based Measures CO2-emissiereductie zeevaart voor Nederland

    Energy Technology Data Exchange (ETDEWEB)

    Wortelboer-van Donselaar, P.; Kansen, M.; Moorman, S. [Kennisinstituut voor Mobiliteitsbeleid KiM, Den Haag (Netherlands); Faber, J.; Koopman, M.; Smit, M. [CE Delft, Delft (Netherlands)

    2013-11-15

    The introduction of Market Based Measures (MBMs) to reduce the CO2 emissions of international sea shipping will have relatively limited economic effects for the Netherlands. Moreover, these effects are largely in line with those in other countries. For the Netherlands, however, the manner in which MBMs are organised and enforced is likely to be particularly important, given the importance of ports to the Dutch economy, the country's relatively large bunker sector, and the fact that Dutch shipowners operate relatively small vessels and on a relatively small scale. MBMs include pricing measures in the form of tax or trade systems, as well as other market-related proposals. In this research study, the consequences of four international MBM proposals for the Netherlands are analysed. [Translated from Dutch] To reduce the CO2 emissions of the international shipping sector, so-called Market Based Measures (MBMs), such as auctioning emission allowances or introducing a levy, are currently under consideration. The introduction of MBMs will have relatively limited economic effects for the Netherlands, and these effects do not differ markedly from those for other countries. The way in which the MBMs are organised and enforced may, however, be of distinctive importance for the Netherlands, given the significance of ports to the Dutch economy, the relatively large bunker sector, and the relatively small vessels and small scale of operation of Dutch shipowners.

  17. Measuring $\

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Jessica Sarah [Univ. of Cambridge (United Kingdom)

    2011-01-01

    The MINOS Experiment consists of two steel-scintillator calorimeters sampling the long-baseline NuMI muon neutrino beam. It was designed to make a precise measurement of the 'atmospheric' neutrino mixing parameters, Δm²_atm and sin²(2θ_atm). The Near Detector measures the initial spectrum of the neutrino beam 1 km from the production target, and the Far Detector, at a distance of 735 km, measures the impact of oscillations on the neutrino energy spectrum. Work performed to validate the quality of the data collected by the Near Detector is presented as part of this thesis. This thesis primarily details the results of a νμ disappearance analysis, and presents a new, sophisticated fitting software framework, which employs a maximum likelihood method to extract the best-fit oscillation parameters. The software is entirely decoupled from the extrapolation procedure between the detectors and is capable of fitting multiple event samples (defined by the selections applied) in parallel, with any combination of energy-dependent and energy-independent sources of systematic error. Two techniques to improve the sensitivity of the oscillation measurement were also developed. The inclusion of information on the energy resolution of the neutrino events results in a significant improvement in the allowed region for the oscillation parameters. The degree to which sin²(2θ) = 1.0 could be disfavoured with the exposure of the current dataset, if the true mixing angle were non-maximal, was also investigated, using an improved neutrino energy reconstruction for very-low-energy events. The best-fit oscillation parameters obtained by the fitting software, incorporating resolution information, were |Δm²| = 2.32 (+0.12 / -0.08) × 10⁻³ eV² and sin²(2θ) > 0.90 (90% C.L.). The analysis provides the current world-best measurement of the atmospheric neutrino mass

  18. A Performance Measurement-Based Company Officer Management Information System Prototype for the United States Naval Academy

    National Research Council Canada - National Science Library

    Boone, Michael

    1999-01-01

    .... A performance-measurement-based management information system will greatly enhance the company officer's ability to develop, maintain, and use information technology for purposes of performance measurement...

  19. A new accuracy measure based on bounded relative error for time series forecasting.

    Science.gov (United States)

    Chen, Chao; Twycross, Jamie; Garibaldi, Jonathan M

    2017-01-01

    Many accuracy measures have been proposed in the past for time series forecasting comparisons. However, many of these measures suffer from one or more issues such as poor resistance to outliers and scale dependence. In this paper, while summarising commonly used accuracy measures, a special review is made on the symmetric mean absolute percentage error. Moreover, a new accuracy measure called the Unscaled Mean Bounded Relative Absolute Error (UMBRAE), which combines the best features of various alternative measures, is proposed to address the common issues of existing measures. A comparative evaluation on the proposed and related measures has been made with both synthetic and real-world data. The results indicate that the proposed measure, with user selectable benchmark, performs as well as or better than other measures on selected criteria. Though it has been commonly accepted that there is no single best accuracy measure, we suggest that UMBRAE could be a good choice to evaluate forecasting methods, especially for cases where measures based on geometric mean of relative errors, such as the geometric mean relative absolute error, are preferred.
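
    For reference, a minimal sketch of how a bounded-relative-error measure of this kind can be computed is given below. The formulation (bounded relative absolute error against a benchmark forecast, averaged and then unscaled) is my reading of UMBRAE and should be treated as an assumption rather than a verbatim reproduction of the paper; the series and the benchmark are hypothetical:

```python
import numpy as np

def umbrae(actual, forecast, benchmark):
    """Unscaled Mean Bounded Relative Absolute Error (assumed formulation).

    BRAE_t = |e_t| / (|e_t| + |e*_t|), with e_t the forecast error and e*_t the
    benchmark error; MBRAE is its mean and UMBRAE = MBRAE / (1 - MBRAE).
    Values below 1 indicate the forecast beats the benchmark on average.
    """
    e = np.abs(np.asarray(actual, float) - np.asarray(forecast, float))
    e_star = np.abs(np.asarray(actual, float) - np.asarray(benchmark, float))
    brae = e / (e + e_star)             # bounded in [0, 1]
    mbrae = brae.mean()
    return mbrae / (1.0 - mbrae)

# Hypothetical series, forecast, and naive previous-value benchmark
actual = np.array([10.0, 12.0, 11.0, 13.0, 14.0])
forecast = np.array([10.5, 11.5, 11.2, 12.6, 13.8])
benchmark = np.array([9.0, 10.0, 12.0, 11.0, 13.0])
print(f"UMBRAE = {umbrae(actual, forecast, benchmark):.3f}")
```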

  20. Approach for Self-Calibrating CO2 Measurements with Linear Membrane-Based Gas Sensors

    Directory of Open Access Journals (Sweden)

    Detlef Lazik

    2016-11-01

    Full Text Available Linear membrane-based gas sensors that can be advantageously applied for the measurement of a single gas component in large heterogeneous systems, e.g., for representative determination of CO2 in the subsurface, can be designed depending on the properties of the observation object. A resulting disadvantage is that the permeation-based sensor response depends on operating conditions, the individual site-adapted sensor geometry, the membrane material, and the target gas component. Therefore, calibration is needed, especially of the slope, which could change over several orders of magnitude. A calibration-free approach based on an internal gas standard is developed to overcome the multi-criterial slope dependency. This results in a normalization of the sensor response and enables the sensor to assess the significance of a measurement. The approach was demonstrated using the example of CO2 analysis in dry air with tubular PDMS membranes for various CO2 concentrations of an internal standard. Negligible temperature dependency was found within an 18 K range. The transformation behavior of the measurement signal and the influence of concentration variations of the internal standard on the measurement signal were shown. Offsets that were adjusted based on the stated theory for the given measurement conditions and material data from the literature were in agreement with the experimentally determined offsets. A measurement comparison with an NDIR reference sensor shows an unexpectedly low bias (<1% of the non-calibrated sensor response) and a comparable statistical uncertainty.

  1. Approach for Self-Calibrating CO₂ Measurements with Linear Membrane-Based Gas Sensors.

    Science.gov (United States)

    Lazik, Detlef; Sood, Pramit

    2016-11-17

    Linear membrane-based gas sensors that can be advantageously applied for the measurement of a single gas component in large heterogeneous systems, e.g., for representative determination of CO₂ in the subsurface, can be designed depending on the properties of the observation object. A resulting disadvantage is that the permeation-based sensor response depends on operating conditions, the individual site-adapted sensor geometry, the membrane material, and the target gas component. Therefore, calibration is needed, especially of the slope, which could change over several orders of magnitude. A calibration-free approach based on an internal gas standard is developed to overcome the multi-criterial slope dependency. This results in a normalization of the sensor response and enables the sensor to assess the significance of a measurement. The approach was demonstrated using the example of CO₂ analysis in dry air with tubular PDMS membranes for various CO₂ concentrations of an internal standard. Negligible temperature dependency was found within an 18 K range. The transformation behavior of the measurement signal and the influence of concentration variations of the internal standard on the measurement signal were shown. Offsets that were adjusted based on the stated theory for the given measurement conditions and material data from the literature were in agreement with the experimentally determined offsets. A measurement comparison with an NDIR reference sensor shows an unexpectedly low bias (<1% of the non-calibrated sensor response) and a comparable statistical uncertainty.

  2. Comparison of measurement- and proxy-based Vs30 values in California

    Science.gov (United States)

    Yong, Alan K.

    2016-01-01

    This study was prompted by the recent availability of a significant amount of openly accessible measured VS30 values and the desire to investigate the trend of using proxy-based models to predict VS30 in the absence of measurements. Comparisons between measured and model-based values were performed. The measured data included 503 VS30 values collected from various projects for 482 seismographic station sites in California. Six proxy-based models—employing geologic mapping, topographic slope, and terrain classification—were also considered. Included was a new terrain class model based on the Yong et al. (2012) approach but recalibrated with updated measured VS30 values. Using the measured VS30 data as the metric for performance, the predictive capabilities of the six models were determined to be statistically indistinguishable. This study also found three models that tend to underpredict VS30 at lower velocities (NEHRP Site Classes D–E) and overpredict at higher velocities (Site Classes B–C).

  3. The Production Measurement Model of Open Pit Mine Based on Truck Operation Diagram

    Directory of Open Access Journals (Sweden)

    Sun Xiao-Yu

    2016-01-01

    Full Text Available Conventional production measurement in truck dispatching systems for open pit mines has not been effectively expressed by a mathematical model, which hampers subsequent data mining and creates a compatibility issue when the production measurement is applied with fixed truck assignment. In this study, based on the concept that a truck is not only the carrier of transported material but also the bridge and linkage between loading and unloading sites, a new truck operation diagram was established and further developed into a basic data matrix and a production measurement model. The new model allows the production measurement of transport, loading, unloading, material, etc. to be calculated separately, as well as any calculation combining more than one factor as needed. It resolves the compatibility issue between conventional production measurement and production measurement with fixed truck assignment, with good practical results.

  4. The error model and experiment of measuring angular position error based on laser collimation

    Science.gov (United States)

    Cai, Yangyang; Yang, Jing; Li, Jiakun; Feng, Qibo

    2018-01-01

    The rotary axis is the reference component of rotational motion. Among the six degree-of-freedom (DOF) geometric errors of a rotary axis, angular position error is the most critical factor impairing machining precision. In this paper, a method for measuring the angular position error of a rotary axis based on laser collimation is thoroughly investigated, the error model is established, and 360° full-range measurement is realized using a high-precision servo turntable. The change in spatial attitude of each moving part is described accurately by 3×3 transformation matrices, and the influences of various factors on the measurement results are analyzed in detail. Experimental results show that the measurement method can achieve high accuracy over a large measurement range.
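
    To illustrate the 3×3 transformation-matrix description mentioned above, the sketch below composes a commanded rotation with a small error rotation and extracts the residual angular position error about the rotary axis. The angles and the first-order small-angle error model are illustrative assumptions, not the paper's actual error model:

```python
import numpy as np

def rot_z(theta):
    """3x3 rotation matrix about the z axis (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def small_rotation(ex, ey, ez):
    """First-order rotation matrix for small angular errors about x, y, z (radians)."""
    return np.array([[1.0, -ez,  ey],
                     [ ez, 1.0, -ex],
                     [-ey,  ex, 1.0]])

commanded = np.deg2rad(30.0)                     # commanded rotary-axis position
errors = np.deg2rad([0.002, -0.001, 0.004])      # assumed small angular errors

# Actual attitude = commanded rotation followed by the small error rotation
actual = small_rotation(*errors) @ rot_z(commanded)

# Residual after removing the commanded rotation; its in-plane entries give the
# angular position error about the rotary (z) axis
residual = actual @ rot_z(commanded).T
error_z = np.arctan2(residual[1, 0], residual[0, 0])
print(f"angular position error = {np.rad2deg(error_z) * 3600:.1f} arcsec")
```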

  5. Calibration-free absolute frequency response measurement of directly modulated lasers based on additional modulation.

    Science.gov (United States)

    Zhang, Shangjian; Zou, Xinhai; Wang, Heng; Zhang, Yali; Lu, Rongguo; Liu, Yong

    2015-10-15

    A calibration-free electrical method is proposed for measuring the absolute frequency response of directly modulated semiconductor lasers based on additional modulation. The method achieves the electrical-domain measurement of the modulation index of directly modulated lasers without the need to correct the responsivity fluctuation in the photodetection. Moreover, it doubles the measuring frequency range by setting a specific frequency relationship between the direct and additional modulation. Both the absolute and relative frequency response of semiconductor lasers are experimentally measured from the electrical spectrum of the twice-modulated optical signal, and the measured results are compared to those obtained with conventional methods to check consistency. The proposed method provides calibration-free and accurate measurement for high-speed semiconductor lasers with high-resolution electrical spectrum analysis.

  6. Multi-dimensional grating interferometer based on fibre-fed measurement heads arranged in Littrow configuration

    Science.gov (United States)

    Šiaudinytė, Lauryna; Molnar, Gabor; Köning, Rainer; Flügge, Jens

    2018-05-01

    Industrial application versatility of interferometric encoders increases the urge to measure several degrees of freedom. A novel grating interferometer containing a commercially available, minimized Michelson interferometer and three fibre-fed measurement heads is presented in this paper. Moreover, the arrangement is designed for simultaneous displacement measurements in two perpendicular planes. In the proposed setup, beam splitters are located in the fibre heads, therefore the grating is separated from the light source and the photo detector, which influence measurement results by generated heat. The operating principle of the proposed system as well as error sources influencing measurement results are discussed in this paper. Further, the benefits and shortcomings of the setup are presented. A simple Littrow-configuration-based design leads to a compact-size interferometric encoder suitable for multidimensional measurements.

  7. Pipeline inwall 3D measurement system based on the cross structured light

    Science.gov (United States)

    Shen, Da; Lin, Zhipeng; Xue, Lei; Zheng, Qiang; Wang, Zichi

    2014-01-01

    To achieve accurate defect detection on the inner wall of a pipeline, this paper proposes a measurement system consisting of a cross structured-light projector, a single CCD camera, and a smart car. Based on structured-light measurement technology, the paper introduces the measurement system, the imaging mathematical model, and the parameters and method of camera calibration. Using these measuring principles, the camera on the remotely controlled car platform continuously images the pipe surface and processes the returned data in real time, and the established model is used to extract 3D point cloud coordinates and reconstruct pipeline defects, so that automatic 3D measurement becomes possible; the correctness and feasibility of the system are thereby verified. In practice, the system has shown good measurement accuracy.

  8. Analysis of measured data of human body based on error correcting frequency

    Science.gov (United States)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry measures all parts of the human body surface, and the measured data form the basis for analysis and study of the human body, for establishing and modifying garment sizes, and for setting up and running online clothing stores. In this paper, several groups of measured data are collected, and the data errors are analyzed by examining the error frequency and applying the analysis of variance method from mathematical statistics. The paper also assesses the accuracy of the measured data and the difficulty of measuring various parts of the human body, further studies the causes of data errors, and summarizes the key points for minimizing errors. By analyzing the measured data on the basis of error frequency, the paper provides reference material to support the development of the garment industry.
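
    As a small illustration of the analysis-of-variance step described above, the sketch below runs a one-way ANOVA on repeated measurements of the same body dimension taken by different operators; the readings and grouping are hypothetical:

```python
from scipy.stats import f_oneway

# Hypothetical repeated waist girth measurements [cm] by three operators
operator_a = [72.1, 72.4, 71.9, 72.3, 72.0]
operator_b = [72.8, 73.1, 72.7, 73.0, 72.9]
operator_c = [72.2, 72.0, 72.5, 72.1, 72.3]

# One-way ANOVA: does the operator explain a significant share of the variance,
# i.e., is there a systematic between-operator measurement error?
f_stat, p_value = f_oneway(operator_a, operator_b, operator_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```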

  9. Model-based failure detection for cylindrical shells from noisy vibration measurements.

    Science.gov (United States)

    Candy, J V; Fisher, K A; Guidry, B L; Chambers, D H

    2014-12-01

    Model-based processing is a theoretically sound methodology to address difficult objectives in complex physical problems involving multi-channel sensor measurement systems. It involves the incorporation of analytical models of both physical phenomenology (complex vibrating structures, noisy operating environment, etc.) and the measurement processes (sensor networks and including noise) into the processor to extract the desired information. In this paper, a model-based methodology is developed to accomplish the task of online failure monitoring of a vibrating cylindrical shell externally excited by controlled excitations. A model-based processor is formulated to monitor system performance and detect potential failure conditions. The objective of this paper is to develop a real-time, model-based monitoring scheme for online diagnostics in a representative structural vibrational system based on controlled experimental data.

  10. FPGA-Based Smart Sensor for Online Displacement Measurements Using a Heterodyne Interferometer

    Science.gov (United States)

    Vera-Salas, Luis Alberto; Moreno-Tapia, Sandra Veronica; Garcia-Perez, Arturo; de Jesus Romero-Troncoso, Rene; Osornio-Rios, Roque Alfredo; Serroukh, Ibrahim; Cabal-Yepez, Eduardo

    2011-01-01

    The measurement of small displacements on the nanometric scale demands metrological systems of high accuracy and precision. In this context, interferometer-based displacement measurements have become the main tools used for traceable dimensional metrology. The different industrial applications in which small displacement measurements are employed require online measurement, high-speed processing, open-architecture control systems, and good adaptability to specific process conditions. The main contribution of this work is the development of a smart sensor for large displacement measurement based on phase measurement which achieves high accuracy and resolution, designed to be used with a commercial heterodyne interferometer. The system is based on a low-cost Field Programmable Gate Array (FPGA) allowing the integration of several functions in a single portable device. This system is optimal for high-speed applications where online measurement is needed, and the reconfigurability feature allows the addition of different modules for error compensation, as might be required by a specific application. PMID:22164040

  11. Resist-based measurement of contrast transfer function in a 0.3-NA microfield optic

    International Nuclear Information System (INIS)

    Cain, Jason P.; Naulleau, Patrick; Spanos, Costas J.

    2005-01-01

    Although extreme ultraviolet (EUV) lithography offers the possibility of very high-resolution patterning, the projection optics must be of extremely high quality in order to meet this potential. One key metric of the projection optic quality is the contrast transfer function (CTF), which is a measure of the aerial image contrast as a function of pitch. A static microfield exposure tool based on the 0.3-NA MET optic and operating at a wavelength of 13.5 nm has been installed at the Advanced Light Source, a synchrotron facility at the Lawrence Berkeley National Laboratory. This tool provides a platform for a wide variety of research into EUV lithography. In this work we present resist-based measurements of the contrast transfer function for the MET optic. These measurements are based upon line/space patterns printed in several different EUV photoresists. The experimental results are compared with the CTF in aerial-image simulations using the aberrations measured in the projection optic using interferometry. In addition, the CTF measurements are conducted for both bright-field and dark-field mask patterns. Finally, the orientation dependence of the CTF is measured in order to evaluate the effect of non-rotationally symmetric lens aberrations. These measurements provide valuable information in interpreting the results of other experiments performed using the MET and similar systems

  12. Comparing Science Virtual and Paper-Based Test to Measure Students’ Critical Thinking based on VAK Learning Style Model

    Science.gov (United States)

    Rosyidah, T. H.; Firman, H.; Rusyati, L.

    2017-02-01

    This research compared virtual and paper-based tests for measuring students' critical thinking based on the VAK (Visual-Auditory-Kinesthetic) learning style model. A quasi-experimental method with a one-group post-test-only design was applied to analyze the data. The sample consisted of 40 eighth-grade students at a public junior high school in Bandung. Quantitative data were obtained through 26 questions about living things and environmental sustainability, constructed around the eight elements of critical thinking and provided in both virtual and paper-based form. The analysis shows that, within the visual, auditory, and kinesthetic groups, there was no significant difference between the virtual and paper-based tests. In addition, the results were supported by a questionnaire on students' responses to the virtual test, which scored 3.47 on a 4-point scale, meaning that students responded positively in all measured aspects: interest, impression, and expectation.

  13. Non-Contact Plant Growth Measurement Method and System Based on Ubiquitous Sensor Network Technologies

    Directory of Open Access Journals (Sweden)

    Intae Ryoo

    2011-04-01

    Full Text Available This paper proposes a non-contact plant growth measurement system using infrared sensors based on ubiquitous sensor network (USN) technology. The proposed system measures plant growth parameters such as the stem radius in real time using non-contact methods, and derives the diameter, cross-sectional area, and thickening form of plant stems from these measured data. Non-contact sensors are used so as not to cause any damage to plants while the growth parameters are measured. Once the growth parameters are measured, they are transmitted to a remote server using sensor network technology and analyzed in the application server. The analyzed data are then provided to administrators and a group of interested users. The proposed plant growth measurement system has been designed and implemented using fixed-type and rotary-type infrared-sensor-based measurement methods and devices. Finally, the system performance is compared and verified with measurement data obtained in practical field experiments.

  14. Indirect measurement of molten steel level in tundish based on laser triangulation

    Energy Technology Data Exchange (ETDEWEB)

    Su, Zhiqi; He, Qing, E-mail: heqing@ise.neu.edu.cn; Xie, Zhi [State Key Laboratory of Synthetical Automation for Process Industries, School of Information Science and Engineering, Northeastern University, Shenyang 110819 (China)

    2016-03-15

    For real-time and precise measurement of the molten steel level in the tundish during continuous casting, both the slag level and the slag thickness are needed. The problem of slag thickness measurement was solved in our previous work. In this paper, a systematic solution for slag level measurement based on laser triangulation is proposed. Unlike traditional laser triangulation, several steps have been taken to ensure measuring precision and robustness. First, a laser line is adopted for multi-position measurement to overcome the deficiency of a single-point laser range finder caused by the uneven surface of the slag. Second, the key parameters, such as the installation angle and the minimum required laser power, are analyzed and determined based on gray-body radiation theory to fulfill the rigorous requirement of measurement accuracy. Third, two kinds of severe noise in the acquired images, caused by heat radiation and by electro-magnetic interference (EMI), respectively, are removed using the morphological characteristics of the liquid slag and the color difference between the EMI and the laser signals. Fourth, since false targets created by stationary slag usually disturb the measurement, valid slag signals are distinguished from false ones when calculating the slag level. The molten steel level is then obtained by subtracting the slag thickness from the slag level. The measuring error of this solution, verified in applications at steel plants, is ±2.5 mm during steady casting and ±3.2 mm at the end of casting.

  15. Spherical aberration compensation method for long focal-length measurement based on Talbot interferometry

    Science.gov (United States)

    Luo, Yupeng; Huang, Xiao; Bai, Jian; Du, Juan; Liu, Qun; Luo, Yujie; Luo, Jia

    2017-08-01

    Large-aperture and long focal-length lens is widely used in high energy laser system. The method based on Talbot interferometry is a reliable method to measure the focal length of such elements. By employing divergent beam and two gratings of different periods, this method could realize full-aperture measurement, higher accuracy and better repeatability. However, it does not take into account the spherical aberration of the measured lens resulting in the moiré fringes bending, which will introduce measurement error. Furthermore, in long-focal measurement with divergent beam, this error is an important factor affecting the measurement accuracy. In this paper, we propose a new spherical aberration compensation method, which could significantly reduce the measurement error. Characterized by central-symmetric scanning window, the proposed method is based on the relationship between spherical aberration and the lens aperture. Angle data of moiré fringes in each scanning window is retrieved by Fourier analysis and statistically fitted to estimate a globally optimum value for spherical-aberration-free focal length calculation. Simulation and experiment have been carried out. Compared to the previous work, the proposed method is able to reduce the relative measurement error by 50%. The effect of scanning window size and shift step length on the results is also discussed.

  16. Indirect measurement of molten steel level in tundish based on laser triangulation

    Science.gov (United States)

    Su, Zhiqi; He, Qing; Xie, Zhi

    2016-03-01

    For real-time and precise measurement of the molten steel level in the tundish during continuous casting, both the slag level and the slag thickness are needed. The problem of slag thickness measurement was solved in our previous work. In this paper, a systematic solution for slag level measurement based on laser triangulation is proposed. Unlike traditional laser triangulation, several steps have been taken to ensure measuring precision and robustness. First, a laser line is adopted for multi-position measurement to overcome the deficiency of a single-point laser range finder caused by the uneven surface of the slag. Second, the key parameters, such as the installation angle and the minimum required laser power, are analyzed and determined based on gray-body radiation theory to fulfill the rigorous requirement of measurement accuracy. Third, two kinds of severe noise in the acquired images, caused by heat radiation and by electro-magnetic interference (EMI), respectively, are removed using the morphological characteristics of the liquid slag and the color difference between the EMI and the laser signals. Fourth, since false targets created by stationary slag usually disturb the measurement, valid slag signals are distinguished from false ones when calculating the slag level. The molten steel level is then obtained by subtracting the slag thickness from the slag level. The measuring error of this solution, verified in applications at steel plants, is ±2.5 mm during steady casting and ±3.2 mm at the end of casting.

  17. Laser-based Relative Navigation Using GPS Measurements for Spacecraft Formation Flying

    Science.gov (United States)

    Lee, Kwangwon; Oh, Hyungjik; Park, Han-Earl; Park, Sang-Young; Park, Chandeok

    2015-12-01

    This study presents a precise relative navigation algorithm using both laser and Global Positioning System (GPS) measurements in real time. The measurement model of the navigation algorithm between two spacecraft comprises relative distances measured by laser instruments and single differences of GPS pseudo-range measurements in spherical coordinates. Based on this measurement model, an Extended Kalman Filter (EKF) is applied to smooth the pseudo-range measurements and to obtain the relative navigation solution. While a navigation algorithm using only laser measurements can become inaccurate because of the limited accuracy of spacecraft attitude estimation when the inter-spacecraft distance is large, the proposed approach provides an accurate solution even in such cases by employing the smoothed GPS pseudo-range measurements. Numerical simulations demonstrate that the errors of the proposed algorithm are reduced by about 12% or more compared with those of an algorithm using only laser measurements, when the angular measurement accuracy is greater than 0.001° at relative distances greater than 30 km.
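
    For readers unfamiliar with the filtering step, a generic EKF measurement update of the kind used here is sketched below (the state layout, the laser-range measurement model, and all symbols are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Generic EKF measurement update: x, P = prior state and covariance,
    z = measurement, h(x) = predicted measurement, H = Jacobian of h at x,
    R = measurement noise covariance."""
    y = z - h(x)                          # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_post = x + K @ y
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post

def laser_range(x):
    """Laser measurement model: range between the two spacecraft,
    assuming the first three state components are the relative position."""
    return np.array([np.linalg.norm(x[:3])])

def laser_jacobian(x):
    H = np.zeros((1, len(x)))
    H[0, :3] = x[:3] / np.linalg.norm(x[:3])
    return H
```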

  18. The I-V Measurement System for Solar Cells Based on MCU

    International Nuclear Information System (INIS)

    Chen Fengxiang; Ai Yu; Wang Jiafu; Wang Lisheng

    2011-01-01

    In this paper, an I-V measurement system for solar cells based on a single-chip microcomputer (MCU) is presented. Following the test principles of solar cells, the system comprises two parts: data acquisition, and data processing and display. The MCU is mainly used to acquire data, and the collected results are then sent to the computer over a serial port. The I-V measurement results are shown in a human-computer interface driven by our hardware circuit. Comparing the results of our I-V tester with those of a commercial I-V tester, we found the errors for most parameters to be less than 5%, which shows that our I-V test results are reliable. Because the MCU can be applied in many fields, this measurement system offers a simple prototype for a portable solar-cell I-V tester.
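
    On the data-processing side, the basic parameters of a sampled I-V curve can be extracted as in the sketch below (the function and variable names are illustrative; the abstract does not give the actual processing code):

```python
import numpy as np

def iv_parameters(v, i, p_in):
    """Extract Isc, Voc, fill factor and efficiency from a sampled I-V curve.
    Assumes v is increasing and i decreases monotonically along the sweep;
    i and p_in must use consistent units (e.g. mA and mW with v in volts)."""
    v, i = np.asarray(v, float), np.asarray(i, float)
    isc = float(np.interp(0.0, v, i))        # current at V = 0
    voc = float(np.interp(0.0, -i, v))       # voltage at I = 0
    p_max = float((v * i).max())             # maximum power point
    return {"Isc": isc, "Voc": voc,
            "FF": p_max / (voc * isc),
            "efficiency": p_max / p_in}
```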

  19. The I-V Measurement System for Solar Cells Based on MCU

    Energy Technology Data Exchange (ETDEWEB)

    Chen Fengxiang; Ai Yu; Wang Jiafu; Wang Lisheng, E-mail: phonixchen79@yahoo.com.cn [Department of physics science and technology, Wuhan University of Technology, Wuhan city, Hubei Province, 430070 (China)

    2011-02-01

    In this paper, an I-V measurement system for solar cells based on a single-chip microcomputer (MCU) is presented. Following the test principles of solar cells, the system comprises two parts: data acquisition, and data processing and display. The MCU is mainly used to acquire data, and the collected results are then sent to the computer over a serial port. The I-V measurement results are shown in a human-computer interface driven by our hardware circuit. Comparing the results of our I-V tester with those of a commercial I-V tester, we found the errors for most parameters to be less than 5%, which shows that our I-V test results are reliable. Because the MCU can be applied in many fields, this measurement system offers a simple prototype for a portable solar-cell I-V tester.

  20. Computer based system for measuring the minority carrier lifetime in the solar cells

    International Nuclear Information System (INIS)

    Morales A, A.; Casados C, G.

    1994-01-01

    We describe the development of a computer-based system for measuring the minority carrier lifetime in the base of silicon solar cells. The system supports two different measurement techniques: open-circuit voltage decay (OCVD) and surface voltage decay (SVD). The equipment is based on internal cards for IBM-PC or compatible computers that work as an oscilloscope and as a function generator, together with a synchronization and signal-conditioning circuit. The system is fully controlled by a C-language program that optimizes the use of the instrument and analyzes the measurement data by curve-fitting techniques. We show typical results obtained with silicon solar cells made in our laboratories. (Author)
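
    The OCVD analysis step can be illustrated with the textbook relation between the linear part of the voltage decay and the lifetime, tau ≈ (kT/q)/|dV/dt|; the sketch below uses that standard approximation and is not the authors' fitting code:

```python
import numpy as np

K_OVER_Q = 8.617e-5   # Boltzmann constant over elementary charge, in V/K

def ocvd_lifetime(t_s, v_volt, temperature_k=300.0):
    """Minority-carrier lifetime from the linear region of an open-circuit
    voltage decay, using tau ~ (kT/q) / |dV/dt| (illustrative)."""
    slope, _ = np.polyfit(np.asarray(t_s, float), np.asarray(v_volt, float), 1)
    return K_OVER_Q * temperature_k / abs(slope)
```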

  1. Soft Measurement Modeling Based on Chaos Theory for Biochemical Oxygen Demand (BOD)

    Directory of Open Access Journals (Sweden)

    Junfei Qiao

    2016-12-01

    Full Text Available The precision of soft measurement for biochemical oxygen demand (BOD) is always restricted by various factors in the wastewater treatment plant (WWTP). To solve this problem, a new soft measurement modeling method based on chaos theory is proposed and applied to BOD measurement in this paper. Phase space reconstruction (PSR) based on the Takens embedding theorem is used to extract more information from the limited datasets of the chaotic system. The WWTP is first verified as a chaotic system via the correlation dimension (D), the largest Lyapunov exponent (λ1), and the Kolmogorov entropy (K) of the BOD and other water-quality parameter time series. A multivariate chaotic time series modeling method with principal component analysis (PCA) and an artificial neural network (ANN) is then adopted to estimate the effluent BOD. Simulation results show that the proposed approach has higher accuracy and better prediction ability than the corresponding modeling approaches not based on chaos theory.
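
    The phase-space reconstruction step can be pictured with a minimal time-delay embedding, as in the sketch below (in practice the embedding dimension and delay would be chosen from the data, e.g. via false nearest neighbours and mutual information; this is not the authors' code):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style phase space reconstruction: each row of the result is
    [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[k * tau : k * tau + n] for k in range(dim)])
```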

  2. Study on the algorithm of computational ghost imaging based on discrete fourier transform measurement matrix

    Science.gov (United States)

    Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua

    2016-07-01

    On the basis of analyzing the cosine light field with a determined analytic expression and the pseudo-inverse method, the object is illuminated by a preset light field with a determined discrete Fourier transform measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the algorithm of computational ghost imaging based on a discrete Fourier transform measurement matrix is deduced theoretically and compared with the algorithm of compressive computational ghost imaging based on a random measurement matrix. The reconstruction process and the reconstruction error are analyzed, and simulations are performed to verify the theoretical analysis. When the number of sampling measurements is similar to the number of object pixels, the rank of the discrete Fourier transform matrix is the same as that of the random measurement matrix; the PSNR of the images reconstructed by the FGI and PGI algorithms are similar, and the reconstruction error of the traditional CGI algorithm is lower than that of the FGI and PGI algorithms. As the number of sampling measurements decreases, the PSNR of the image reconstructed by the FGI algorithm decreases slowly, while the PSNR of the images reconstructed by the PGI and CGI algorithms decreases sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter and thus achieves reconstruction denoising with a higher denoising capability than the CGI algorithm. The FGI algorithm improves both the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
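
    A toy one-dimensional version of the sampling and pseudo-inverse reconstruction described above is sketched below (the use of the real part of DFT rows as the preset cosine patterns is an interpretation for illustration, not the paper's exact construction):

```python
import numpy as np

def fgi_reconstruct(obj, m):
    """Ghost imaging with a deterministic DFT-based measurement matrix and
    pseudo-inverse reconstruction (1-D toy example)."""
    n = obj.size
    F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # discrete Fourier transform matrix
    A = np.real(F[:m, :])                    # m preset (cosine-type) illumination patterns
    y = A @ obj                              # bucket-detector measurements
    return np.linalg.pinv(A) @ y             # pseudo-inverse reconstruction
```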

  3. An integrative framework for sensor-based measurement of teamwork in healthcare.

    Science.gov (United States)

    Rosen, Michael A; Dietz, Aaron S; Yang, Ting; Priebe, Carey E; Pronovost, Peter J

    2015-01-01

    There is a strong link between teamwork and patient safety. Emerging evidence supports the efficacy of teamwork improvement interventions. However, the availability of reliable, valid, and practical measurement tools and strategies is commonly cited as a barrier to long-term sustainment and spread of these teamwork interventions. This article describes the potential value of sensor-based technology as a methodology to measure and evaluate teamwork in healthcare. The article summarizes the teamwork literature within healthcare, including team improvement interventions and measurement. Current applications of sensor-based measurement of teamwork are reviewed to assess the feasibility of employing this approach in healthcare. The article concludes with a discussion highlighting current application needs and gaps and relevant analytical techniques to overcome the challenges to implementation. Compelling studies exist documenting the feasibility of capturing a broad array of team input, process, and output variables with sensor-based methods. Implications of this research are summarized in a framework for development of multi-method team performance measurement systems. Sensor-based measurement within healthcare can unobtrusively capture information related to social networks, conversational patterns, physical activity, and an array of other meaningful information without having to directly observe or periodically survey clinicians. However, trust and privacy concerns present challenges that need to be overcome through engagement of end users in healthcare. Initial evidence exists to support the feasibility of sensor-based measurement to drive feedback and learning across individual, team, unit, and organizational levels. Future research is needed to refine methods, technologies, theory, and analytical strategies. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  4. Study of Influencing Factors of Dynamic Measurements Based on SnO2 Gas Sensor

    Directory of Open Access Journals (Sweden)

    Jinhuai Liu

    2004-08-01

    Full Text Available The gas-sensing behaviour of a single SnO2 gas sensor under a dynamic measurement method was investigated by comparison with static measurement. The factors influencing the nonlinear response, such as modulation temperature, duty ratio, and heating waveform (rectangular, sinusoidal, saw-tooth, pulse, etc.), were also studied. The experimental data showed that temperature was the most essential factor, because changes in frequency and heating waveform essentially act through changes in temperature.

  5. Ground based mobile isotopic methane measurements in the Front Range, Colorado

    Science.gov (United States)

    Vaughn, B. H.; Rella, C.; Petron, G.; Sherwood, O.; Mielke-Maday, I.; Schwietzke, S.

    2014-12-01

    Increased development of unconventional oil and gas resources in North America has given rise to attempts to monitor and quantify fugitive emissions of methane from the industry. Emission estimates of methane from oil and gas basins can vary significantly from one study to another as well as from EPA or State estimates. New efforts are aimed at reconciling bottom-up, or inventory-based, emission estimates of methane with top-down estimates based on atmospheric measurements from aircraft, towers, mobile ground-based vehicles, and atmospheric models. Attributing airborne measurements of regional methane fluxes to specific sources is informed by ground-based measurements of methane. Stable isotopic measurements (δ13C) of methane help distinguish between emissions from the O&G industry, Confined Animal Feed Operations (CAFO), and landfills, but analytical challenges typically limit meaningful isotopic measurements to individual point sampling. We are developing a toolbox to use δ13CH4 measurements to assess the partitioning of methane emissions for regions with multiple methane sources. The method was applied to the Denver-Julesberg Basin. Here we present data from continuous isotopic measurements obtained over a wide geographic area by using MegaCore, a 1500 ft. tube that is constantly filled with sample air while driving, then subsequently analyzed at slower rates using cavity ring down spectroscopy (CRDS). Pressure, flow and calibration are tightly controlled allowing precise attribution of methane enhancements to their point of collection. Comparisons with point measurements are needed to confirm regional values and further constrain flux estimates and models. This effort was made in conjunction with several major field campaigns in the Colorado Front Range in July-August 2014, including FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment), DISCOVER-AQ, and the Air Water Gas NSF Sustainability Research Network at the University of Colorado.

  6. Simple Radiowave-Based Method For Measuring Peripheral Blood Flow Project

    Science.gov (United States)

    Oliva-Buisson, Yvette J.

    2014-01-01

    Project objective is to design small radio frequency based flow probes for the measurement of blood flow velocity in peripheral arteries such as the femoral artery and middle cerebral artery. The result will be the technological capability to measure peripheral blood flow rates and flow changes during various environmental stressors such as microgravity without contact to the individual being monitored. This technology may also lead to an easier method of detecting venous gas emboli during extravehicular activities.

  7. Credit Rating via Dynamic Slack-Based Measure and Its Optimal Investment Strategy

    OpenAIRE

    A. Delavarkhalafi; A. Poursherafatan

    2015-01-01

    In this paper we assess the credit rating of firms that have applied for a loan. To this end we introduce a model, named the Dynamic Slack-Based Measure (DSBM), for measuring the credit rating of applicant companies. Selecting financial ratios that best represent the financial state of a company is one of the most challenging parts of any credit rating analysis; ranking first requires identifying the appropriate variables. We therefore introduce five financial variables to provide a ...

  8. Automatic dosimeter for kerma measurement based on commercial PIN photo diodes

    International Nuclear Information System (INIS)

    Kushpil, V.; Kushpil, S.; Huna, Z.

    2011-01-01

    A new automatic dosimeter for measuring the radiation dose from neutron and ionizing radiation is presented. The dosimeter (kerma meter) uses commercial long-base PIN diodes as its active element; the long base provides maximal dependence of the minority carrier lifetime on the absorbed dose. The characteristics of the dosimeter were measured for several types of commercial diodes. The device can be useful in many environmental or industrial applications. (authors)

  9. Transitions in the computational power of thermal states for measurement-based quantum computation

    International Nuclear Information System (INIS)

    Barrett, Sean D.; Bartlett, Stephen D.; Jennings, David; Doherty, Andrew C.; Rudolph, Terry

    2009-01-01

    We show that the usefulness of the thermal state of a specific spin-lattice model for measurement-based quantum computing exhibits a transition between two distinct 'phases' - one in which every state is a universal resource for quantum computation, and another in which any local measurement sequence can be simulated efficiently on a classical computer. Remarkably, this transition in computational power does not coincide with any phase transition, classical, or quantum in the underlying spin-lattice model.

  10. Reliability and Validity of an Internet-based Questionnaire Measuring Lifetime Physical Activity

    OpenAIRE

    De Vera, Mary A.; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek

    2010-01-01

    Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005–2006. Reliability was examined u...

  11. Survey of radiofrequency radiation levels around GSM base stations and evaluation of measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vulević Branislav D.

    2011-01-01

    Full Text Available This paper is a summary of broadband measurement values of radiofrequency radiation around GSM base stations in the vicinity of residential areas in Belgrade and 12 other cities in Serbia. It will be useful for determining non-ionizing radiation exposure levels of the general public in the future. The purpose of this paper is also an appropriate representation of basic information on the evaluation of measurement uncertainty.

  12. Noncontacting acoustics-based temperature measurement techniques in rapid thermal processing

    Science.gov (United States)

    Lee, Yong J.; Chou, Ching-Hua; Khuri-Yakub, Butrus T.; Saraswat, Krishna C.

    1991-04-01

    Temperature measurement of silicon wafers based on the temperature dependence of acoustic waves is studied. The change in the temperature-dependent dispersion relations of the plate modes through the wafer can be exploited to provide a viable temperature monitoring scheme with advantages over both thermocouples and pyrometers. Velocity measurements of acoustic waves through a thin layer of ambient directly above the wafer provide the temperature of the wafer-ambient interface.

  13. Model-based bootstrapping when correcting for measurement error with application to logistic regression.

    Science.gov (United States)

    Buonaccorsi, John P; Romeo, Giovanni; Thoresen, Magne

    2018-03-01

    When fitting regression models, measurement error in any of the predictors typically leads to biased coefficients and incorrect inferences. A plethora of methods have been proposed to correct for this. Obtaining standard errors and confidence intervals using the corrected estimators can be challenging and, in addition, there is concern about remaining bias in the corrected estimators. The bootstrap, which is one option to address these problems, has received limited attention in this context. It has usually been employed by simply resampling observations, which, while suitable in some situations, is not always formally justified. In addition, the simple bootstrap does not allow for estimating bias in non-linear models, including logistic regression. Model-based bootstrapping, which can potentially estimate bias in addition to being robust to the original sampling or whether the measurement error variance is constant or not, has received limited attention. However, it faces challenges that are not present in handling regression models with no measurement error. This article develops new methods for model-based bootstrapping when correcting for measurement error in logistic regression with replicate measures. The methodology is illustrated using two examples, and a series of simulations are carried out to assess and compare the simple and model-based bootstrap methods, as well as other standard methods. While not always perfect, the model-based approaches offer some distinct improvements over the other methods. © 2017, The International Biometric Society.

  14. Volatility and correlation-based systemic risk measures in the US market

    Science.gov (United States)

    Civitarese, Jamil

    2016-10-01

    This paper deals with how simple systemic risk measures can be used to assess portfolio risk characteristics. Using three simple examples from the previous literature, one based on raw and partial correlations, another on the eigenvalue decomposition of the covariance matrix, and the last on an eigenvalue entropy, a Granger-causation analysis revealed that some of them are not always good measures of risk in the S&P 500 and in the VIX. The selected measures do not Granger-cause the VIX index in all windows considered; therefore, in the sense of risk as volatility, the indicators are not always suitable. Nevertheless, their results with respect to returns are similar to those of previous works that accept them. A deeper analysis has shown that any symmetric measure based on the eigenvalue decomposition of correlation matrices is, however, not useful as a measure of "correlation" risk. The empirical counterpart of this proposition is that negative correlations are usually small and, therefore, do not heavily distort the behavior of the indicator.
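
    As an illustration of the eigenvalue-entropy type of indicator mentioned above (the windowing, normalization, and column layout are assumptions made for the example):

```python
import numpy as np

def eigenvalue_entropy(returns):
    """Shannon entropy of the normalized eigenvalue spectrum of the return
    correlation matrix; returns is an (observations x assets) array."""
    corr = np.corrcoef(np.asarray(returns, float), rowvar=False)
    w = np.linalg.eigvalsh(corr)
    p = w / w.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())
```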

  15. A micromachined membrane-based active probe for biomolecular mechanics measurement

    Science.gov (United States)

    Torun, H.; Sutanto, J.; Sarangapani, K. K.; Joseph, P.; Degertekin, F. L.; Zhu, C.

    2007-04-01

    A novel micromachined, membrane-based probe has been developed and fabricated in arrays to enable parallel measurements. Each probe in the array can be individually actuated, and the membrane displacement can be measured with high resolution using an integrated diffraction-based optical interferometer. To illustrate its application in single-molecule mechanics experiments, this membrane probe was used to measure unbinding forces between L-selectin reconstituted in a polymer-cushioned lipid bilayer on the probe membrane and an antibody adsorbed on an atomic force microscope cantilever. Piconewton-range forces between single pairs of interacting molecules were measured from the cantilever bending while using the membrane probe as an actuator. The integrated diffraction-based optical interferometer of the probe was demonstrated to have a low noise floor for frequencies as low as 3 Hz with a differential readout scheme. With soft probe membranes, this low noise level would be suitable for direct force measurements without the need for a cantilever. Furthermore, the probe membranes were shown to have a 0.5 µm actuation range with a flat response up to 100 kHz, enabling measurements at fast speeds.

  16. In situ measurement of the energy gap of a semiconductor using a microcontroller-based system

    International Nuclear Information System (INIS)

    Mukaro, R; Taele, B M; Tinarwo, D

    2006-01-01

    This paper describes a microcontroller-based laboratory technique for automatic in situ measurement of the energy gap of germanium. The design is based on the original undergraduate laboratory experiment in which students manually measure the variation of the reverse saturation current of a germanium diode with temperature using a current-to-voltage converter, and later analyse the collected results to determine the energy gap of the semiconductor. The objective of this work was to introduce interfacing and computerized measurement systems into the undergraduate laboratory. The microcontroller-based data acquisition system and its implementation for automatic in situ measurement of the band gap of a germanium diode are presented. The system, which uses an LM335 sensor to measure temperature, transmits the measured data to the computer via the RS232 serial port, while a C++ program running on the computer monitors the serial port for incoming information sent by the microcontroller. This information is displayed on the screen as it arrives and is automatically saved to a data file. Once all the data are received, the computer performs a least-squares fit to compute the energy gap, which is displayed on the screen together with its error estimate. For the 1N34A germanium diode used, the value of the energy gap obtained was 0.50 ± 0.02 eV.
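
    The least-squares step can be illustrated with the simplified textbook relation I_s ∝ exp(-Eg/kT), neglecting the weak power-law prefactor; the sketch below is not the firmware or PC program described in the paper:

```python
import numpy as np

K_B_EV = 8.617e-5   # Boltzmann constant in eV/K

def band_gap_ev(temperature_k, i_sat):
    """Energy gap from the slope of ln(I_s) versus 1/T, assuming
    I_s ~ exp(-Eg / (k_B * T))."""
    x = 1.0 / np.asarray(temperature_k, float)
    y = np.log(np.asarray(i_sat, float))
    slope, _ = np.polyfit(x, y, 1)    # slope = -Eg / k_B
    return -slope * K_B_EV
```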

  17. In situ measurement of the energy gap of a semiconductor using a microcontroller-based system

    Energy Technology Data Exchange (ETDEWEB)

    Mukaro, R [Department of Physics, Bindura University of Science, P/Bag 1020, Bindura (Zimbabwe); Taele, B M [Department of Physics and Electronics, National University of Lesotho, Roma 180 (Lesotho); Tinarwo, D [Department of Physics, Bindura University of Science, P/Bag 1020, Bindura (Zimbabwe)

    2006-05-01

    This paper describes a microcontroller-based laboratory technique for automatic in situ measurement of the energy gap of germanium. The design is based on the original undergraduate laboratory experiment in which students manually measure the variation of the reverse saturation current of a germanium diode with temperature using a current-to-voltage converter, and later analyse the collected results to determine the energy gap of the semiconductor. The objective of this work was to introduce interfacing and computerized measurement systems into the undergraduate laboratory. The microcontroller-based data acquisition system and its implementation for automatic in situ measurement of the band gap of a germanium diode are presented. The system, which uses an LM335 sensor to measure temperature, transmits the measured data to the computer via the RS232 serial port, while a C++ program running on the computer monitors the serial port for incoming information sent by the microcontroller. This information is displayed on the screen as it arrives and is automatically saved to a data file. Once all the data are received, the computer performs a least-squares fit to compute the energy gap, which is displayed on the screen together with its error estimate. For the 1N34A germanium diode used, the value of the energy gap obtained was 0.50 ± 0.02 eV.

  18. Low-cost vibration sensor based on dual fiber Bragg gratings and light intensity measurement.

    Science.gov (United States)

    Gao, Xueqing; Wang, Yongjiao; Yuan, Bo; Yuan, Yinquan; Dai, Yawen; Xu, Gang

    2013-09-20

    A vibration monitoring system based on light intensity measurement has been constructed; the designed accelerometer is based on a steel cantilever frame and dual fiber Bragg gratings (FBGs). Numerical simulations of the dual FBGs yield the dependence of the main-lobe area on the difference of the initial central wavelengths, and an optimal choice for the initial value and the vibration amplitude of the central-wavelength difference of the two FBGs is suggested. Vibration monitoring experiments were carried out, and the measured data are consistent with the simulated results.

  19. 100 GHz pulse waveform measurement based on electro-optic sampling

    Science.gov (United States)

    Feng, Zhigang; Zhao, Kejia; Yang, Zhijun; Miao, Jingyuan; Chen, He

    2018-05-01

    We present an ultrafast pulse waveform measurement system based on an electro-optic sampling technique at 1560 nm and prepare LiTaO3-based electro-optic modulators with a coplanar waveguide structure. The transmission and reflection characteristics of electrical pulses on a coplanar waveguide terminated with an open circuit and a resistor are investigated by analyzing the corresponding time-domain pulse waveforms. We measure the output electrical pulse waveform of a 100 GHz photodiode and the obtained rise times of the impulse and step responses are 2.5 and 3.4 ps, respectively.

  20. Measurement of Non-Invasive Blood Glucose Level Based Sensor Color TCS3200 and Arduino

    Science.gov (United States)

    Kurniadi Wardana, Humaidillah; Indahwati, Elly; Arifah Fitriyah, Lina

    2018-04-01

    This work presents the design and testing of an Arduino-based, non-invasive urine glucose measurement using the RGB TCS3200 colour sensor. Urine from diabetes patients is detected by the colour sensor, and the colour level is then measured from the RGB values of the urine. Detection was performed on 4 urine samples, 3 from diabetics and 1 from a non-diabetic. The equipment used in this research includes, among others, an Arduino Uno, a TCS3200 colour sensor, and a 16x4 LCD. The results showed RGB readings of about 230, dominated by blue, for the diabetic samples and about 200, dominated by red, for the non-diabetic sample.

  1. A review of instruments to measure interprofessional team-based primary care.

    Science.gov (United States)

    Shoemaker, Sarah J; Parchman, Michael L; Fuda, Kathleen Kerwin; Schaefer, Judith; Levin, Jessica; Hunt, Meaghan; Ricciardi, Richard

    2016-07-01

    Interprofessional team-based care is increasingly regarded as an important feature of delivery systems redesigned to provide more efficient and higher quality care, including primary care. Measurement of the functioning of such teams might enable improvement of team effectiveness and could facilitate research on team-based primary care. Our aims were to develop a conceptual framework of high-functioning primary care teams to identify and review instruments that measure the constructs identified in the framework, and to create a searchable, web-based atlas of such instruments (available at: http://primarycaremeasures.ahrq.gov/team-based-care/ ). Our conceptual framework was developed from existing frameworks, the teamwork literature, and expert input. The framework is based on an Input-Mediator-Output model and includes 12 constructs to which we mapped both instruments as a whole, and individual instrument items. Instruments were also reviewed for relevance to measuring team-based care, and characterized. Instruments were identified from peer-reviewed and grey literature, measure databases, and expert input. From nearly 200 instruments initially identified, we found 48 to be relevant to measuring team-based primary care. The majority of instruments were surveys (n = 44), and the remainder (n = 4) were observational checklists. Most instruments had been developed/tested in healthcare settings (n = 30) and addressed multiple constructs, most commonly communication (n = 42), heedful interrelating (n = 42), respectful interactions (n = 40), and shared explicit goals (n = 37). The majority of instruments had some reliability testing (n = 39) and over half included validity testing (n = 29). Currently available instruments offer promise to researchers and practitioners to assess teams' performance, but additional work is needed to adapt these instruments for primary care settings.

  2. TECHNICAL NOTE: Portable audio electronics for impedance-based measurements in microfluidics

    Science.gov (United States)

    Wood, Paul; Sinton, David

    2010-08-01

    We demonstrate the use of audio electronics-based signals to perform on-chip electrochemical measurements. Cell phones and portable music players are examples of consumer electronics that are easily operated and are ubiquitous worldwide. Audio output (play) and input (record) signals are voltage based and contain frequency and amplitude information. A cell phone, laptop soundcard and two compact audio players are compared with respect to frequency response; the laptop soundcard provides the most uniform frequency response, while the cell phone performance is found to be insufficient. The audio signals in the common portable music players and laptop soundcard operate in the range of 20 Hz to 20 kHz and are found to be applicable, as voltage input and output signals, to impedance-based electrochemical measurements in microfluidic systems. Validated impedance-based measurements of concentration (0.1-50 mM), flow rate (2-120 µL min-1) and particle detection (32 µm diameter) are demonstrated. The prevailing, lossless, wave audio file format is found to be suitable for data transmission to and from external sources, such as a centralized lab, and the cost of all hardware (in addition to audio devices) is ~10 USD. The utility demonstrated here, in combination with the ubiquitous nature of portable audio electronics, presents new opportunities for impedance-based measurements in portable microfluidic systems.

  3. Design and test of 4πβ-γ coincidence measurement device based on DSP technology

    International Nuclear Information System (INIS)

    Zeng Herong; Feng Qijie; Leng Jun; Qian Dazhi; Bai Lixin; Zhang Yiyun

    2012-01-01

    The paper describes in detail the hardware and software of a 4πβ-γ coincidence measurement device based on DSP technology. In this device, the single-channel analyzer, gate generator, coincidence circuit, and scaler of the traditional coincidence measurement device are replaced by a digital coincidence acquirer developed and manufactured in-house. In this way the measurement efficiency is improved and the hardware cost is lowered. A comparison experiment shows that the design of the device is a success. (authors)

  4. Investigating Measures of Social Context on 2 Population-Based Health Surveys, Hawaii, 2010-2012.

    Science.gov (United States)

    Pobutsky, Ann M; Baker, Kathleen Kromer; Reyes-Salvail, Florentina

    2015-12-17

    Measures from the Social Context Module of the Centers for Disease Control and Prevention's Behavioral Risk Factor Surveillance System were used on 2 population-based health surveys in Hawaii to explicate the role of the nonmedical and social determinants of health; these measures were also compared with conventional socioeconomic status (SES) variables. Results showed that the self-reported SES vulnerabilities of food and housing insecurity are both linked to demographic factors and physical and mental health status and significant when controlling for the conventional measures of SES. The social context module indicators should be increasingly used so results can inform appropriate interventions for vulnerable populations.

  5. Calorimetric Measurement for Internal Conversion Efficiency of Photovoltaic Cells/Modules Based on Electrical Substitution Method

    Science.gov (United States)

    Saito, Terubumi; Tatsuta, Muneaki; Abe, Yamato; Takesawa, Minato

    2018-02-01

    We have succeeded in the direct measurement of solar cell/module internal conversion efficiency based on a calorimetric, or electrical substitution, method in which the absorbed radiant power is determined by replacing the heat absorbed in the cell/module with electrical power. The technique is advantageous in that the reflectance and transmittance measurements required by conventional methods are not necessary. Also, the internal quantum efficiency can be derived from the conversion efficiencies by using the average photon energy. Agreement of the measured data with values estimated from nominal specifications supports the validity of this technique.

  6. A dynamic-based measurement of a spring constant with a smartphone light sensor

    Science.gov (United States)

    Pili, Unofre

    2018-05-01

    An accessible smartphone-based experimental set-up for measuring a spring constant is presented. Using the smartphone ambient light sensor as a motion timer to measure the period of oscillation of a vertical spring-mass oscillator, we found the spring constant to be 27.3 ± 0.2 N m-1. This measurement is in satisfactory agreement with another experimental value, 26.7 ± 0.1 N m-1, obtained via the traditional static method.
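
    The underlying calculation is simply k = 4π²m/T² for a mass-spring oscillator; for example (the mass and period below are illustrative values chosen to reproduce the reported 27.3 N m-1, not data from the paper):

```python
import numpy as np

def spring_constant(mass_kg, period_s):
    """k = 4 * pi^2 * m / T^2 for a vertical spring-mass oscillator."""
    return 4.0 * np.pi**2 * mass_kg / period_s**2

print(spring_constant(0.250, 0.601))   # ~27.3 N/m
```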

  7. Measurement of the activity of an artificial neutrino source based on 37Ar

    International Nuclear Information System (INIS)

    Abdurashitov, D. N.; Veretenkin, E. P.; Gavrin, V. N.; Gorbachev, V. V.; Ibragimova, T. V.; Kalikhov, A. V.; Mirmov, I. N.; Shikhin, A. A.; Yants, V. E.; Barsanov, V. I.; Dzhanelidze, A. A.; Zlokazov, S. B.; Markov, S. Yu.; Shakirov, Z. N.; Cleveland, B. T.

    2007-01-01

    The activity of an artificial neutrino source based on 37 Ar was measured by a specially developed method of directly counting 37 Ar decays in a proportional counter. This source was used to irradiate the target of the SAGE radiochemical gallium-germanium neutrino telescope at the Baksan Neutrino Observatory (Institute for Nuclear Research, Russian Academy of Sciences, Moscow), whereupon the measurements were performed at the Institute of Reactor Materials (Zarechny, Sverdlovsk oblast, Russia). The method used to prepare gaseous samples for measurements in proportional counters and the counting procedure are described. The measured activity of the 37 Ar neutrino source is 405.1 ± 3.7 kCi (corrected for decays that occurred within the period between the instant of activity measurement and the commencement of the irradiation of Ga target at 04:00 Moscow time, 30.04.2004)
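
    The decay correction mentioned in parentheses follows the usual exponential law; a sketch is given below (the 37Ar half-life used here is the nominal literature value of about 35.04 days, not a number quoted in the abstract):

```python
import numpy as np

T_HALF_AR37_DAYS = 35.04   # nominal half-life of 37Ar

def activity_at_reference_time(a_measured, days_before_measurement):
    """Correct a measured activity back to an earlier reference time
    (e.g. the start of target irradiation): A0 = A * exp(lambda * dt)."""
    lam = np.log(2.0) / T_HALF_AR37_DAYS
    return a_measured * np.exp(lam * days_before_measurement)
```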

  8. Atmospheric SO₂. Global measurements using aircraft-based CIMS

    Energy Technology Data Exchange (ETDEWEB)

    Fiedler, V.

    2008-06-27

    Aircraft-based measurements of tropospheric sulfur dioxide, SO₂, have been carried out during four campaigns in South America (TROCCINOX), Australia (SCOUT-O3), Europe (INTEX/MEGAPLUME) and Africa (AMMA). SO₂ has been measured by chemical ionization mass spectrometry (CIMS), continuously calibrated online with isotopically labelled SO₂. The measurement method is described thoroughly in this work and the measured data are presented. The data from the different regions are compared, and typical air-mass situations with SO₂ enhancement are shown. A detailed analysis of four SO₂ pollution plume cases emphasizes the main features: long-range transport and SO₂ from metal smelters, volcanoes, or biomass burning. The SO₂ measurements are analyzed in the light of simultaneously measured trace gas, particle and meteorological data. Air-mass trajectory models (FLEXPART or HYSPLIT) are employed to determine the pollution origin. Further evaluations with the aerosol model AEROFOR complete the analyses and show that the measured SO₂ mole fractions are sufficient to explain new particle formation and growth. Finally, a first comparison of the measured SO₂ with results from a global circulation model (ECHAM) with implemented sulfur chemistry showed a significant underestimation of the measured SO₂ mole fraction by the model in the free troposphere. (orig.)

  9. High-precision diode-laser-based temperature measurement for air refractive index compensation.

    Science.gov (United States)

    Hieta, Tuomas; Merimaa, Mikko; Vainio, Markku; Seppä, Jeremias; Lassila, Antti

    2011-11-01

    We present a laser-based system to measure the refractive index of air over a long path length. In optical distance measurements, it is essential to know the refractive index of air with high accuracy. Commonly, the refractive index of air is calculated from the properties of the ambient air using either Ciddor or Edlén equations, where the dominant uncertainty component is in most cases the air temperature. The method developed in this work utilizes direct absorption spectroscopy of oxygen to measure the average temperature of air and of water vapor to measure relative humidity. The method allows measurement of temperature and humidity over the same beam path as in optical distance measurement, providing spatially well-matching data. Indoor and outdoor measurements demonstrate the effectiveness of the method. In particular, we demonstrate an effective compensation of the refractive index of air in an interferometric length measurement at a time-variant and spatially nonhomogeneous temperature over a long time period. Further, we were able to demonstrate 7 mK RMS noise over a 67 m path length using a 120 s sample time. To our knowledge, this is the best temperature precision reported for a spectroscopic temperature measurement. © 2011 Optical Society of America

  10. High-precision diode-laser-based temperature measurement for air refractive index compensation

    International Nuclear Information System (INIS)

    Hieta, Tuomas; Merimaa, Mikko; Vainio, Markku; Seppä, Jeremias; Lassila, Antti

    2011-01-01

    We present a laser-based system to measure the refractive index of air over a long path length. In optical distance measurements, it is essential to know the refractive index of air with high accuracy. Commonly, the refractive index of air is calculated from the properties of the ambient air using either Ciddor or Edlen equations, where the dominant uncertainty component is in most cases the air temperature. The method developed in this work utilizes direct absorption spectroscopy of oxygen to measure the average temperature of air and of water vapor to measure relative humidity. The method allows measurement of temperature and humidity over the same beam path as in optical distance measurement, providing spatially well-matching data. Indoor and outdoor measurements demonstrate the effectiveness of the method. In particular, we demonstrate an effective compensation of the refractive index of air in an interferometric length measurement at a time-variant and spatially nonhomogeneous temperature over a long time period. Further, we were able to demonstrate 7 mK RMS noise over a 67 m path length using a 120 s sample time. To our knowledge, this is the best temperature precision reported for a spectroscopic temperature measurement.

  11. Photosensor-Based Latency Measurement System for Head-Mounted Displays

    Directory of Open Access Journals (Sweden)

    Min-Woo Seo

    2017-05-01

    Full Text Available In this paper, a photosensor-based latency measurement system for head-mounted displays (HMDs) is proposed. Motion-to-photon latency is the main cause of the motion sickness and dizziness felt by users wearing an HMD system, so a measurement system is required to accurately measure and analyze this latency in order to reduce these problems. The existing measurement system does not consider actual physical human movement, and its accuracy is also very low, whereas the proposed system considers physical head movement and is highly accurate. Specifically, it consists of a head-position-model-based rotary platform, a pixel luminance change detector, and signal analysis and calculation modules. Using these modules, the proposed system can measure the latency exactly, defined as the time difference between a user's physical movement and the luminance change of the output image. In an experiment using a commercial HMD, the latency was measured to be up to 47.05 ms, and it increased up to 381.17 ms when the rendering workload in the HMD was increased.
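
    Conceptually, the latency is the lag between the platform-motion signal and the photosensor's luminance signal; one simple way to estimate such a lag is by cross-correlation, as sketched below (the paper's own signal analysis modules are not described at this level of detail, so this is only an illustration):

```python
import numpy as np

def motion_to_photon_latency_ms(motion, photo, fs_hz):
    """Estimate latency (ms) as the lag that best aligns the photosensor signal
    with the platform-motion signal (both sampled at fs_hz)."""
    m = (motion - np.mean(motion)) / np.std(motion)
    p = (photo - np.mean(photo)) / np.std(photo)
    xc = np.correlate(p, m, mode="full")
    lag = int(np.argmax(xc)) - (len(m) - 1)   # samples by which photo lags motion
    return 1000.0 * lag / fs_hz
```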

  12. Optimization of dynamic envelope measurement system for high speed train based on monocular vision

    Science.gov (United States)

    Wu, Bin; Liu, Changjie; Fu, Luhua; Wang, Zhong

    2018-01-01

    The dynamic envelope curve is defined as the maximum limiting outline produced by various adverse effects while the train is running, and it is an important basis for establishing railway clearance boundaries. At present, the dynamic envelope curve of a high-speed vehicle is mainly measured by binocular vision, but the present measuring systems suffer from poor portability, a complicated process, and high cost. In this paper, a new measurement system based on monocular vision measurement theory and an analysis of the test environment is designed, and the measurement system parameters, the calibration of a camera with a wide field of view, and the calibration of the laser plane are designed and optimized. The accuracy has been verified to be within 2 mm by repeated tests and analysis of the experimental data, and the feasibility and adaptability of the measurement system are validated. The system has the advantages of lower cost, a simpler measurement and data processing procedure, and more reliable data, and it needs no matching algorithm.

  13. Research on Damage Identification of Bridge Based on Digital Image Measurement

    Science.gov (United States)

    Liang, Yingjing; Huan, Shi; Tao, Weijun

    2017-12-01

    In recent years, the number of bridges damaged by excessive deformation has gradually increased, causing significant property damage and casualties. Health monitoring and damage detection of bridge structures based on deflection measurement are therefore particularly important. Current conventional deflection measurement methods, such as total station, connected pipe, and GPS, have many shortcomings, including low efficiency, heavy workload, a low degree of automation, and constraints on operating frequency and working time. GPS has low accuracy in vertical displacement measurement and cannot meet the dynamic measurement requirements of current bridge engineering. This paper presents a bridge health monitoring and damage detection technology based on a digital image measurement method whose accuracy is at the sub-millimeter level and which can achieve 24-hour automatic non-destructive monitoring of the deflection. It can be concluded that using the digital image measurement method to identify damage in bridge structures is feasible, as it has been validated by theoretical analysis, a laboratory model, and application to a real bridge.

  14. MGI-oriented High-throughput Measurement of Interdiffusion Coefficient Matrices in Ni-based Superalloys

    Directory of Open Access Journals (Sweden)

    TANG Ying

    2017-01-01

    Full Text Available One of the research hotspots in the field of high-temperature alloys is the search for substitutional elements for Re, in order to prepare single-crystal Ni-based superalloys with less or even no Re addition. Finding elements with diffusion coefficients similar to or even lower than that of Re is one effective strategy. In multicomponent alloys, the interdiffusivity matrix is used to comprehensively characterize the diffusion ability of the alloying elements. Therefore, accurate determination of the composition- and temperature-dependent interdiffusivity matrices of different elements in the γ and γ' phases of Ni-based superalloys is a high priority. The paper briefly introduces the status of interdiffusivity matrix determination in Ni-based superalloys and the methods for determining interdiffusivities in multicomponent alloys, including the traditional Matano-Kirkaldy method and the recently proposed numerical inverse method. Because the traditional Matano-Kirkaldy method is of low efficiency, experimental reports on interdiffusivity matrices in ternary and higher-order sub-systems of Ni-based superalloys are very scarce in the literature, whereas the numerical inverse method newly proposed in our research group, based on Fick's second law, can be utilized for high-throughput measurement of accurate interdiffusivity matrices in alloys with any number of components. The successful application of the numerical inverse method to the high-throughput measurement of interdiffusivity matrices is then demonstrated in the fcc (γ) phase of the ternary Ni-Al-Ta system, and the resulting composition- and temperature-dependent interdiffusivity matrices are comprehensively validated. Then, this paper summarizes the recent progress in the measurement of interdiffusivity matrices in γ and γ' phases of a series of core ternary Ni-based superalloys achieved in

  15. Associations of genetic risk scores based on adult adiposity pathways with childhood growth and adiposity measures

    OpenAIRE

    Monnereau, Claire; Vogelezang, Suzanne; Kruithof, Claudia J.; Jaddoe, Vincent W. V.; Felix, Janine F.

    2016-01-01

    Background Results from genome-wide association studies (GWAS) have identified many loci and biological pathways that influence adult body mass index (BMI). We aimed to identify whether biological pathways related to adult BMI also affect infant growth and childhood adiposity measures. Methods We used data from a population-based prospective cohort study among 3,975 children with a mean age of 6 years. Genetic risk scores were constructed based on the 97 SNPs associated with adult BMI previously identi...

  16. Operational Risk Measurement of Chinese Commercial Banks Based on Extreme Value Theory

    Science.gov (United States)

    Song, Jiashan; Li, Yong; Ji, Feng; Peng, Cheng

    Financial institutions and supervisory bodies agree on the need to strengthen the measurement and management of operational risk. This paper builds a model of operational risk losses based on the Peaks Over Threshold (POT) model, with emphasis on a weighted least squares refinement of Hill's estimation method; it also discusses the small-sample situation and fixes the sample threshold more objectively, using media-published data on operational risk losses of major banks from 1994 to 2007.
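
    For reference, the plain (unweighted) Hill estimator that the paper refines reads as in the sketch below (the weighted least squares refinement itself is not reproduced here):

```python
import numpy as np

def hill_estimator(losses, k):
    """Classical Hill estimator of the tail index based on the k largest losses."""
    x = np.sort(np.asarray(losses, dtype=float))[::-1]   # descending order statistics
    return float(np.mean(np.log(x[:k] / x[k])))          # mean log-excess over the (k+1)-th largest
```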

  17. Measurements of MIMO Indoor Channels at 1800 MHz with Multiple Indoor and Outdoor Base Stations

    Directory of Open Access Journals (Sweden)

    Jaldén Niklas

    2007-01-01

    Full Text Available This paper proposes several configurations for multiple base stations in indoor MIMO systems and compares their performance. The results are based on channel measurements made with a MIMO testbed, with the receiver moved along several routes and floors of an office building. Both outdoor and indoor locations are considered for the transmitters or base stations, which allows the analysis of not only the indoor but also the outdoor-to-indoor environment. The use of two base stations with different system-level combinations of the two is analyzed. We show that the configuration with base station selection performs almost as well as a full water-filling scheme when the two base stations are placed at different locations. The spatial correlation properties of the different configurations are also analyzed, and the importance of considering path loss when evaluating capacity is highlighted.

  18. Correction of self-reported BMI based on objective measurements: a Belgian experience.

    Science.gov (United States)

    Drieskens, S; Demarest, S; Bel, S; De Ridder, K; Tafforeau, J

    2018-01-01

    Based on successive Health Interview Surveys (HIS), it has been demonstrated that in Belgium, too, obesity, measured by means of a self-reported body mass index (BMI, in kg/m²), is a growing public health problem that needs to be monitored as accurately as possible. Studies have shown that self-reported BMI can be biased; consequently, if the aim is to rely on self-reported BMI, adjustment is recommended. Data on measured and self-reported BMI derived from the Belgian Food Consumption Survey (FCS) 2014 offer the opportunity to do so. The HIS and FCS are cross-sectional surveys based on representative population samples. This study focused on adults aged 18-64 years (sample HIS = 6545 and FCS = 1213). Measured and self-reported BMI collected in the FCS were used to assess possible misreporting. Using FCS data, correction factors (measured BMI/self-reported BMI) were calculated as a function of a combination of background variables (region, gender, educational level, and age group). Individual self-reported BMI values from the HIS 2013 were then multiplied by the corresponding correction factors to produce a corrected BMI classification. Compared with measured BMI, self-reported BMI in the FCS was underestimated (mean 0.97 kg/m²); 28% of obese people underestimated their BMI. After applying the correction factors, the prevalence of obesity based on HIS data increased significantly (from 13% with the original HIS data to 17% with the corrected HIS data) and approximated the measured prevalence derived from the FCS data. Since self-reported BMI is underestimated, it is recommended to adjust it to obtain accurate estimates, which are important for decision making.
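
    The correction procedure itself is straightforward; a sketch of applying stratum-specific correction factors is given below (all column names are invented for illustration, and the factors would in practice come from the FCS as described above):

```python
import pandas as pd

def corrected_bmi(his, fcs):
    """Multiply self-reported BMI in the HIS by stratum-specific correction
    factors (measured BMI / self-reported BMI) estimated from the FCS."""
    strata = ["region", "gender", "education", "age_group"]
    fcs = fcs.assign(factor=fcs["bmi_measured"] / fcs["bmi_selfreport"])
    factors = fcs.groupby(strata, as_index=False)["factor"].mean()
    out = his.merge(factors, on=strata, how="left")
    out["bmi_corrected"] = out["bmi_selfreport"] * out["factor"]
    return out
```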

  19. Development of a stress sensor based on the piezoelectric lead zirconate titanate for impact stress measurement

    Science.gov (United States)

    Liu, Yiming; Xu, Bin; Li, Lifei; Li, Bing

    2012-04-01

    Measuring the stress of concrete structures under impact loading and other strong dynamic loads is crucial for health monitoring and damage detection. Owing to its main advantages, including availability, extremely high rigidity, high natural frequency, wide measuring range, high stability, reproducibility and linearity, and a wide operating temperature range, piezoelectric lead zirconate titanate (PZT) ceramic has been a widely used smart material for both sensing and actuation in the monitoring and control of engineering structures. In this paper, a stress sensor based on piezoelectric ceramics for impact stress measurement in concrete structures is developed. Because PZT is fragile, special handling and treatment are needed to protect it and to make it survive and work properly in concrete. The commercially available PZT patch with lead wires is first given an insulation coating to prevent water and moisture damage, and is then packaged between two small precast cylindrical concrete blocks of sufficient strength to form a smart aggregate (SA). The PZT patch used has dimensions of 10 mm x 10 mm x 0.3 mm. To calibrate the PZT-based stress sensor for impact stress measurement, a drop hammer was designed, and calibration tests on the sensitivity of the proposed transducer were carried out with an industrial charge amplifier. The voltage output of the stress sensor and the impact force for different free-fall heights and impact masses were recorded with a high-sampling-rate data acquisition system, and the sensitivity of the PZT-based stress sensor was determined from these measurements. Results show that the output of the PZT-based stress sensor is proportional to the stress level and that the repeatability of the measurement is very good. The self-made piezoelectric stress sensor can be easily embedded in concrete and provide

  20. A Modernized UDM-600 Dynamometer-Based Setup for the Cutting Force Measurement

    Directory of Open Access Journals (Sweden)

    Ya. I. Shuliak

    2016-01-01

    Full Text Available The article considers the development of a modernized UDM-600 dynamometer-based setup for measuring the cutting force components. Modernizing existing equipment to improve the recording of cutting force components in automated mode is a relevant task. The measuring setup allows the cutting force components to be recorded in turning and milling, as well as the axial force and torque in drilling and milling operations. The article presents a block diagram and a schematic diagram of the setup for measuring the cutting force components and describes the basic principle of the measuring units within the modernized setup. The developed setup uses a half-bridge strain gauge measuring circuit to record the cutting forces. A 16-channel LA-UN16 amplifier with discretely adjustable gain is used to enhance the measuring circuit output voltage, and an NI USB-6009 data acquisition device, which transmits the received data to a PC via a USB interface, is used to record and process the electrical signals. The data acquisition device has a built-in stabilized DC power supply that powers the strain gauge bridges. The developed schematic diagram of the measuring setup allowed this measuring device to be realized and its modernization implemented. Final processing of the recorded data is performed with software developed in the LabVIEW 9.0 visual programming environment; the program displays the measured values of the cutting force components graphically in real time and records the data to a text file. The modernization of the measuring setup increased measurement accuracy and reduced the time needed to process and analyze the experimental data obtained when measuring the cutting force components. The MT2 Department of BMSTU uses it in education and research activities and in experimental work and laboratory classes.

  1. Measurement of renal function in a kidney donor: a comparison of creatinine-based and volume-based GFRs

    International Nuclear Information System (INIS)

    Choi, Don Kyoung; Choi, See Min; Jeong, Byong Chang; Seo, Seong Il; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han-Yong; Jeon, Hwang Gyun; Park, Bong Hee

    2015-01-01

    We aimed to evaluate the performance of various GFR estimates compared with direct measurement of GFR (dGFR). We also sought to create a new formula for volume-based GFR (new-vGFR) using kidney volume determined by CT. GFR was measured using creatinine-based methods (MDRD, the Cockcroft-Gault equation, CKD-EPI formula, and the Mayo clinic formula) and the Herts method, which is volume-based (vGFR). We compared performance between GFR estimates and created a new vGFR model by multiple linear regression analysis. Among the creatinine-based GFR estimates, the MDRD and C-G equations were similarly associated with dGFR (correlation and concordance coefficients of 0.359 and 0.369 and 0.354 and 0.318, respectively). We developed the following new kidney volume-based GFR formula: 217.48 - 0.39×A + 0.25×W - 0.46×H - 54.01×sCr + 0.02×V - 19.89 (if female) (A = age, W = weight, H = height, sCr = serum creatinine level, V = total kidney volume). The MDRD and CKD-EPI had relatively better accuracy than the other creatinine-based methods (30.7 % vs. 32.3 % within 10 % and 78.0 % vs. 73.0 % within 30 %, respectively). However, the new-vGFR formula had the most accurate results among all of the analyzed methods (37.4 % within 10 % and 84.6 % within 30 %). The new-vGFR can replace dGFR or creatinine-based GFR for assessing kidney function in donors and healthy individuals. (orig.)
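
    Written out as code, the new volume-based formula quoted above is simply the following (the input units are assumptions, since the abstract does not state them explicitly):

```python
def new_vgfr(age, weight, height, s_cr, kidney_volume, female):
    """Volume-based GFR exactly as printed in the abstract:
    217.48 - 0.39*A + 0.25*W - 0.46*H - 54.01*sCr + 0.02*V - 19.89 (if female)."""
    gfr = (217.48 - 0.39 * age + 0.25 * weight - 0.46 * height
           - 54.01 * s_cr + 0.02 * kidney_volume)
    return gfr - 19.89 if female else gfr
```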

  2. Measurement of renal function in a kidney donor: a comparison of creatinine-based and volume-based GFRs

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Don Kyoung; Choi, See Min; Jeong, Byong Chang; Seo, Seong Il; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han-Yong; Jeon, Hwang Gyun [Sungkyunkwan University School of Medicine, Department of Urology, Samsung Medical Center, Seoul (Korea, Republic of); Park, Bong Hee [The Catholic University of Korea College of Medicine, Department of Urology, Incheon St. Mary' s Hospital, Seoul (Korea, Republic of)

    2015-11-15

    We aimed to evaluate the performance of various GFR estimates compared with direct measurement of GFR (dGFR). We also sought to create a new formula for volume-based GFR (new-vGFR) using kidney volume determined by CT. GFR was measured using creatinine-based methods (MDRD, the Cockcroft-Gault equation, CKD-EPI formula, and the Mayo clinic formula) and the Herts method, which is volume-based (vGFR). We compared performance between GFR estimates and created a new vGFR model by multiple linear regression analysis. Among the creatinine-based GFR estimates, the MDRD and C-G equations were similarly associated with dGFR (correlation and concordance coefficients of 0.359 and 0.369 and 0.354 and 0.318, respectively). We developed the following new kidney volume-based GFR formula: 217.48 - 0.39 × A + 0.25 × W - 0.46 × H - 54.01 × sCr + 0.02 × V - 19.89 (if female) (A = age, W = weight, H = height, sCr = serum creatinine level, V = total kidney volume). The MDRD and CKD-EPI had relatively better accuracy than the other creatinine-based methods (30.7 % vs. 32.3 % within 10 % and 78.0 % vs. 73.0 % within 30 %, respectively). However, the new-vGFR formula had the most accurate results among all of the analyzed methods (37.4 % within 10 % and 84.6 % within 30 %). The new-vGFR can replace dGFR or creatinine-based GFR for assessing kidney function in donors and healthy individuals. (orig.)

  3. Measures of Coupling between Neural Populations Based on Granger Causality Principle.

    Science.gov (United States)

    Kaminski, Maciej; Brzezicka, Aneta; Kaminski, Jan; Blinowska, Katarzyna J

    2016-01-01

    This paper briefly reviews the measures used to estimate neural synchronization in experimental settings. Our focus is on multivariate measures of dependence based on the Granger causality (G-causality) principle, their applications, and their performance with respect to robustness to noise, volume conduction, common driving, and the presence of a "weak node." Application of G-causality measures to EEG, intracranial signals, and fMRI time series is addressed. G-causality-based measures defined in the frequency domain allow the synchronization between neural populations and the directed propagation of their electrical activity to be determined. The time-varying G-causality-based measure, the Short-time Directed Transfer Function (SDTF), supplies information on the dynamics of synchronization and the organization of neural networks. Inspection of effective connectivity patterns indicates a modular structure of neural networks, with stronger coupling within modules than between them. The plausible mechanism of information processing suggested by the identified synchronization patterns is communication between tightly coupled modules intermitted by sparser interactions that synchronize distant structures.
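
    As a hedged illustration of the frequency-domain, G-causality-based measures discussed above, the sketch below fits a multivariate autoregressive model with statsmodels and computes a (non-time-varying) Directed Transfer Function; the SDTF described in the abstract would additionally apply short-time windowing and ensemble averaging, which is omitted here. The model order, sampling rate, and frequency grid are example parameters.

    ```python
    import numpy as np
    from statsmodels.tsa.api import VAR

    def dtf(data, order=5, fs=1.0, nfreq=64):
        """Directed Transfer Function sketch.

        data : (samples, channels) array of signals
        Returns (freqs, gamma) where gamma[f, i, j] is the normalized inflow
        from channel j to channel i at frequency freqs[f].
        """
        res = VAR(data).fit(order)
        A = res.coefs                     # shape (order, k, k): A_1 ... A_p
        k = A.shape[1]
        freqs = np.linspace(0.0, fs / 2.0, nfreq)
        gamma = np.zeros((nfreq, k, k))
        for fi, f in enumerate(freqs):
            # A(f) = I - sum_k A_k * exp(-i 2 pi f k / fs); H(f) = A(f)^-1
            Af = np.eye(k, dtype=complex)
            for lag in range(order):
                Af -= A[lag] * np.exp(-2j * np.pi * f * (lag + 1) / fs)
            H = np.linalg.inv(Af)
            # Row-wise normalization of |H| gives the DTF.
            gamma[fi] = np.abs(H) / np.sqrt((np.abs(H) ** 2).sum(axis=1, keepdims=True))
        return freqs, gamma
    ```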

  4. Calibration of a flexible measurement system based on industrial articulated robot and structured light sensor

    Science.gov (United States)

    Mu, Nan; Wang, Kun; Xie, Zexiao; Ren, Ping

    2017-05-01

    To realize online rapid measurement for complex workpieces, a flexible measurement system based on an articulated industrial robot with a structured light sensor mounted on the end-effector is developed. A method for calibrating the system parameters is proposed in which the hand-eye transformation parameters and the robot kinematic parameters are synthesized in the calibration process. An initial hand-eye calibration is first performed using a standard sphere as the calibration target. By applying the modified complete and parametrically continuous method, we establish a synthesized kinematic model that combines the initial hand-eye transformation and distal link parameters as a whole, with the sensor coordinate system as the tool frame. According to the synthesized kinematic model, an error model is constructed based on the spheres' center-to-center distance errors. Consequently, the error model parameters can be identified in a calibration experiment using a three-standard-sphere target. Furthermore, the redundancy of the error model parameters is eliminated to ensure the accuracy and robustness of the parameter identification. Calibration and measurement experiments are carried out based on an ER3A-C60 robot. The experimental results show that the proposed calibration method achieves high measurement accuracy, and that this efficient and flexible system is suitable for online measurement in industrial settings.
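
    As a rough illustration of the distance-error idea only (not the authors' full synthesized kinematic error model), the residual function below transforms sphere centres measured in the sensor frame into the robot base frame using candidate hand-eye and forward-kinematics transforms, and compares consecutive centre-to-centre distances against certified nominal values for the sphere target. The variable names and the homogeneous 4x4-matrix representation are assumptions.

    ```python
    import numpy as np

    def distance_residuals(T_base_ee_list, T_ee_sensor, centers_sensor, nominal_dists):
        """Residuals of a sphere-distance calibration objective.

        T_base_ee_list : list of 4x4 robot poses (base -> end-effector), one per measurement
        T_ee_sensor    : 4x4 candidate hand-eye transform (end-effector -> sensor)
        centers_sensor : list of 3-vectors, sphere centres fitted in the sensor frame
        nominal_dists  : certified centre-to-centre distances between consecutive spheres
        """
        centers_base = []
        for T_be, c in zip(T_base_ee_list, centers_sensor):
            p = T_be @ T_ee_sensor @ np.append(c, 1.0)   # homogeneous point in base frame
            centers_base.append(p[:3])
        centers_base = np.asarray(centers_base)
        measured = np.linalg.norm(np.diff(centers_base, axis=0), axis=1)
        return measured - np.asarray(nominal_dists)
    ```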

  5. Cloud fraction and cloud base measurements from scanning Doppler lidar during WFIP-2

    Science.gov (United States)

    Bonin, T.; Long, C.; Lantz, K. O.; Choukulkar, A.; Pichugina, Y. L.; McCarty, B.; Banta, R. M.; Brewer, A.; Marquis, M.

    2017-12-01

    The second Wind Forecast Improvement Project (WFIP-2) consisted of an 18-month field deployment of a variety of instrumentation with the principal objective of validating and improving NWP forecasts for wind energy applications in complex terrain. As a part of the set of instrumentation, several scanning Doppler lidars were installed across the study domain primarily to measure profiles of the mean wind and turbulence at high resolution within the planetary boundary layer. In addition to these measurements, Doppler lidar observations can be used to directly quantify the cloud fraction and cloud base, since clouds appear as a high backscatter return. These supplementary measurements of clouds can then be used to validate cloud cover and other properties in NWP output. Herein, statistics of the cloud fraction and cloud base height from the duration of WFIP-2 are presented. Additionally, these cloud fraction estimates from Doppler lidar are compared with similar measurements from a Total Sky Imager and Radiative Flux Analysis (RadFlux) retrievals at the Wasco site. During mostly cloudy to overcast conditions, estimates of the cloud radiating temperature from the RadFlux methodology are also compared with Doppler lidar measured cloud base height.
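
    Since clouds show up as a strong backscatter return, a first-pass cloud-base and cloud-fraction estimate can be sketched as a simple per-profile threshold search, as below. The threshold value and array layout are assumptions for illustration, not the WFIP-2 processing chain.

    ```python
    import numpy as np

    def cloud_base_and_fraction(backscatter, ranges, threshold):
        """backscatter : (profiles, gates) attenuated backscatter from the lidar
        ranges        : (gates,) range-gate heights in metres
        threshold     : backscatter level above which a gate is called cloudy (assumed)

        Returns per-profile cloud-base height (NaN if clear) and the cloud fraction,
        i.e. the share of profiles containing at least one cloudy gate.
        """
        hits = backscatter > threshold
        has_cloud = hits.any(axis=1)
        first_gate = np.argmax(hits, axis=1)          # index of first cloudy gate
        cloud_base = np.where(has_cloud, ranges[first_gate], np.nan)
        cloud_fraction = has_cloud.mean()
        return cloud_base, cloud_fraction
    ```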

  6. Measures of coupling between neural populations based on Granger causality principle

    Directory of Open Access Journals (Sweden)

    Maciej Kaminski

    2016-10-01

    Full Text Available This paper briefly reviews the measures used to estimate neural synchronization in experimental settings. Our focus is on multivariate measures of dependence based on the Granger causality (G-causality) principle, their applications, and their performance with respect to robustness to noise, volume conduction, common driving, and the presence of a "weak node." Application of G-causality measures to EEG, intracranial signals, and fMRI time series is addressed. G-causality-based measures defined in the frequency domain allow the synchronization between neural populations and the directed propagation of their electrical activity to be determined. The time-varying G-causality-based measure, the Short-time Directed Transfer Function (SDTF), supplies information on the dynamics of synchronization and the organization of neural networks. Inspection of effective connectivity patterns indicates a modular structure of neural networks, with stronger coupling within modules than between them. The plausible mechanism of information processing suggested by the identified synchronization patterns is communication between tightly coupled modules intermitted by sparser interactions that synchronize distant structures.

  7. A compact wideband precision impedance measurement system based on digital auto-balancing bridge

    International Nuclear Information System (INIS)

    Hu, Binxin; Wang, Jinyu; Song, Guangdong; Zhang, Faxiang

    2016-01-01

    AC impedance spectroscopy measurements are predominantly made with impedance analyzers based on an analog auto-balancing bridge. However, those bench-top analyzers are generally complicated, bulky, and expensive, which limits their usage in industrial field applications. This paper presents the development of a compact wideband precision measurement system based on a digital auto-balancing bridge. The digital auto-balancing bridge and digital lock-in amplifier methods are analyzed theoretically. The overall design and several key sections, including the null detector, the direct digital synthesizer-based sampling clock, and the digital control unit, are introduced in detail. The results show that the system achieves a basic measurement accuracy of 0.05% over a frequency range of 20 Hz–2 MHz. The advantages of versatile measurement capability, fast measurement speed, small size, and low cost make it quite suitable for industrial field applications. The system is shown to be practical and effective by applying it to determine the impedance–temperature characteristic of a motor-starter PTC thermistor. (paper)
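
    The digital lock-in amplifier mentioned above can be illustrated with a minimal quadrature-demodulation sketch: multiply the sampled signal by a complex reference at the excitation frequency and average over an integer number of periods; the impedance then follows as the ratio of the voltage-channel and current-channel phasors. This is a schematic of the principle under assumed signal names, not the instrument's actual processing.

    ```python
    import numpy as np

    def lock_in(samples, fs, f_ref):
        """Return (amplitude, phase) of the f_ref component via quadrature demodulation.
        For best rejection of other components, len(samples)/fs should span an integer
        number of reference periods."""
        t = np.arange(len(samples)) / fs
        phasor = 2.0 * np.mean(samples * np.exp(-2j * np.pi * f_ref * t))
        return np.abs(phasor), np.angle(phasor)

    # Schematic impedance estimate from two digitized bridge signals (assumed names):
    # v_amp, v_ph = lock_in(v_dut, fs, f_exc)      # voltage across the device under test
    # i_amp, i_ph = lock_in(i_sense, fs, f_exc)    # current-sense signal
    # z = (v_amp / i_amp) * np.exp(1j * (v_ph - i_ph))
    ```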

  8. Beam Based RF Voltage Measurements and Longitudinal Beam Tomography at the Fermilab Booster

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, C. M. [Fermilab; Bhat, S. [Fermilab

    2017-10-19

    Increasing proton beam power on neutrino production targets is one of the major goals of the Fermilab long-term accelerator programs. In this effort, the Fermilab 8 GeV Booster synchrotron plays a critical role for at least the next two decades. Therefore, understanding the Booster in great detail is important as we continue to improve its performance. For example, it is important to know accurately the available RF power in the Booster by carrying out beam-based measurements in order to specify the needed upgrades to the Booster RF system. Since the Booster magnetic field changes continuously, measuring and calibrating the RF voltage is not a trivial task. Here, we present a beam-based method for RF voltage measurements. Data analysis is carried out using computer programs developed in Python and MATLAB. The method presented here is applicable to any rapid-cycling synchrotron (RCS) that does not have a flat bottom and flat top in its acceleration magnetic ramp. We have also carried out longitudinal beam tomography at injection and extraction energies using the data acquired for the RF voltage measurements. Beam-based RF voltage measurements and beam tomography had never before been performed for the Fermilab Booster. The results from these investigations will be very useful in future intensity upgrades.

  9. Magneto-acousto-electrical Measurement Based Electrical Conductivity Reconstruction for Tissues.

    Science.gov (United States)

    Zhou, Yan; Ma, Qingyu; Guo, Gepu; Tu, Juan; Zhang, Dong

    2018-05-01

    Based on the interaction of ultrasonic excitation and magnetoelectrical induction, magneto-acousto-electrical (MAE) technology was demonstrated to have the capability of differentiating conductivity variations along the acoustic transmission. By applying the characteristics of the MAE voltage, a simplified algorithm for MAE measurement-based conductivity reconstruction was developed. Through analyses of acoustic vibration, ultrasound propagation, the Hall effect, and magnetoelectrical induction, theoretical and experimental studies of MAE measurement and conductivity reconstruction were performed. The formula of the MAE voltage was derived and simplified for a transducer with strong directivity. The MAE voltage was simulated for a three-layer gel phantom and the conductivity distribution was reconstructed using the modified Wiener inverse filter and the Hilbert transform, which was also verified by experimental measurements. The experimental results are basically consistent with the simulations, and demonstrate that the wave packets of the MAE voltage are generated at tissue interfaces, with the amplitudes and vibration polarities representing the values and directions of conductivity variations. With the proposed algorithm, the amplitude and polarity of the conductivity gradient can be restored and the conductivity distribution can be reconstructed accurately. The favorable results demonstrate the feasibility of accurate conductivity reconstruction with improved spatial resolution using MAE measurement for tissues with conductivity variations, and the method is especially suitable for nondispersive tissues with abrupt conductivity changes. This study demonstrates that the MAE measurement-based conductivity reconstruction algorithm can be applied as a new strategy for nondestructive real-time monitoring of conductivity variations in biomedical engineering.
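
    The reconstruction step names two standard signal-processing tools, a modified Wiener inverse filter and the Hilbert transform. The generic forms below, with an assumed acoustic-pulse kernel and a scalar noise-to-signal regularizer, show the idea rather than the authors' exact modification.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def wiener_deconvolve(measured, kernel, noise_to_signal=0.01):
        """Wiener inverse filtering: recover the source profile from the MAE voltage,
        assuming 'kernel' approximates the acoustic pulse and noise_to_signal is a
        scalar regularization constant (both assumptions)."""
        n = len(measured)
        M = np.fft.rfft(measured, n)
        K = np.fft.rfft(kernel, n)
        W = np.conj(K) / (np.abs(K) ** 2 + noise_to_signal)
        return np.fft.irfft(M * W, n)

    def envelope(x):
        """Hilbert-transform envelope, used to locate interface responses."""
        return np.abs(hilbert(x))
    ```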

  10. Active load reduction using individual pitch, based on local blade flow measurements

    DEFF Research Database (Denmark)

    Larsen, Torben J.; Aagaard Madsen, H.; Thomsen, K.

    2005-01-01

    A new load-reducing control strategy for individual blade control of large pitch-controlled wind turbines is presented. This control concept is based on local blade inflow measurements and offers the possibility of larger load reductions, without loss of power production, than seen in other state-of-the-art load-reducing concepts. Since the new flow-based concept deviates significantly from previously published load-reducing strategies, a comparison of the performance based on aeroelastic simulations is included. Advantages and drawbacks of the systems are discussed. Copyright (C) 2004 John Wiley & Sons, Ltd.

  11. Weighted Evidence Combination Rule Based on Evidence Distance and Uncertainty Measure: An Application in Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2018-01-01

    Full Text Available Conflict management in Dempster-Shafer theory (D-S theory) is a hot topic in information fusion. In this paper, a novel weighted evidence combination rule based on evidence distance and an uncertainty measure is proposed. The proposed approach consists of two steps. First, the weight is determined based on the evidence distance. Then, the weight value obtained in the first step is modified by taking advantage of the uncertainty measure. The proposed method can efficiently handle highly conflicting evidence with better convergence performance. A numerical example and an application based on sensor fusion in fault diagnosis are given to demonstrate the efficiency of the proposed method.
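
    As a sketch of the combination step described above, the snippet implements Dempster's rule for mass functions keyed by frozensets, followed by a Murphy-style weighted-average fusion. How the weights themselves are derived from evidence distance and then modified by the uncertainty measure follows the paper and is not reproduced here, so `weights` is taken as given.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Dempster's rule for two mass functions given as {frozenset: mass} dicts."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y                     # mass assigned to the empty set
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    def weighted_combination(masses, weights):
        """Weighted-average the bodies of evidence, then fuse the average with itself
        len(masses)-1 times via Dempster's rule."""
        keys = set().union(*masses)
        avg = {k: sum(w * m.get(k, 0.0) for m, w in zip(masses, weights)) for k in keys}
        result = avg
        for _ in range(len(masses) - 1):
            result = dempster_combine(result, avg)
        return result

    # Example on a frame {a, b}: two agreeing sensors and one conflicting one,
    # with illustrative weights.
    A, B = frozenset("a"), frozenset("b")
    m1 = {A: 0.9, B: 0.1}
    m2 = {A: 0.8, B: 0.2}
    m3 = {A: 0.1, B: 0.9}
    print(weighted_combination([m1, m2, m3], [0.45, 0.45, 0.10]))
    ```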

  12. Multiple Measures of Outcome in Assessing a Prison-Based Drug Treatment Program

    Science.gov (United States)

    Prendergast, Michael L.; Hall, Elizabeth A.; Wexler, Harry K.

    2003-01-01

    Evaluations of prison-based drug treatment programs typically focus on one or two dichotomous outcome variables related to recidivism. In contrast, this paper uses multiple measures of outcomes related to crime and drug use to examine the impact of prison treatment. Crime variables included self-report data of time to first illegal activity,…

  13. Measurement of reaction heats using a polysilicon-based microcalorimetric sensor

    NARCIS (Netherlands)

    Vereshchagina, E.; Wolters, Robertus A.M.; Gardeniers, Johannes G.E.

    2011-01-01

    In this work we present a low-cost, low-power, small sample volume microcalorimetric sensor for the measurement of reaction heats. The polysilicon-based microcalorimetric sensor combines several advantages: (i) complementary metal oxide semiconductor technology (CMOS) for future integration; (ii)

  14. Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis

    Science.gov (United States)

    2016-07-31

    ...vital importance for hydrocarbon-fueled propulsion systems: fuel thermal performance as indicated by physical and chemical effects of cooling passage... analysis. The selection and acquisition of a set of chemically diverse fuels is pivotal for a successful outcome, since test method validation and...

  15. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    Science.gov (United States)

    Weiss, Brandi A.; Dardick, William

    2016-01-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify…
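
    One simple, hedged illustration of an entropy-style fuzziness index for a fitted logistic model is the mean binary Shannon entropy of the predicted probabilities: it is 0 when every prediction is 0 or 1 and 1 when every prediction equals 0.5. The article's actual measure may be defined or normalized differently; this is only an illustrative computation.

    ```python
    import numpy as np

    def mean_prediction_entropy(p, eps=1e-12):
        """Average binary Shannon entropy (in bits) of predicted probabilities p in (0, 1).
        0 = perfectly separated predictions, 1 = maximally fuzzy (all p = 0.5)."""
        p = np.clip(np.asarray(p, dtype=float), eps, 1.0 - eps)
        h = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))
        return h.mean()

    # Example: confident predictions give a low value, uncertain ones a high value.
    print(mean_prediction_entropy([0.05, 0.95, 0.9, 0.1]))
    print(mean_prediction_entropy([0.5, 0.55, 0.45, 0.5]))
    ```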

  16. Complete methodology on generating realistic wind speed profiles based on measurements

    DEFF Research Database (Denmark)

    Gavriluta, Catalin; Spataru, Sergiu; Mosincat, Ioan

    2012-01-01

    ...wind modelling for medium and large time scales is poorly treated in the present literature. This paper presents methods for generating realistic wind speed profiles based on real measurements. The wind speed profile is divided into a low-frequency component (describing long-term variations...

  17. Quantification of in situ temperature measurements on a PBI-based high temperature PEMFC unit cell

    DEFF Research Database (Denmark)

    Lebæk, Jesper; Ali, Syed Talat; Møller, Per

    2010-01-01

    The temperature is a very important operating parameter for all types of fuel cells. In the present work, distributed in situ temperature measurements are presented for a polybenzimidazole-based high-temperature PEM fuel cell (HT-PEM). A total of 16 T-type thermocouples were embedded on both the an...

  18. Measuring Collaboration and Communication to Increase Implementation of Evidence-Based Practices: The Cultural Exchange Inventory

    Science.gov (United States)

    Palinkas, Lawrence A.; Garcia, Antonio; Aarons, Gregory; Finno-Velasquez, Megan; Fuentes, Dahlia; Holloway, Ian; Chamberlain, Patricia

    2018-01-01

    The Cultural Exchange Inventory (CEI) is a 15-item instrument designed to measure the process (7 items) and outcomes (8 items) of exchanges of knowledge, attitudes and practices between members of different organisations collaborating in implementing evidence-based practice. We conducted principal axis factor analyses and parallel analyses of data…

  19. Generalizability Theory Reliability of Written Expression Curriculum-Based Measurement in Universal Screening

    Science.gov (United States)

    Keller-Margulis, Milena A.; Mercer, Sterett H.; Thomas, Erin L.

    2016-01-01

    The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African…

  20. Measurement-based Evaluation of the Impact of Large Vehicle Shadowing on V2X Communications

    DEFF Research Database (Denmark)

    Rodriguez, Ignacio; Portela Lopes de Almeida, Erika; Lauridsen, Mads

    2016-01-01

    Upcoming applications, such as autonomous vehicles, will pose strict requirements on the vehicular networks. In order to provide these new services reliably, an accurate understanding of propagation in the vehicular scenarios is needed. In this context, this paper presents a measurement-based eva...