WorldWideScience

Sample records for previous measurements based

  1. The impact of previous knee injury on force plate and field-based measures of balance.

    Science.gov (United States)

    Baltich, Jennifer; Whittaker, Jackie; Von Tscharner, Vinzenz; Nettel-Aguirre, Alberto; Nigg, Benno M; Emery, Carolyn

    2015-10-01

    Individuals with post-traumatic osteoarthritis demonstrate increased sway during quiet stance. The prospective association between balance and disease onset is unknown. Improved understanding of balance in the period between joint injury and disease onset could inform secondary prevention strategies to prevent or delay the disease. This study examines the association between youth sport-related knee injury and balance, 3-10 years post-injury. Participants included 50 individuals (ages 15-26 years) with a sport-related intra-articular knee injury sustained 3-10 years previously and 50 uninjured age-, sex- and sport-matched controls. Force-plate measures during single-limb stance (center-of-pressure 95% ellipse-area, path length, excursion, entropic half-life) and field-based balance scores (triple single-leg hop, star-excursion, unipedal dynamic balance) were collected. Descriptive statistics (mean within-pair difference; 95% confidence intervals) were used to compare groups. Linear regression (adjusted for injury history) was used to assess the relationship between ellipse-area and field-based scores. Injured participants on average demonstrated greater medio-lateral excursion [mean within-pair difference (95% confidence interval); 2.8 mm (1.0, 4.5)], more regular medio-lateral position [10 ms (2, 18)], and shorter triple single-leg hop distances [-30.9% (-8.1, -53.7)] than controls, while no between-group differences existed for the remaining outcomes. After taking injury history into consideration, triple single-leg hop scores demonstrated a linear association with ellipse area (β=0.52, 95% confidence interval 0.01, 1.01). On average the injured participants adjusted their position less frequently and demonstrated a larger magnitude of movement during single-limb stance compared to controls. These findings support the evaluation of balance outcomes in the period between knee injury and post-traumatic osteoarthritis onset. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Combining complexity measures of EEG data: multiplying measures reveal previously hidden information.

    Science.gov (United States)

    Burns, Thomas; Rajan, Ramesh

    2015-01-01

    Many studies have noted significant differences among human electroencephalograph (EEG) results when participants or patients are exposed to different stimuli, undertake different tasks, or are affected by conditions such as epilepsy or Alzheimer's disease. Such studies often use only one or two measures of complexity and do not regularly justify their choice of measure beyond the fact that it has been used in previous studies. If more measures were added to such studies, however, more complete information might be found about these reported differences. Such information might be useful in confirming the existence or extent of such differences, or in understanding their physiological bases. In this study we analysed publicly available EEG data using a range of complexity measures to determine how well the measures correlated with one another. The complexity measures did not all significantly correlate, suggesting that different measures capture unique features of the EEG signals and thus reveal information which other measures are unable to detect. The results from this analysis therefore suggest that combinations of complexity measures reveal unique information beyond that captured by any single measure of complexity in EEG data. For this reason, researchers using individual complexity measures for EEG data should consider using combinations of measures to more completely account for any differences they observe and to ensure the robustness of any relationships identified.
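
As a rough illustration of the kind of analysis described, computing several complexity measures over the same signals and checking how strongly they correlate, the sketch below uses two simple measures (normalized spectral entropy and a Lempel-Ziv complexity estimate) on synthetic signals; the measures, signals and parameters are illustrative assumptions, not those used in the study.

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectrum, scaled to [0, 1]."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(len(p)))

def lempel_ziv_complexity(x):
    """Normalized LZ76-style complexity of the signal binarized at its median."""
    m = np.median(x)
    s = ''.join('1' if v > m else '0' for v in x)
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the current phrase while it already appears in the prefix
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c * np.log2(n) / n

rng = np.random.default_rng(0)
t = np.arange(2000) / 200.0
signals = {
    "noise": rng.standard_normal(2000),
    "alpha-like": np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(2000),
    "mixed": np.sin(2 * np.pi * 10 * t) + rng.standard_normal(2000),
}
se = [spectral_entropy(x) for x in signals.values()]
lz = [lempel_ziv_complexity(x) for x in signals.values()]
r = np.corrcoef(se, lz)[0, 1]  # how strongly the two measures agree
print(dict(zip(signals, zip(se, lz))), "r =", r)
```

If the two measures correlated perfectly there would be no point combining them; partial correlation, as found in the study, is what indicates that each captures distinct signal features.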

  3. First Ground-Based Infrared Solar Absorption Measurements of Free Tropospheric Methanol (CH3OH): Multidecade Infrared Time Series from Kitt Peak (31.9 deg N 111.6 deg W): Trend, Seasonal Cycle, and Comparison with Previous Measurements

    Science.gov (United States)

    Rinsland, Curtis P.; Mahieu, Emmanuel; Chiou, Linda; Herbin, Herve

    2009-01-01

    An atmospheric CH3OH (methanol) free tropospheric (2.09-14-km altitude) time series spanning 22 years has been analyzed on the basis of high-spectral-resolution infrared solar absorption spectra of the strong ν8 band recorded from the U.S. National Solar Observatory on Kitt Peak (latitude 31.9 deg N, 111.6 deg W, 2.09-km altitude) with a 1-m Fourier transform spectrometer (FTS). The measurements span October 1981 to December 2003 and are the first long time series of CH3OH measurements obtained from the ground. The results were analyzed with SFIT2 version 3.93 and show a factor-of-three variation with season, a maximum at the beginning of July, a winter minimum, and no statistically significant long-term trend over the measurement time span.

  4. Multispecies Coevolution Particle Swarm Optimization Based on Previous Search History

    Directory of Open Access Journals (Sweden)

    Danping Wang

    2017-01-01

    A hybrid coevolution particle swarm optimization algorithm with a dynamic multispecies strategy based on K-means clustering and a nonrevisit strategy based on a Binary Space Partitioning fitness tree (called MCPSO-PSH) is proposed. Previous search history memorized in the Binary Space Partitioning fitness tree can effectively restrain the individuals' revisit phenomenon. The whole population is partitioned into several subspecies, and cooperative coevolution is realized by an information communication mechanism between subspecies, which can enhance the global search ability of particles and avoid premature convergence to local optima. To demonstrate the power of the method, comparisons between the proposed algorithm and state-of-the-art algorithms are grouped into three categories: 10 basic benchmark functions (10-dimensional and 30-dimensional), 10 CEC2005 benchmark functions (30-dimensional), and a real-world problem (multilevel image segmentation). Experimental results show that MCPSO-PSH displays competitive performance compared to other swarm-based or evolutionary algorithms in terms of solution accuracy and statistical tests.
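
The non-revisit idea can be illustrated with a deliberately simplified sketch: a plain global-best PSO whose positions are hashed into grid cells, with already-visited cells re-sampled. The grid-cell archive stands in for the paper's Binary Space Partitioning fitness tree, the multispecies/K-means machinery is omitted entirely, and all parameter values are illustrative.

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def pso_nonrevisit(f, dim=2, n_particles=10, iters=200, bound=5.0, seed=1):
    """Global-best PSO with a simplified non-revisit archive (grid-cell hash)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-bound, bound, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()
    visited = {tuple(np.round(p, 1)) for p in x}
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # standard inertia + cognitive + social velocity update
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, -bound, bound)
        for i in range(n_particles):
            cell = tuple(np.round(x[i], 1))
            if cell in visited:  # non-revisit: jump to a fresh random point
                x[i] = rng.uniform(-bound, bound, dim)
                cell = tuple(np.round(x[i], 1))
            visited.add(cell)
            fi = f(x[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i].copy(), fi
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(f(g))

best, best_f = pso_nonrevisit(sphere)
print(best, best_f)
```

The visited-cell check is what forces exploration of new regions; the BSP fitness tree in the paper serves the same purpose but adapts the partition resolution to the search history instead of using a fixed grid.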

  5. In vivo dentate nucleus MRI relaxometry correlates with previous administration of Gadolinium-based contrast agents

    Energy Technology Data Exchange (ETDEWEB)

    Tedeschi, Enrico; Canna, Antonietta; Cocozza, Sirio; Russo, Carmela; Angelini, Valentina; Brunetti, Arturo [University "Federico II", Neuroradiology, Department of Advanced Biomedical Sciences, Naples (Italy)]; Palma, Giuseppe; Quarantelli, Mario [National Research Council, Institute of Biostructure and Bioimaging, Naples (Italy)]; Borrelli, Pasquale; Salvatore, Marco [IRCCS SDN, Naples (Italy)]; Lanzillo, Roberta; Postiglione, Emanuela; Morra, Vincenzo Brescia [University "Federico II", Department of Neurosciences, Reproductive and Odontostomatological Sciences, Naples (Italy)]

    2016-12-15

    To evaluate changes in T1 and T2* relaxometry of the dentate nuclei (DN) with respect to the number of previous administrations of Gadolinium-based contrast agents (GBCA). In 74 relapsing-remitting multiple sclerosis (RR-MS) patients with variable disease duration (9.8±6.8 years) and severity (Expanded Disability Status Scale scores: 3.1±0.9), the DN R1 (1/T1) and R2* (1/T2*) relaxation rates were measured using two unenhanced 3D dual-echo spoiled gradient-echo sequences with different flip angles. Correlations of the number of previous GBCA administrations with DN R1 and R2* relaxation rates were tested, including gender and age effects, in a multivariate regression analysis. The DN R1 (normalized by brainstem) significantly correlated with the number of GBCA administrations (p<0.001), maintaining the same significance even when including MS-related factors. Instead, the DN R2* values correlated only with age (p=0.003), and not with GBCA administrations (p=0.67). In a subgroup of 35 patients for whom the administered GBCA subtype was known, the effect of GBCA on DN R1 appeared mainly related to linear GBCA. In RR-MS patients, the number of previous GBCA administrations correlates with R1 relaxation rates of the DN, while R2* values remain unaffected, suggesting that T1-shortening in these patients is related to the amount of Gadolinium given. (orig.)

  6. Attribute and topology based change detection in a constellation of previously detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, Reginald N.

    2016-01-19

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
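
A toy version of the matching step described above might look as follows; the object records, attribute names, and thresholds are all invented for illustration, and a real system would query the constellation database rather than in-memory lists.

```python
import math

# Hypothetical records: each previously detected object has an id, a
# location, and simple attributes (here: size).
constellation = [
    {"id": 1, "x": 10.0, "y": 5.0, "size": 2.0},
    {"id": 2, "x": 40.0, "y": 12.0, "size": 1.5},
    {"id": 3, "x": 70.0, "y": 30.0, "size": 3.0},
]
latest_scan = [
    {"x": 10.2, "y": 5.1, "size": 2.1},   # matches object 1
    {"x": 41.0, "y": 12.0, "size": 3.5},  # object 2, size changed
    {"x": 95.0, "y": 44.0, "size": 1.0},  # brand new object
]

def detect_changes(db, scan, max_dist=2.0, max_dsize=1.0):
    """Compare new detections to the constellation of previous objects."""
    changes, matched = [], set()
    for obj in scan:
        # nearest previously detected object
        near = min(db, key=lambda o: math.hypot(o["x"] - obj["x"], o["y"] - obj["y"]))
        d = math.hypot(near["x"] - obj["x"], near["y"] - obj["y"])
        if d > max_dist:
            changes.append(("new", obj))
        else:
            matched.add(near["id"])
            if abs(near["size"] - obj["size"]) > max_dsize:
                changes.append(("attribute_change", near["id"]))
    # previously detected objects with no counterpart in the latest scan
    changes += [("missing", o["id"]) for o in db if o["id"] not in matched]
    return changes

print(detect_changes(constellation, latest_scan))
```

The patented system additionally exploits the topology of the network (relative geometry between objects), which this sketch ignores; matching on attributes alone is just the simplest instance of the idea.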

  7. Ionospheric measurements during the CRISTA/MAHRSI campaign: their implications and comparison with previous campaigns

    Directory of Open Access Journals (Sweden)

    J. Laštovicka

    1999-08-01

    The CRISTA/MAHRSI experiment on board a space shuttle was accompanied by a broad campaign of rocket, balloon and ground-based measurements. Supporting lower-ionospheric ground-based measurements were run in Europe and Eastern Asia between 1 October and 30 November 1994. Results of comparisons with long ionospheric data series, together with short-term comparisons inside the interval October-November 1994, showed that the upper middle atmosphere (h = 80-100 km) at middle latitudes of the Northern Hemisphere in the interval of the CRISTA/MAHRSI experiment (4-12 November 1994) was very close to its expected climatological state. In other words, the average results of the experiment can be used as climatological data, at least for the given area/altitudes. The role of solar/geomagnetic and "meteorological" control of the lower ionosphere is investigated and compared with the results of the MAP/WINE, MAC/SINE and DYANA campaigns. The effects of both solar/geomagnetic and global meteorological factors on the lower ionosphere are found to be weak during autumn 1994 compared to those in the MAP/WINE and DYANA winters, and they are even slightly weaker than those in the MAC/SINE summer. The comparison of the four campaigns suggests the following overall pattern: in winter the lower ionosphere at northern middle latitudes appears to be fairly well "meteorologically" controlled, with a very weak solar influence. In summer, solar influence is somewhat stronger and dominates the weak "meteorological" influence, but the overall solar/meteorological control is weaker than in winter. In autumn we find the weakest overall solar/meteorological control; local effects evidently dominate.

    Key words. Ionosphere (ionosphere-atmosphere interactions; mid-latitude ionosphere)

  9. Intraocular pressure measurement in patients with previous LASIK surgery using pressure phosphene tonometer.

    Science.gov (United States)

    Cheng, Arthur C K; Leung, Dexter Y L; Cheung, Eva Y Y; Fan, Dorothy S P; Law, Ricky W K; Lam, Dennis S C

    2005-04-01

    To compare intraocular pressure (IOP) assessment in post-LASIK patients using non-contact tonometry, pressure phosphene tonometry and applanation tonometry. Sixty-two consecutive LASIK patients were analysed preoperatively and postoperatively with non-contact, pressure phosphene and applanation tonometry. Comparisons among these values were assessed with the paired-sample Student t-test, Pearson's correlation test and Bland-Altman plotting. There was no significant difference in preoperative IOP measurement between non-contact, pressure phosphene and applanation tonometry. The mean ± SD difference between the preoperative non-contact tonometry and postoperative pressure phosphene tonometry IOP measurements was 0.80 ± 2.77 mmHg, whereas postoperative non-contact tonometry significantly underestimated IOP measurement, by 9.96 ± 2.25 mmHg (P < 0.001). Pressure phosphene tonometry may provide an alternative method for the assessment of IOP in post-LASIK patients.

  10. Cultivation-based multiplex phenotyping of human gut microbiota allows targeted recovery of previously uncultured bacteria

    DEFF Research Database (Denmark)

    Rettedal, Elizabeth; Gumpert, Heidi; Sommer, Morten

    2014-01-01

    The human gut microbiota is linked to a variety of human health issues and implicated in antibiotic resistance gene dissemination. Most of these associations rely on culture-independent methods, since it is commonly believed that gut microbiota cannot be easily or sufficiently cultured. Here, we...... microbiota. Based on the phenotypic mapping, we tailor antibiotic combinations to specifically select for previously uncultivated bacteria. Utilizing this method we cultivate and sequence the genomes of four isolates, one of which apparently belongs to the genus Oscillibacter; uncultivated Oscillibacter...

  11. Vaccinia-based influenza vaccine overcomes previously induced immunodominance hierarchy for heterosubtypic protection.

    Science.gov (United States)

    Kwon, Ji-Sun; Yoon, Jungsoon; Kim, Yeon-Jung; Kang, Kyuho; Woo, Sunje; Jung, Dea-Im; Song, Man Ki; Kim, Eun-Ha; Kwon, Hyeok-Il; Choi, Young Ki; Kim, Jihye; Lee, Jeewon; Yoon, Yeup; Shin, Eui-Cheol; Youn, Jin-Won

    2014-08-01

    Growing concerns about unpredictable influenza pandemics require a broadly protective vaccine against diverse influenza strains. One promising approach is a T cell-based vaccine, but the narrow breadth of T-cell immunity, due to the immunodominance hierarchy established by previous influenza infection, and efficacy only under mild challenge conditions are important hurdles to overcome. To model T-cell immunodominance hierarchy in humans in an experimental setting, influenza-primed C57BL/6 mice were chosen and boosted with a mixture of vaccinia recombinants, individually expressing consensus sequences from avian, swine, and human isolates of influenza internal proteins. As determined by IFN-γ ELISPOT and polyfunctional cytokine secretion, the vaccinia recombinants of influenza expanded the breadth of T-cell responses to include subdominant and even minor epitopes. Vaccine groups were successfully protected against 100 LD50 challenges with PR/8/34 and highly pathogenic avian influenza H5N1, which contained the identical dominant NP366 epitope. Interestingly, in challenge with pandemic A/Cal/04/2009 containing mutations in the dominant epitope, only the group vaccinated with rVV-NP + PA showed improved protection. Taken together, a vaccinia-based influenza vaccine expressing conserved internal proteins improved the breadth of influenza-specific T-cell immunity and provided heterosubtypic protection against immunologically close as well as distant influenza strains. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Late preterm birth and previous cesarean section: a population-based cohort study.

    Science.gov (United States)

    Yasseen III, Abdool S; Bassil, Kate; Sprague, Ann; Urquia, Marcelo; Maguire, Jonathon L

    2018-02-21

    Late preterm birth (LPB) is increasingly common and associated with higher morbidity and mortality than term birth. Yet, little is known about the influence of previous cesarean section (PCS) on the occurrence of LPB in subsequent pregnancies. We aim to evaluate this association, along with the potential mediation by cesarean sections in the current pregnancy. We use population-based birth registry data (2005-2012) to establish a cohort of live born singleton infants born between 34 and 41 gestational weeks to multiparous mothers. PCS was the primary exposure, LPB (34-36 weeks) was the primary outcome, and an unplanned or emergency cesarean section in the current pregnancy was the potential mediator. Associations were quantified using propensity-weighted multivariable Poisson regression, and mediating associations were explored using the Baron-Kenny approach. The cohort included 481,531 births; 21,893 (4.5%) were LPB, and 119,983 (24.9%) were preceded by at least one PCS. Among mothers with at least one PCS, 6307 (5.26%) had LPB. There was increased risk of LPB among women with at least one PCS (adjusted relative risk (aRR) 1.20, 95% CI 1.16-1.23). An unplanned or emergency cesarean section in the current pregnancy was identified as a strong mediator of this relationship (mediation ratio = 97%). PCS was associated with higher risk of LPB in subsequent pregnancies. This may be due to an increased risk of subsequent unplanned or emergency preterm cesarean sections. Efforts to minimize index cesarean sections may reduce the risk of LPB in subsequent pregnancies.
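
The crude (unadjusted) risk ratio implied by the counts reported in the abstract can be recomputed directly; it lands close to the propensity-weighted aRR of 1.20.

```python
# Counts reported in the abstract: 481,531 births, 21,893 late preterm (LPB),
# 119,983 preceded by a previous cesarean (PCS), 6,307 LPB among the PCS group.
births, lpb_total = 481_531, 21_893
pcs, lpb_pcs = 119_983, 6_307

risk_pcs = lpb_pcs / pcs                              # risk of LPB given PCS
risk_no_pcs = (lpb_total - lpb_pcs) / (births - pcs)  # risk of LPB without PCS
crude_rr = risk_pcs / risk_no_pcs
print(f"risk with PCS = {risk_pcs:.4f}, without = {risk_no_pcs:.4f}, "
      f"crude RR = {crude_rr:.2f}")
```

The 5.26% figure in the abstract is exactly `risk_pcs`; the small gap between the crude ratio and the reported 1.20 reflects the propensity weighting and covariate adjustment.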

  13. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid, such as chemical composition or temperature, by observing an apparent angular shift in an interference fringe pattern produced by back- or forward-scattering interferometry, ambiguities in the measurement caused by the apparent shift being consistent with one of a number of numerical possibilities for the real shift which differ by 2π are resolved by combining measurements performed on the same sample using light paths therethrough of differing lengths.
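
Numerically, the ambiguity-resolution idea can be sketched as follows; the wavelength, path lengths and index change are hypothetical values chosen for illustration, not figures from the patent.

```python
import numpy as np

# A refractive-index change dn produces a phase shift 2*pi*dn*L/lam over a
# path of length L, but interferometry only yields this shift modulo 2*pi.
# Measuring through two different path lengths resolves which multiple of
# 2*pi is the real one.
lam = 633e-9                  # wavelength, m (hypothetical)
L1, L2 = 1.0e-3, 1.7e-3       # two path lengths through the sample
dn_true = 2.3e-3              # "unknown" index change to recover

def wrapped(dn, L):
    return (2 * np.pi * dn * L / lam) % (2 * np.pi)

m1, m2 = wrapped(dn_true, L1), wrapped(dn_true, L2)

# Try candidate fringe orders for path L1 and keep the one whose implied
# dn also reproduces the wrapped measurement at L2.
best = None
for k in range(20):
    dn_candidate = (m1 + 2 * np.pi * k) * lam / (2 * np.pi * L1)
    err = abs(wrapped(dn_candidate, L2) - m2)
    err = min(err, 2 * np.pi - err)   # circular distance between phases
    if best is None or err < best[0]:
        best = (err, dn_candidate)
print(best[1])
```

Only the correct fringe order is consistent with both path lengths (provided their ratio is not a small rational number), which is precisely why the patent combines measurements over differing lengths.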

  14. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature, a chirp in the local spatial frequency of interference fringes of an interference pattern is reduced by mathematical manipulation of the recorded light intensity...

  15. Efficacy of peg-interferon based treatment in patients with hepatitis C refractory to previous conventional interferon-based treatment

    International Nuclear Information System (INIS)

    Shaikh, S.; Devrajani, B.R.; Kalhoro, M.

    2012-01-01

    Objective: To determine the efficacy of peg-interferon-based therapy in patients refractory to previous conventional interferon-based treatment and the factors predicting sustained viral response (SVR). Study Design: Analytical study. Place and Duration of Study: Medical Unit IV, Liaquat University Hospital, Jamshoro, from July 2009 to June 2011. Methodology: This study included consecutive patients of hepatitis C who were previously treated with conventional interferon-based treatment for 6 months but were either non-responders, relapsed or had virologic breakthrough, and who had stage ≥2 fibrosis on liver biopsy. All eligible patients were provided peg-interferon at a dosage of 180 μg weekly with ribavirin thrice a day for 6 months. Sustained viral response (SVR) was defined as absence of HCV RNA at twenty-four weeks after treatment. All data was processed on SPSS version 16. Results: Out of 450 patients enrolled in the study, 192 were excluded on the basis of minimal fibrosis (stage 0 and 1). Two hundred and fifty-eight patients fulfilled the inclusion criteria and 247 completed the course of peg-interferon treatment. One hundred and sixty-one (62.4%) were males and 97 (37.6%) were females. The mean age was 39.9 ± 6.1 years, haemoglobin was 11.49 ± 2.45 g/dl, platelet count was 127.2 ± 50.6 × 10³/mm³, and ALT was 99 ± 65 IU/L. SVR was achieved in 84 (32.6%). A strong association was found between SVR and the pattern of response (p = 0.001), and between SVR and the degree of fibrosis and early viral response (p = 0.001). Conclusion: Peg-interferon-based treatment is an effective and safe option for patients refractory to conventional interferon-based treatment. (author)

  16. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    A refractive index based measurement of a property of a fluid is measured in an apparatus comprising a variable wavelength coherent light source (16), a sample chamber (12), a wavelength controller (24), a light sensor (20), a data recorder (26) and a computation apparatus (28), by: directing coherent light having a wavelength along an input light path; producing scattering of said light from each of a plurality of interfaces within said apparatus, including interfaces between said fluid and a surface bounding said fluid, said scattering producing an interference pattern formed by said scattered light; cyclically varying the wavelength of said light in said input light path over a 1 nm to 20 nm wide range of wavelengths at a rate of from 10 Hz to 50 kHz; recording variation of intensity of the interfering light with change in wavelength of the light at an angle of observation...

  17. Analysis of Product Buying Decision on Lazada E-commerce based on Previous Buyers’ Comments

    Directory of Open Access Journals (Sweden)

    Neil Aldrin

    2017-06-01

    The aims of the present research are: (1) to establish whether a product buying decision occurs, (2) to understand how buying decisions occur among Lazada e-commerce customers, and (3) to examine how previous buyers' comments can increase product buying decisions on Lazada e-commerce. This research utilizes a qualitative method: it investigates other studies and synthesizes their findings so that further analysis can be made in order to widen ideas and opinions. The results show that a product with many ratings and reviews will trigger other buyers to purchase it. The conclusion is that a product buying decision occurs through several stages before the decision is made: problem recognition, identifying needs, collecting information, evaluating alternatives, and post-purchase evaluation. In these stages, buying decisions on Lazada e-commerce are supported by price, promotion, service, and brand.

  18. The reliability of the Associate Platinum digital foot scanner in measuring previously developed footprint characteristics: a technical note.

    Science.gov (United States)

    Papuga, M Owen; Burke, Jeanmarie R

    2011-02-01

    An ink pad and paper, pressure-sensitive platforms, and photography have previously been used to collect footprint data for clinical assessment. Digital scanners have more recently been widely used to collect such data. The purpose of this study was to evaluate the intra- and interrater reliability of a flatbed digital image scanning technology to capture footprint data. This study used a repeated-measures design on 32 (16 male, 16 female) healthy subjects. The following measured indices of footprint were recorded from 2-dimensional images of the plantar surface of the foot recorded with an Associate Platinum (Foot Levelers Inc, Roanoke, VA) digital foot scanner: Staheli index, Chippaux-Smirak index, arch angle, and arch index. Intraclass correlation coefficient (ICC) values were calculated to evaluate intrarater, interday, and interclinician reliability. The ICC values for intrarater reliability were greater than or equal to .817, indicating an excellent level of reproducibility in assessing the collected images. Analyses of variance revealed that there were no significant differences between raters for each index (P > .05). The ICC values also indicated excellent reliability (.881-.971) between days and clinicians in all but one of the indices of footprint, arch angle (.689), which showed good reliability between clinicians. The full-factorial analysis of variance model did not reveal any interaction effects (P > .05), which indicated that indices of footprint were not changing across days and clinicians. The scanning technology used in this study demonstrated good intra- and interrater reliability in measurements of footprint indices, as demonstrated by high ICC values. Copyright © 2011 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.
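
The reliability statistic used here can be reproduced with a minimal implementation; the sketch below computes the Shrout-Fleiss ICC(2,1) (two-way random effects, absolute agreement, single measurement) on synthetic two-clinician data standing in for the scanner's footprint indices.

```python
import numpy as np

def icc_2_1(ratings):
    """Shrout-Fleiss ICC(2,1); `ratings` is (n_subjects, k_raters)."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Synthetic stand-in data: 20 subjects' "true" arch-index values, each
# scored by 2 clinicians with small independent measurement error.
rng = np.random.default_rng(3)
truth = rng.uniform(0.2, 0.3, 20)
raters = truth[:, None] + rng.normal(0.0, 0.005, (20, 2))
print(round(icc_2_1(raters), 3))
```

With rater error small relative to between-subject spread the ICC approaches 1, which is the pattern the study reports (.817 and above for intrarater reliability).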

  19. A New Zealand based cohort study of anaesthetic trainees' career outcomes compared with previously expressed intentions.

    Science.gov (United States)

    Moran, E M L; French, R A; Kennedy, R R

    2011-09-01

    Predicting workforce requirements is a difficult but necessary part of health resource planning. A 'snapshot' workforce survey undertaken in 2002 examined issues that New Zealand anaesthesia trainees expected would influence their choice of future workplace. We have restudied the same cohort to see if that workforce survey was a good predictor of outcome. Seventy (51%) of 138 surveys were completed in 2009, compared with 100 (80%) of 138 in the 2002 survey. Eighty percent of the 2002 respondents planned consultant positions in New Zealand, but we found only 64% of respondents were working in New Zealand. Salaries in New Zealand were predominantly between NZ$150,000 and NZ$200,000, while those overseas received between NZ$300,000 and NZ$400,000. Of those resident in New Zealand, 84% had studied in a New Zealand medical school, compared with 52% of those currently working overseas (P < 0.01). Our study shows that stated career intentions in a group do not predict the actual group outcomes. We suggest that 'snapshot' studies examining workforce intentions are of little value for workforce planning. However, we believe an ongoing program matching career aspirations against career outcomes would be a useful tool in workforce planning.

  20. Volatile Organic Compounds (VOCs) Measurements in Karachi, Pakistan (2006): a Comparison With Previous Urban Sampling Campaigns Worldwide.

    Science.gov (United States)

    Barletta, B.; Meinardi, S.; Khwaja, H. A.; Beyersdorf, A. J.; Baker, A. K.; Zou, S.; Rowland, F.; Blake, D. R.

    2008-12-01

    Mixing ratios of carbon monoxide (CO), carbon dioxide (CO2), methane (CH4), and 47 nonmethane hydrocarbons - NMHCs - (19 alkanes, 13 alkenes, ethyne, and 14 aromatics) were determined for ground level whole air samples collected during the winter of 2006 in Karachi, Pakistan. Pakistan is among the fastest growing economies in Asia, and Karachi is one of the largest cities in the world with a rapidly expanding population of over 14 million in the whole metropolitan area, and a large industrial base. Samples were collected in January 2006 throughout the urban area to characterize the overall air composition of the city, and along the busiest road to determine the traffic signature of Karachi. This sampling campaign follows a previous study carried out in the winter of 1998-1999 in the same city, when elevated concentrations of many NMHCs were observed. Exceptionally high levels of methane were still observed in 2006 with an average mixing ratio of 5.0 ppmv (6.3 ppmv were observed in 1999). The overall air composition of the Karachi urban environment characterized during this 2006 sampling is compared to 1999 aiming to highlight any possible change in the main VOC sources present throughout the city. In particular, we want to evaluate the impact of the heavy usage of natural gas on the overall air quality of Karachi and the recently increased use of liquefied petroleum gas (LPG) as alternative source of energy. We also compare the composition of the urban troposphere of Karachi to other major urban centers worldwide such as Guangzhou (China), Mexico City (Mexico), and Milan (Italy).

  1. Estimating the effect of current, previous and never use of drugs in studies based on prescription registries

    DEFF Research Database (Denmark)

    Nielsen, Lars Hougaard; Løkkegaard, Ellen; Andreasen, Anne Helms

    2009-01-01

    PURPOSE: Many studies which investigate the effect of drugs categorize the exposure variable into never, current, and previous use of the study drug. When prescription registries are used to make this categorization, the exposure variable possibly gets misclassified, since the registries do not carry any information on the time of discontinuation of treatment. In this study, we investigated the amount of misclassification of exposure (never, current, previous use) to hormone therapy (HT) when the exposure variable was based on prescription data. Furthermore, we evaluated the significance of this misclassification for analysing the risk of breast cancer. MATERIALS AND METHODS: Prescription data were obtained from the Danish Registry of Medicinal Products Statistics and we applied various methods to approximate treatment episodes. We analysed the duration of HT episodes to study the ability to identify...
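
The categorization problem the authors describe can be made concrete with a small sketch; the 90-day supply window and the dates are invented assumptions, not the registry's actual rules.

```python
from datetime import date, timedelta

# Hypothetical rule: a person counts as a "current" user if the last
# prescription redemption plus an assumed supply window covers the index
# date, "previous" if redemption was earlier, and "never" otherwise. The
# true stop date is unknown to the registry, which is the source of the
# misclassification discussed in the abstract.
def classify(last_redemption, index_date, supply_days=90):
    if last_redemption is None:
        return "never"
    if last_redemption + timedelta(days=supply_days) >= index_date:
        return "current"
    return "previous"

index = date(2005, 6, 1)
print(classify(date(2005, 5, 1), index))   # redeemed a month ago
print(classify(date(2004, 1, 1), index))   # redeemed long ago
print(classify(None, index))               # no redemptions on record

# A woman who redeemed in May 2005 but stopped taking HT immediately would
# still be classified "current": exactly the exposure misclassification
# whose extent the authors set out to quantify.
```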

  2. How the risky features of previous selection affect subsequent decision-making: evidence from behavioral and fMRI measures.

    Science.gov (United States)

    Dong, Guangheng; Zhang, Yifen; Xu, Jiaojing; Lin, Xiao; Du, Xiaoxia

    2015-01-01

    Human decision making is rarely conducted in temporal isolation. It is often biased and affected by environmental variables, particularly prior selections. In this study, we used a task that simulates a real gambling process to explore the effect of the risky features of a previous selection on subsequent decision making. Compared with decision making after an advantageous risk-taking situation (Risk_Adv), that after a disadvantageous risk-taking situation (Risk_Disadv) is associated with a longer response time (RT, the time spent in making decisions) and higher brain activations in the caudate and the dorsolateral prefrontal cortex (DLPFC). Compared with decisions after Risk_Adv, those after Risk_Disadv in loss trials are associated with higher brain activations in the left superior temporal gyrus (STG) and the precuneus. Brain activity and relevant RTs significantly correlated. Overall, people who experience disadvantageous risk-taking selections tend to focus on current decision making and engage cognitive endeavors in value evaluation and in the regulation of their risk-taking behaviors during decision making.

  3. NMR-based phytochemical analysis of Vitis vinifera cv Falanghina leaves. Characterization of a previously undescribed biflavonoid with antiproliferative activity.

    Science.gov (United States)

    Tartaglione, Luciana; Gambuti, Angelita; De Cicco, Paola; Ercolano, Giuseppe; Ianaro, Angela; Taglialatela-Scafati, Orazio; Moio, Luigi; Forino, Martino

    2018-03-01

    Vitis vinifera cv Falanghina is an ancient grape variety of Southern Italy. A thorough phytochemical analysis of Falanghina leaves was conducted to investigate their specialised metabolite content. Along with already known molecules, such as caftaric acid, quercetin-3-O-β-d-glucopyranoside, quercetin-3-O-β-d-glucuronide, kaempferol-3-O-β-d-glucopyranoside and kaempferol-3-O-β-d-glucuronide, a previously undescribed biflavonoid was identified. For this last compound, a moderate bioactivity against metastatic melanoma cell proliferation was discovered. This finding may be of interest to researchers studying human melanoma. The high content of antioxidant glycosylated flavonoids supports the exploitation of grapevine leaves as an inexpensive source of natural products for the food industry and for both pharmaceutical and nutraceutical companies. Additionally, this study offers important insights into the plant's physiology, thus prompting possible technological research on genetic selection based on vine adaptation to specific pedo-climatic environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however, the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, an in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...

  5. Strain measurement based battery testing

    Science.gov (United States)

    Xu, Jeff Qiang; Steiber, Joe; Wall, Craig M.; Smith, Robert; Ng, Cheuk

    2017-05-23

    A method and system for strain-based estimation of the state of health of a battery, from an initial state to an aged state, is provided. A strain gauge is applied to the battery. A first strain measurement is performed on the battery, using the strain gauge, at a selected charge capacity of the battery and at the initial state of the battery. A second strain measurement is performed on the battery, using the strain gauge, at the selected charge capacity of the battery and at the aged state of the battery. The capacity degradation of the battery is estimated as the difference between the first and second strain measurements divided by the first strain measurement.
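The capacity-degradation estimate described in this record reduces to a one-line ratio of the two strain readings; a minimal sketch in Python (function and variable names are illustrative, not taken from the patent):

```python
def capacity_degradation(strain_initial, strain_aged):
    """Estimate fractional capacity degradation from two strain-gauge
    readings taken at the same charge capacity, in the initial and aged
    states: (s1 - s2) / s1, per the record's description."""
    if strain_initial == 0:
        raise ValueError("initial strain reading must be nonzero")
    return (strain_initial - strain_aged) / strain_initial

# Example: the reading falls from 500 to 450 microstrain at the same charge level
print(capacity_degradation(500.0, 450.0))  # 0.1
```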

  6. Sunburn and sun-protective behaviors among adults with and without previous nonmelanoma skin cancer: a population-based study

    Science.gov (United States)

    Fischer, Alexander H.; Wang, Timothy S.; Yenokyan, Gayane; Kang, Sewon; Chien, Anna L.

    2016-01-01

    Background Individuals with previous nonmelanoma skin cancer (NMSC) are at increased risk for subsequent skin cancer, and should therefore limit UV exposure. Objective To determine whether individuals with previous NMSC engage in better sun protection than those with no skin cancer history. Methods We pooled self-reported data (2005 and 2010 National Health Interview Surveys) from US non-Hispanic white adults (758 with and 34,161 without previous NMSC). We calculated adjusted prevalence odds ratios (aPOR) and 95% confidence intervals (95% CI), taking into account the complex survey design. Results Individuals with previous NMSC versus no history of NMSC had higher rates of frequent use of shade (44.3% versus 27.0%; aPOR=1.41; 1.16–1.71), long sleeves (20.5% versus 7.7%; aPOR=1.55; 1.21–1.98), a wide-brimmed hat (26.1% versus 10.5%; aPOR=1.52; 1.24–1.87), and sunscreen (53.7% versus 33.1%; aPOR=2.11; 95% CI=1.73–2.59), but did not have significantly lower odds of recent sunburn (29.7% versus 40.7%; aPOR=0.95; 0.77–1.17). Among subjects with previous NMSC, recent sunburn was inversely associated with age, sun avoidance, and shade but not sunscreen. Limitations Self-reported cross-sectional data and unavailable information quantifying regular sun exposure. Conclusion Physicians should emphasize sunburn prevention when counseling patients with previous NMSC, especially younger adults, focusing on shade and sun avoidance over sunscreen. PMID:27198078

  7. New population-based exome data question the pathogenicity of some genetic variants previously associated with Marfan syndrome

    DEFF Research Database (Denmark)

    Yang, Ren-Qiang; Jabbari, Javad; Cheng, Xiao-Shu

    2014-01-01

    BACKGROUND: Marfan syndrome (MFS) is a rare autosomal dominantly inherited connective tissue disorder with an estimated prevalence of 1:5,000. More than 1000 variants have been previously reported to be associated with MFS. However, the disease-causing effect of these variants may be questionable...

  8. Five criteria for using a surrogate endpoint to predict treatment effect based on data from multiple previous trials.

    Science.gov (United States)

    Baker, Stuart G

    2018-02-20

    A surrogate endpoint in a randomized clinical trial is an endpoint that occurs after randomization and before the true, clinically meaningful, endpoint that yields conclusions about the effect of treatment on the true endpoint. A surrogate endpoint can accelerate the evaluation of new treatments but at the risk of misleading conclusions. Therefore, criteria are needed for deciding whether to use a surrogate endpoint in a new trial. For the meta-analytic setting of multiple previous trials, each with the same pair of surrogate and true endpoints, this article formulates 5 criteria for using a surrogate endpoint in a new trial to predict the effect of treatment on the true endpoint in the new trial. The first 2 criteria, which are easily computed from a zero-intercept linear random effects model, involve statistical considerations: an acceptable sample size multiplier and an acceptable prediction separation score. The remaining 3 criteria involve clinical and biological considerations: similarity of biological mechanisms of treatments between the new trial and previous trials, similarity of secondary treatments following the surrogate endpoint between the new trial and previous trials, and a negligible risk of harmful side effects arising after the observation of the surrogate endpoint in the new trial. These 5 criteria constitute an appropriately high bar for using a surrogate endpoint to make a definitive treatment recommendation. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
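The prediction step behind the first two criteria fits a zero-intercept line through previous trials' (surrogate effect, true effect) pairs. The sketch below is a fixed-effects simplification of the random-effects model named in the abstract (a full random-effects fit would also weight trials by their variances; all names and data are illustrative):

```python
def zero_intercept_slope(surrogate_effects, true_effects):
    """Least-squares slope b of a zero-intercept line y = b * x fitted
    to (surrogate effect, true effect) pairs from previous trials.
    For a zero-intercept fit, b = sum(x*y) / sum(x*x)."""
    sxy = sum(x * y for x, y in zip(surrogate_effects, true_effects))
    sxx = sum(x * x for x in surrogate_effects)
    return sxy / sxx

# Predict the new trial's true-endpoint effect from its observed surrogate effect
slope = zero_intercept_slope([0.2, 0.4, 0.6], [0.1, 0.2, 0.3])
print(round(slope, 6))        # 0.5
print(round(slope * 0.5, 6))  # predicted true effect for a surrogate effect of 0.5
```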

  9. New population-based exome data are questioning the pathogenicity of previously cardiomyopathy-associated genetic variants

    DEFF Research Database (Denmark)

    Andreasen, Charlotte Hartig; Nielsen, Jonas B; Refsgaard, Lena

    2013-01-01

    Cardiomyopathies are a heterogeneous group of diseases with various etiologies. We focused on three genetically determined cardiomyopathies: hypertrophic (HCM), dilated (DCM), and arrhythmogenic right ventricular cardiomyopathy (ARVC). Eighty-four genes have so far been associated with these cardiomyopathies, but the disease-causing effect of reported variants is often dubious. In order to identify possible false-positive variants, we investigated the prevalence of previously reported cardiomyopathy-associated variants in recently published exome data. We searched for reported missense and nonsense variants in the NHLBI-Go Exome Sequencing Project (ESP) containing exome data from 6500 individuals. In ESP, we identified 94 variants out of 687 (14%) variants previously associated with HCM, 58 out of 337 (17%) variants associated with DCM, and 38 variants out of 209 (18%) associated with ARVC...

  10. Association of single nucleotide polymorphisms in candidate genes previously related to genetic variation in fertility with phenotypic measurements of reproductive function in Holstein cows.

    Science.gov (United States)

    Ortega, M Sofia; Denicol, Anna C; Cole, John B; Null, Daniel J; Taylor, Jeremy F; Schnabel, Robert D; Hansen, Peter J

    2017-05-01

    Many genetic markers related to health or production traits are not evaluated in populations independent of the discovery population or related to phenotype. Here we evaluated 68 single nucleotide polymorphisms (SNP) in candidate genes previously associated with genetic merit for fertility and production traits for association with phenotypic measurements of fertility in a population of Holstein cows that was selected based on predicted transmitting ability (PTA) for daughter pregnancy rate (DPR; high, ≥1, n = 989; low, ≤ -1.0, n = 1,285). Cows with a high PTA for DPR had higher pregnancy rate at first service, fewer services per conception, and fewer days open than cows with a low PTA for DPR. Of the 68 SNP, 11 were associated with pregnancy rate at first service, 16 with services per conception, and 19 with days open. Single nucleotide polymorphisms in 12 genes (BDH2, BSP3, CAST, CD2, CD14, FUT1, FYB, GCNT3, HSD17B7, IBSP, OCLN, and PCCB) had significant associations with 2 fertility traits, and SNP in 4 genes (CSPP1, FCER1G, PMM2, and TBC1D24) had significant associations with each of the 3 traits. Results from this experiment were compared with results from 2 earlier studies in which the SNP were associated with genetic estimates of fertility. One study involved the same animals as used here, and the other study was of an independent population of bulls. A total of 13 SNP associated with 1 or more phenotypic estimates of fertility were directionally associated with genetic estimates of fertility in the same cow population. Moreover, 14 SNP associated with reproductive phenotype were directionally associated with genetic estimates of fertility in the bull population. Nine SNP (located in BCAS, BSP3, CAST, FUT1, HSD17B7, OCLN, PCCB, PMM2, and TBC1D24) had a directional association with fertility in all 3 studies. Examination of the function of the genes with SNP associated with reproduction in more than one study indicates the importance of steroid hormones

  11. Reasons for placement of restorations on previously unrestored tooth surfaces by dentists in The Dental Practice-Based Research Network

    DEFF Research Database (Denmark)

    Nascimento, Marcelle M; Gordan, Valeria V; Qvist, Vibeke

    2010-01-01

    The authors conducted a study to identify and quantify the reasons used by dentists in The Dental Practice-Based Research Network (DPBRN) for placing restorations on unrestored permanent tooth surfaces and the dental materials they used in doing so....

  12. Biotin IgM Antibodies in Human Blood: A Previously Unknown Factor Eliciting False Results in Biotinylation-Based Immunoassays

    Science.gov (United States)

    Chen, Tingting; Hedman, Lea; Mattila, Petri S.; Jartti, Laura; Jartti, Tuomas; Ruuskanen, Olli; Söderlund-Venermo, Maria; Hedman, Klaus

    2012-01-01

    Biotin is an essential vitamin that binds streptavidin or avidin with high affinity and specificity. As biotin is a small molecule that can be linked to proteins without affecting their biological activity, biotinylation is applied widely in biochemical assays. In our laboratory, IgM enzyme immunoassays (EIAs) of µ-capture format have been set up against many viruses, using as antigen biotinylated virus-like particles (VLPs) detected by horseradish peroxidase-conjugated streptavidin. We recently encountered one serum sample reacting with the biotinylated VLP but not with the unbiotinylated one, suggesting the occurrence of biotin-reactive antibodies in human sera. In the present study, we searched the general population (612 serum samples from adults and 678 from children) for IgM antibodies reactive with biotin and developed an indirect EIA for quantification of their levels and assessment of their seroprevalence. These IgM antibodies were present in 3% of adults regardless of age, but were rarely found in children. The adverse effects of the biotin IgM on biotinylation-based immunoassays were assessed in four in-house and one commercial virus IgM EIAs, showing that biotin IgM does cause false positives. Biotin cannot bind IgM and streptavidin or avidin simultaneously, suggesting that these biotin-interactive compounds compete for a common binding site. In competitive inhibition assays, the affinities of biotin IgM antibodies ranged from 2.1×10−3 to 1.7×10−4 mol/L. This is the first report on biotin antibodies found in humans, providing new information on biotinylation-based immunoassays as well as new insights into the biomedical effects of vitamins. PMID:22879954

  13. Measuring Dynamic and Kinetic Information in the Previously Inaccessible Supra-τc Window of Nanoseconds to Microseconds by Solution NMR Spectroscopy

    Directory of Open Access Journals (Sweden)

    Donghan Lee

    2013-09-01

    Nuclear Magnetic Resonance (NMR) spectroscopy is a powerful tool that has enabled experimentalists to characterize molecular dynamics and kinetics spanning a wide range of time-scales, from picoseconds to days. This review focuses on the previously inaccessible supra-τc window (defined as τc < supra-τc < 40 μs, where τc is the overall tumbling time of a molecule), both from the perspective of local inter-nuclear vector dynamics extracted from residual dipolar couplings (RDCs) and from the perspective of conformational exchange captured by relaxation dispersion measurements (RD). The goal of the first section is to present a detailed analysis of how to extract protein dynamics encoded in RDCs and how to relate this information to protein functionality within the previously inaccessible supra-τc window. In the second section, the current state of the art for RD is analyzed, as well as the considerable progress toward pushing the sensitivity of RD further into the supra-τc scale by up to a factor of two (motion up to 25 μs). From the data obtained with these techniques and methodology, the importance of the supra-τc scale for protein function and molecular recognition is becoming increasingly clear, as the connection between motion on the supra-τc scale and protein functionality from the experimental side is further strengthened by results from molecular dynamics simulations.

  14. Hemoglobin-Based Oxygen Carrier (HBOC) Development in Trauma: Previous Regulatory Challenges, Lessons Learned, and a Path Forward.

    Science.gov (United States)

    Keipert, Peter E

    2017-01-01

    Historically, hemoglobin-based oxygen carriers (HBOCs) were being developed as "blood substitutes," despite their transient circulatory half-life (~ 24 h) vs. transfused red blood cells (RBCs). More recently, HBOC commercial development focused on "oxygen therapeutic" indications to provide a temporary oxygenation bridge until medical or surgical interventions (including RBC transfusion, if required) can be initiated. This included the early trauma trials with HemAssist ® (BAXTER), Hemopure ® (BIOPURE) and PolyHeme ® (NORTHFIELD) for resuscitating hypotensive shock. These trials all failed due to safety concerns (e.g., cardiac events, mortality) and certain protocol design limitations. In 2008 the Food and Drug Administration (FDA) put all HBOC trials in the US on clinical hold due to the unfavorable benefit:risk profile demonstrated by various HBOCs in different clinical studies in a meta-analysis published by Natanson et al. (2008). During standard resuscitation in trauma, organ dysfunction and failure can occur due to ischemia in critical tissues, which can be detected by the degree of lactic acidosis. SANGART'S Phase 2 trauma program with MP4OX therefore added lactate >5 mmol/L as an inclusion criterion to enroll patients who had lost sufficient blood to cause a tissue oxygen debt. This was key to the successful conduct of their Phase 2 program (ex-US, from 2009 to 2012) to evaluate MP4OX as an adjunct to standard fluid resuscitation and transfusion of RBCs. In 2013, SANGART shared their Phase 2b results with the FDA, and succeeded in getting the FDA to agree that a planned Phase 2c higher dose comparison study of MP4OX in trauma could include clinical sites in the US. Unfortunately, SANGART failed to secure new funding and was forced to terminate development and operations in Dec 2013, even though a regulatory path forward with FDA approval to proceed in trauma had been achieved.

  15. Measuring Disorientation Based on the Needleman-Wunsch Algorithm

    Science.gov (United States)

    Güyer, Tolga; Atasoy, Bilal; Somyürek, Sibel

    2015-01-01

    This study offers a new method to measure navigation disorientation in web-based systems, which are a powerful learning medium for distance and open education. The Needleman-Wunsch algorithm is used to measure disorientation in a more precise manner. The process combines theoretical and applied knowledge from two previously distinct research areas,…
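The Needleman-Wunsch algorithm the study adapts is standard global-alignment dynamic programming; a minimal sketch comparing two navigation paths, with an illustrative scoring scheme (the paper's actual match/mismatch/gap values are not given here):

```python
def needleman_wunsch(seq_a, seq_b, match=1, mismatch=-1, gap=-1):
    """Global alignment score between two sequences, e.g. a learner's
    page-visit path vs. an ideal path; higher scores mean more similar
    paths, lower scores more disorientation. Scoring values are illustrative."""
    n, m = len(seq_a), len(seq_b)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    # Aligning against an empty sequence costs one gap per element
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if seq_a[i - 1] == seq_b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[n][m]

# Identical paths score highest; divergence lowers the score
print(needleman_wunsch("ABCD", "ABCD"))  # 4
print(needleman_wunsch("ABCD", "AXCD"))  # 2
```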

  16. Animal-based measures for welfare assessment

    Directory of Open Access Journals (Sweden)

    Agostino Sevi

    2010-01-01

    Animal welfare assessment cannot be carried out irrespective of measures taken on animals. Indeed, housing parameters related to structures, design and micro-environment, even if reliable and easier to take, can only identify conditions which could be detrimental to animal welfare, but cannot predict poor welfare in animals per se. Welfare assessment through animal-based measures is rather complex, given that animals' responses to stressful conditions largely depend on the nature, length and intensity of challenges and on the physiological status, age, genetic susceptibility and previous experience of the animals. Welfare assessment requires a multi-disciplinary approach and the monitoring of productive, ethological, endocrine, immunological and pathological parameters to be exhaustive and reliable. So many measures are needed because stresses can act on only some of the mentioned parameters, or on all of them but at different times and to different degrees. From this point of view, the main aim of research is to find feasible and most responsive indicators of poor animal welfare. In recent decades, studies have focused on the following parameters for animal welfare assessment: indexes of biological efficiency, responses to behavioral tests, cortisol secretion, neutrophil to lymphocyte ratio, lymphocyte proliferation, production of antigen-specific IgG and cytokine release, somatic cell count and acute phase proteins. Recently, many studies have addressed reducing the handling and constraint of animals when taking measures used in welfare assessment, since such procedures can induce stress in animals and undermine the reliability of the measures taken. The range of animal-based measures for welfare assessment is much wider under experimental conditions than at the on-farm level. In on-farm welfare monitoring the main aim is to find feasible measures of proven validity and reliability

  17. Status of radiation-based measurement technology

    International Nuclear Information System (INIS)

    Moon, B. S.; Lee, J. W.; Chung, C. E.; Hong, S. B.; Kim, J. T.; Park, W. M.; Kim, J. Y.

    1999-03-01

    This report describes the status of measurement equipment using radiation source and new technologies in this field. This report includes the development status in Korea together with a brief description of the technology development and application status in ten countries including France, America, and Japan. Also this report describes technical factors related to radiation-based measurement and trends of new technologies. Measurement principles are also described for the equipment that is widely used among radiation-based measurement, such as level measurement, density measurement, basis weight measurement, moisture measurement, and thickness measurement. (author). 7 refs., 2 tabs., 21 figs

  18. SQUID-based measuring systems

    Indian Academy of Sciences (India)

    field produced by a given two-dimensional current density distribution is inverted using the Fourier transform technique. Keywords ... Superconducting quantum interference devices (SQUIDs) are the most sensitive detectors for measurement of ... omagnetic prospecting, detection of gravity waves etc. Judging the importance ...

  19. European wet deposition maps based on measurements

    NARCIS (Netherlands)

    Leeuwen EP van; Erisman JW; Draaijers GPJ; Potma CJM; Pul WAJ van; LLO

    1995-01-01

    To date, wet deposition maps on a European scale have been based on long-range transport model results. For most components wet deposition maps based on measurements are only available on national scales. Wet deposition maps of acidifying components and base cations based on measurements are needed

  20. Spectrophotometer-Based Color Measurements

    Science.gov (United States)

    2017-10-24

    equipment. There are several American Society for Testing and Materials (ASTM) chapters covering the use of spectrometers for color measurements (refs. 3...Perkin Elmer software and procedures described in ASTM chapter E308 (ref. 3). All spectral data were stored on the computer. A summary of the color...similarity, or lack thereof, between two colors (ref. 5). In this report, the Euclidean distance metric, ΔE, is used and recommended in ASTM D2244
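The Euclidean color-difference metric this report refers to has a direct implementation, assuming CIELAB (L*, a*, b*) coordinates as in the basic ΔE*ab of ASTM D2244 / CIE76 (the report's exact variant is not restated here):

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB
    colors given as (L*, a*, b*) tuples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Two colors differing only in lightness L* by 3 units
print(delta_e_ab((50.0, 10.0, 10.0), (53.0, 10.0, 10.0)))  # 3.0
```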

  1. Effectiveness of Ritonavir-Boosted Protease Inhibitor Monotherapy in Clinical Practice Even with Previous Virological Failures to Protease Inhibitor-Based Regimens.

    Directory of Open Access Journals (Sweden)

    Luis F López-Cortés

    Significant controversy still exists about ritonavir-boosted protease inhibitor monotherapy (mtPI/rtv) as a simplification strategy that has, up to now, been used to treat patients who have not experienced previous virological failure (VF) while on protease inhibitor (PI)-based regimens. We have evaluated the effectiveness of two mtPI/rtv regimens in an actual clinical practice setting, including patients who had experienced previous VF on PI-based regimens. This retrospective study analyzed 1060 HIV-infected patients with undetectable viremia who were switched to lopinavir/ritonavir or darunavir/ritonavir monotherapy. In cases in which the patient had previously experienced VF while on a PI-based regimen, the absence of major HIV protease resistance mutations to lopinavir or darunavir, respectively, was mandatory. The primary endpoint of this study was the percentage of participants with virological suppression after 96 weeks according to intention-to-treat analysis (non-complete/missing = failure). A total of 1060 patients were analyzed, including 205 with previous VF while on PI-based regimens, 90 of whom were on complex therapies due to extensive resistance. The rates of treatment effectiveness (intention-to-treat analysis) and virological efficacy (on-treatment analysis) at week 96 were 79.3% (CI95, 76.8-81.8) and 91.5% (CI95, 89.6-93.4), respectively. No relationships were found between VF and earlier VF while on PI-based regimens, the presence of major or minor protease resistance mutations, the previous time on viral suppression, CD4+ T-cell nadir, or HCV coinfection. Genotypic resistance tests were available in 49 of the 74 patients with VF, and only four patients presented new major protease resistance mutations. Switching to mtPI/rtv achieves sustained virological control in most patients, even in those with previous VF on PI-based regimens, as long as no major resistance mutations are present for the administered drug.

  2. Measurement of dabigatran: previously demonstrated Hemoclot® Thrombin Inhibitor assay reagent instability on Sysmex CS-2100i is no longer an issue

    DEFF Research Database (Denmark)

    Comuth, Willemijn; Faaborg, Louise; Henriksen, Linda Østervig

    2017-01-01

    hours. Since the reagent composition was unchanged, the increased stability could be due to changed logistics by the supplier, with stock and transfer closer by. Previously demonstrated HTI reagent instability is no longer an issue at our laboratory. The reliability of results of clinical studies...

  3. Bluetooth-based distributed measurement system

    International Nuclear Information System (INIS)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng

    2007-01-01

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, the advantages and disadvantages of the system are analyzed at the end of the paper, and the measurement system has successfully been used in the Daqing oilfield, China, for measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  4. Bluetooth-based distributed measurement system

    Science.gov (United States)

    Tang, Baoping; Chen, Zhuo; Wei, Yuguo; Qin, Xiaofeng

    2007-07-01

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, the advantages and disadvantages of the system are analyzed at the end of the paper, and the measurement system has successfully been used in the Daqing oilfield, China, for measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.

  5. Bluetooth-based distributed measurement system

    Energy Technology Data Exchange (ETDEWEB)

    Tang Baoping; Chen Zhuo; Wei Yuguo; Qin Xiaofeng [Department of Mechatronics, College of Mechanical Engineering, Chongqing University, Chongqing, 400030 (China)

    2007-07-15

    A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instruments, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send them to the base station, where data analysis and processing are done so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the distributed measurement system based on Bluetooth technology, the advantages and disadvantages of the system are analyzed at the end of the paper, and the measurement system has successfully been used in the Daqing oilfield, China, for measurement of parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.
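The data flow these records describe (sensor → relay node → base station, with analysis at the base) can be sketched as a toy simulation; the classes and the simple threshold check are illustrative stand-ins, not the authors' design:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One measurement from an intelligent sensor (names are illustrative)."""
    sensor_id: str
    quantity: str
    value: float

class BaseStation:
    """Collects readings and applies a trivial threshold check as a
    stand-in for the paper's data analysis and condition evaluation."""
    def __init__(self, limits):
        self.limits = limits   # per-quantity upper limits
        self.log = []
    def receive(self, reading):
        ok = reading.value <= self.limits.get(reading.quantity, float("inf"))
        self.log.append((reading, "normal" if ok else "alarm"))

class RelayNode:
    """Forwards sensor readings toward the base station; a stand-in
    for the Bluetooth relay link."""
    def __init__(self, base):
        self.base = base
    def forward(self, reading):
        self.base.receive(reading)

base = BaseStation({"temperature": 80.0})
relay = RelayNode(base)
relay.forward(Reading("pump-1", "temperature", 92.5))
print(base.log[-1][1])  # alarm
```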

  6. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lo, P., E-mail: pechinlo@mednet.edu.ucla; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G. [Center for Computer Vision and Imaging Biomarkers, Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, California 90024 (United States); Argula, R.; Strange, C. [Division of Pulmonary and Critical Care Medicine, Medical University of South Carolina, Charleston, South Carolina 29425 (United States)

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements were then computed, which quantify the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressing against the spirometry measures, with p < 0.05. For previously used density based CT measurements in the literature, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or p > 0.05. Conclusions: The proposed family of CT-based cyst measurements have better correlation with spirometric measures than previously used density based CT measurements. They show potential as a sensitive tool for quantitatively assessing the severity of LAM.
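The comparison against spirometry rests on the coefficient of determination from a linear regression; a pure-Python sketch of R² for a simple least-squares line (data and names are illustrative, not from the study):

```python
def r_squared(xs, ys):
    """Coefficient of determination R^2 for a simple least-squares fit
    y = a + b * x, of the kind used to compare a CT measure against a
    spirometry measure: R^2 = 1 - SS_res / SS_tot."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx            # slope
    a = my - b * mx          # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Perfectly linear data gives R^2 = 1
print(r_squared([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```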

  7. An automated patient recognition method based on an image-matching technique using previous chest radiographs in the picture archiving and communication system environment

    International Nuclear Information System (INIS)

    Morishita, Junji; Katsuragawa, Shigehiko; Kondo, Keisuke; Doi, Kunio

    2001-01-01

    An automated patient recognition method for correcting 'wrong' chest radiographs being stored in a picture archiving and communication system (PACS) environment has been developed. The method is based on an image-matching technique that uses previous chest radiographs. For identification of a 'wrong' patient, the correlation value was determined for a previous image of a patient and a new, current image of the presumed corresponding patient. The current image was shifted horizontally and vertically and rotated, so that we could determine the best match between the two images. The results indicated that the correlation values between the current and previous images for the same, 'correct' patients were generally greater than those for different, 'wrong' patients. Although the two histograms for the same patient and for different patients overlapped at correlation values greater than 0.80, most parts of the histograms were separated. The correlation value was compared with a threshold value that was determined based on an analysis of the histograms of correlation values obtained for the same patient and for different patients. If the current image is considered potentially to belong to a 'wrong' patient, then a warning sign with the probability for a 'wrong' patient is provided to alert radiology personnel. Our results indicate that at least half of the 'wrong' images in our database can be identified correctly with the method described in this study. The overall performance in terms of a receiver operating characteristic curve showed a high performance of the system. The results also indicate that some readings of 'wrong' images for a given patient in the PACS environment can be prevented by use of the method we developed. Therefore an automated warning system for patient recognition would be useful in correcting 'wrong' images being stored in the PACS environment
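The shift-and-correlate matching described above can be sketched with a plain Pearson correlation and an integer shift search (rotation omitted for brevity; images are lists of pixel rows, and all names are illustrative, not from the paper):

```python
def pearson(a, b):
    """Pearson correlation between two equal-length pixel lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    if va == 0 or vb == 0:
        return 0.0  # constant region: correlation undefined, treat as no match
    return cov / (va * vb) ** 0.5

def best_shift_correlation(prev, curr, max_shift=2):
    """Slide `curr` over `prev` by small integer offsets and return the
    highest correlation found, mimicking the paper's horizontal/vertical
    shift search for the best match between two radiographs."""
    rows, cols = len(prev), len(prev[0])
    best = -1.0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            a, b = [], []
            for y in range(rows):
                for x in range(cols):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < rows and 0 <= xx < cols:
                        a.append(prev[y][x])
                        b.append(curr[yy][xx])
            if len(a) > 1:
                best = max(best, pearson(a, b))
    return best

img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(round(best_shift_correlation(img, img), 6))  # 1.0
```

A threshold on this value (around 0.80 in the abstract's histograms) then separates "same patient" from "wrong patient" pairs.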

  8. Is previous disaster experience a good predictor for disaster preparedness in extreme poverty households in remote Muslim minority based community in China?

    Science.gov (United States)

    Chan, Emily Y Y; Kim, Jean H; Lin, Cherry; Cheung, Eliza Y L; Lee, Polly P Y

    2014-06-01

Disaster preparedness is an important preventive strategy for protecting health and mitigating the adverse health effects of unforeseen disasters. A multi-site ethnic minority project (2009-2015) was set up to examine health and disaster preparedness related issues in remote, rural, disaster-prone communities in China. The primary objective of this study was to examine whether previous disaster experience significantly increases household disaster preparedness levels in remote villages in China. A cross-sectional household survey was conducted in January 2011 in Gansu Province, in a predominantly Hui minority-based village. Factors related to disaster preparedness were explored using quantitative methods. Two focus groups were also conducted to provide additional contextual explanations for the quantitative findings of this study. The village household response rate was 62.4% (n = 133). Although previous disaster exposure was significantly associated with perception of living in a high disaster risk area (OR = 6.16), only 10.7% of households possessed a disaster emergency kit. Of note, among households with members who had non-communicable diseases, only 9.6% had prepared extra medications to sustain clinical management of their chronic conditions. This is the first study to examine disaster preparedness in an ethnic minority population in remote communities in rural China. Our results indicate the need for disaster mitigation education to promote preparedness in remote, resource-poor communities.

  9. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who previously underwent one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section, or abdominal war injuries were the most common causes of previous laparotomy. During those operations, or while entering the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following diagnostic laparoscopy. In all patients, insertion of the Veres needle and trocar in the umbilical region was performed, namely a technique of closed laparoscopy. Adhesions in the region of the umbilicus were not found in any patient, and no abdominal organs were injured.

  10. Measuring dynamic and kinetic information in the previously inaccessible supra-τ(c) window of nanoseconds to microseconds by solution NMR spectroscopy.

    Science.gov (United States)

    Ban, David; Sabo, T Michael; Griesinger, Christian; Lee, Donghan

    2013-09-26

Nuclear magnetic resonance (NMR) spectroscopy is a powerful tool that has enabled experimentalists to characterize molecular dynamics and kinetics spanning a wide range of time-scales, from picoseconds to days. This review focuses on addressing the previously inaccessible supra-τ(c) window, that is, motions on time-scales longer than the overall tumbling correlation time τ(c). In the second section, the current state of the art for relaxation dispersion (RD) is analyzed, as well as the considerable progress toward pushing the sensitivity of RD further into the supra-τ(c) scale by up to a factor of two (motion up to 25 μs). From the data obtained with these techniques and methodology, the importance of the supra-τ(c) scale for protein function and molecular recognition is becoming increasingly clear, as the connection between motion on the supra-τ(c) scale and protein functionality from the experimental side is further strengthened by results from molecular dynamics simulations.

  11. Korean Clinic Based Outcome Measure Studies

    OpenAIRE

    Jongbae Park

    2003-01-01

Background: Evidence based medicine has become a main tool for medical practice. However, conducting a study that ranks highly in the evidence hierarchy pyramid is not easy or feasible at all times and places. There remains room for descriptive clinical outcome measure studies, while admitting the limits of their interpretation. Aims: To present three Korean clinic based outcome measure studies with a view to encouraging Korean clinicians to conduct similar studies. Methods: Three studies are presented...

  12. Previous study for the setting up and optimization of detection of ZnS(Ag) scintillation applied to the measure of alpha radioactivity index

    International Nuclear Information System (INIS)

    Pujol, L.; Suarez-Navarro, J.A.; Montero, M.

    1998-01-01

The determination of radiological water quality is useful for a wide range of environmental studies. In these cases, the gross alpha activity is one of the parameters to determine. This parameter makes it possible to decide whether further radiological analyses are necessary in order to identify and quantify the presence of alpha emitters in water. The usual method for monitoring gross alpha activity includes sample evaporation to dryness on a disk and counting using a ZnS(Ag) scintillation detector. The detector electronics provide two components that are adjustable by the user: the high voltage applied to the photomultiplier tubes and the low-level discriminator used to eliminate electronic noise. Optimization of the high voltage and low-level discriminator is convenient in order to reach the best counting conditions. This paper is a preliminary study of the procedure followed for the setting up and optimization of the detector electronics in the laboratories of CEDEX for the measurement of gross alpha activity. (Author)
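A common way to pick the high-voltage and discriminator settings of a low-background counter is to maximize a figure of merit such as E²/B (squared counting efficiency over background rate). The sketch below uses entirely hypothetical plateau data and is not the CEDEX procedure itself:

```python
import numpy as np

# Hypothetical plateau data: counting efficiency and background rate
# recorded at a series of photomultiplier high-voltage settings.
voltages = np.array([700, 750, 800, 850, 900, 950])          # volts
efficiency = np.array([0.10, 0.22, 0.30, 0.33, 0.34, 0.35])  # fraction
background = np.array([0.05, 0.08, 0.12, 0.20, 0.45, 1.10])  # counts/min

fom = efficiency**2 / background          # figure of merit E^2 / B
best_hv = int(voltages[np.argmax(fom)])   # 800 V for this data set
```

The same figure of merit can be scanned over discriminator thresholds; the operating point trades detection efficiency against electronic noise.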

  13. A novel pH-responsive hydrogel-based on calcium alginate engineered by the previous formation of polyelectrolyte complexes (PECs) intended to vaginal administration.

    Science.gov (United States)

    Ferreira, Natália Noronha; Perez, Taciane Alvarenga; Pedreiro, Liliane Neves; Prezotti, Fabíola Garavello; Boni, Fernanda Isadora; Cardoso, Valéria Maria de Oliveira; Venâncio, Tiago; Gremião, Maria Palmira Daflon

    2017-10-01

This work aimed to develop a calcium alginate hydrogel as a pH-responsive delivery system for polymyxin B (PMX) sustained release through the vaginal route. Two samples of sodium alginate from different suppliers were characterized. The molecular weight and M/G ratio determined were, approximately, 107 kDa and 1.93 for alginate_S and 32 kDa and 1.36 for alginate_V. Polymer rheological investigations were further performed through the preparation of hydrogels. Alginate_V was selected for subsequent incorporation of PMX because it yielded a pseudoplastic viscous system able to acquire a differentiated structure in a simulated vaginal microenvironment (pH 4.5). The PMX-loaded hydrogel (hydrogel_PMX) was engineered based on polyelectrolyte complex (PEC) formation between alginate and PMX followed by crosslinking with calcium chloride. This system exhibited a morphology with variable pore sizes, ranging from 100 to 200 μm, and adequate syringeability. The hydrogel's liquid uptake ability in an acid environment was minimized by the previous PEC formation. In vitro tests evidenced the hydrogel's mucoadhesiveness. PMX release was pH-dependent and the system was able to sustain the release for up to 6 days. A burst release was observed at pH 7.4 and drug release was driven by an anomalous transport, as determined by the Korsmeyer-Peppas model. At pH 4.5, drug release correlated with the Weibull model and drug transport was driven by Fickian diffusion. The calcium alginate hydrogels engineered by the previous formation of PECs showed to be a promising platform for sustained release of cationic drugs through vaginal administration.
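The Korsmeyer-Peppas fitting mentioned above amounts to a log-linear least-squares fit of Mt/M∞ = k·tⁿ. The data below are synthetic, not the hydrogel measurements; as a rough guide (thresholds depend on device geometry), n near 0.5 indicates Fickian diffusion while 0.5 < n < 1 indicates anomalous transport, matching the pH-dependent behavior reported above:

```python
import numpy as np

def fit_korsmeyer_peppas(t, frac_released):
    """Fit Mt/Minf = k * t**n by log-linear least squares
    (normally applied only to the first ~60% of release)."""
    slope, intercept = np.polyfit(np.log(t), np.log(frac_released), 1)
    return float(np.exp(intercept)), float(slope)   # (k, n)

t = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # hours (synthetic time points)
frac = 0.20 * t ** 0.45                     # synthetic release curve
k, n = fit_korsmeyer_peppas(t, frac)        # recovers k = 0.20, n = 0.45
```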

  14. Trial of labour and vaginal birth after previous caesarean section: A population based study of Eastern African immigrants in Victoria, Australia.

    Science.gov (United States)

    Belihu, Fetene B; Small, Rhonda; Davey, Mary-Ann

    2017-03-01

Variations in caesarean section (CS) rates between some immigrant groups and receiving country populations have been widely reported. Often, African immigrant women are at higher risk of CS than the receiving population in developed countries. However, evidence about subsequent mode of birth following CS for African women post-migration is lacking. The objective of this study was to examine differences in attempted and successful vaginal birth after previous caesarean (VBAC) for Eastern African immigrants (Eritrea, Ethiopia, Somalia and Sudan) compared with Australian-born women. A population-based observational study was conducted using the Victorian Perinatal Data Collection. Pearson's chi-square test and logistic regression analysis were performed to generate adjusted odds ratios for attempted and successful VBAC. Victoria, Australia. 554 Eastern African immigrants and 24,587 Australian-born eligible women with previous CS having singleton births in public care. 41.5% of Eastern African immigrant women and 26.1% of Australian-born women attempted a VBAC, with 50.9% of Eastern African immigrants and 60.5% of Australian-born women being successful. After adjusting for maternal demographic characteristics and available clinical confounding factors, Eastern African immigrants were more likely to attempt (OR adj 1.94, 95% CI 1.57-2.47) but less likely to succeed (OR adj 0.54, 95% CI 0.41-0.71) in having a VBAC. There are disparities in attempted and successful VBAC between Eastern African origin and Australian-born women. Unsuccessful VBAC attempt is more common among Eastern African immigrants, suggesting the need for improved strategies to select and support potential candidates for vaginal birth among these immigrants to enhance success and reduce potential complications associated with failed VBAC attempt. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
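The crude (unadjusted) odds ratios can be reproduced from the percentages reported above; they land close to the adjusted values of 1.94 and 0.54, which additionally account for demographic and clinical confounders:

```python
def odds_ratio(p1, p2):
    """Crude odds ratio comparing two proportions (no covariate adjustment)."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Percentages reported above: attempted VBAC 41.5% vs 26.1%,
# successful VBAC 50.9% vs 60.5% (Eastern African vs Australian-born).
or_attempt = odds_ratio(0.415, 0.261)   # ~2.01 (adjusted: 1.94)
or_success = odds_ratio(0.509, 0.605)   # ~0.68 (adjusted: 0.54)
```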

  15. Deformation Measurements of Gabion Walls Using Image Based Modeling

    Directory of Open Access Journals (Sweden)

    Marek Fraštia

    2014-06-01

The image based modeling finds use in applications where it is necessary to reconstruct the 3D surface of the observed object with a high level of detail. Previous experiments show relatively high variability of the results depending on the camera type used, the processing software, or the evaluation process. The authors tested the method of SFM (Structure from Motion) to determine the stability of gabion walls. The results of photogrammetric measurements were compared to precise geodetic point measurements.

  16. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

Measurement-based models built on real error data collected on a multiprocessor system are described. Model development from the raw error data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
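The Markov/semi-Markov distinction above comes down to holding-time distributions: a semi-Markov process permits non-exponential holding times in each state. A minimal sketch, with entirely invented parameters, is a two-state (up/down) alternating renewal process with Weibull holding times:

```python
import random

random.seed(42)  # reproducible illustration

def simulate_availability(cycles=20000, shape=1.5, scale_up=100.0, scale_down=10.0):
    """Two-state (up/down) alternating renewal process -- a minimal
    semi-Markov model -- with Weibull holding times instead of the
    exponential times a Markov chain would impose."""
    up = down = 0.0
    for _ in range(cycles):
        up += random.weibullvariate(scale_up, shape)      # operational period
        down += random.weibullvariate(scale_down, shape)  # recovery period
    return up / (up + down)

avail = simulate_availability()
# With equal Weibull shapes the Gamma factors cancel, so the long-run
# availability tends to scale_up / (scale_up + scale_down) = 100/110.
```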

  17. Fludarabine-based versus CHOP-like regimens with or without rituximab in patients with previously untreated indolent lymphoma: a retrospective analysis of safety and efficacy

    Directory of Open Access Journals (Sweden)

    Xu XX

    2013-10-01

Xiao-xiao Xu,1 Bei Yan,2 Zhen-xing Wang,3 Yong Yu,1 Xiao-xiong Wu,2 Yi-zhuo Zhang1 1Department of Hematology, Tianjin Medical University Cancer Institute and Hospital, Tianjin Key Laboratory of Cancer Prevention and Therapy, Tianjin; 2Department of Hematology, First Affiliated Hospital of Chinese People's Liberation Army General Hospital, Beijing; 3Department of Stomach Oncology, Tianjin Medical University Cancer Institute and Hospital, Key Laboratory of Cancer Prevention and Therapy, Tianjin, People's Republic of China. Abstract: Fludarabine-based regimens and CHOP (doxorubicin, cyclophosphamide, vincristine, prednisone)-like regimens, with or without rituximab, are the most common treatment modalities for indolent lymphoma. However, there is no clear evidence to date about which chemotherapy regimen should be the proper initial treatment of indolent lymphoma. More recently, the use of fludarabine has raised concerns due to its high number of toxicities, especially hematological toxicity and infectious complications. The present study aimed to retrospectively evaluate both the efficacy and the potential toxicities of the two main regimens (fludarabine-based and CHOP-like) in patients with previously untreated indolent lymphoma. Among a total of 107 patients assessed, 54 patients received fludarabine-based regimens (FLU arm) and 53 received CHOP or CHOPE (CHOP plus etoposide) regimens (CHOP arm). The results demonstrated that fludarabine-based regimens could induce significantly improved progression-free survival (PFS) compared with CHOP-like regimens. However, the FLU arm showed overall survival, complete response, and overall response rates similar to those of the CHOP arm. Grade 3-4 neutropenia occurred in 42.6% of the FLU arm and 7.5% of the CHOP arm. Age over 60 years and presentation of grade 3-4 myelosuppression were independent risk factors for infection, and the FLU arm had significantly

  18. An USB-based time measurement system

    International Nuclear Information System (INIS)

    Qin Xi; Liu Shubin; An Qi

    2010-01-01

In this paper, we report the electronics of a timing measurement system, the PTB (portable TDC board), a handy USB-based tool customized for high-precision time measurements without any crates. The time digitization is based on the High Performance TDC chip (HPTDC). The real-time compensation for HPTDC outputs and the USB master logic are implemented in an Altera Cyclone FPGA. The architecture design and logic design are described in detail. Tests of the system showed a time resolution of 13.3 ps. (authors)
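One standard way to derive a TDC bin-width correction of the kind applied to TDC outputs is the code-density test: flood the TDC with hits uncorrelated with its clock, so each bin's hit count is proportional to its true width, and build a correction lookup table from the cumulative widths. The sketch below is generic and illustrative, not the PTB firmware:

```python
import numpy as np

def code_density_lut(hist, period_ps):
    """Corrected bin centers (ps) from a code-density histogram: with
    uniformly random input hits, each bin's true width is proportional
    to its hit count; cumulative sums locate the true bin edges."""
    widths = hist / hist.sum() * period_ps
    edges = np.concatenate(([0.0], np.cumsum(widths)))
    return (edges[:-1] + edges[1:]) / 2.0

# Illustrative 8-bin TDC segment spanning a 200 ps period with
# non-uniform bin widths (differential nonlinearity).
hist = np.array([100, 150, 100, 50, 100, 200, 150, 150])
lut = code_density_lut(hist, 200.0)   # corrected centers, e.g. 10.0 ps first
```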

  19. Toward Measuring Network Aesthetics Based on Symmetry

    Directory of Open Access Journals (Sweden)

    Zengqiang Chen

    2017-05-01

In this exploratory paper, we discuss quantitative graph-theoretical measures of network aesthetics. Related work in this area has typically focused on geometrical features (e.g., line crossings or edge bendiness) of drawings or visual representations of graphs, which purportedly affect an observer's perception. Here we take a very different approach: abandoning reliance on geometrical properties, we apply information-theoretic measures to abstract graphs and networks directly (rather than to their visual representations) as a means of capturing classical appreciation of structural symmetry. Examples are used solely to motivate the approach to measurement and to elucidate our symmetry-based mathematical theory of network aesthetics.

  20. Accuracy of magnetic resonance based susceptibility measurements

    Science.gov (United States)

    Erdevig, Hannah E.; Russek, Stephen E.; Carnicka, Slavka; Stupic, Karl F.; Keenan, Kathryn E.

    2017-05-01

Magnetic Resonance Imaging (MRI) is increasingly used to map the magnetic susceptibility of tissue to identify cerebral microbleeds associated with traumatic brain injury and pathological iron deposits associated with neurodegenerative diseases such as Parkinson's and Alzheimer's disease. Accurate measurements of susceptibility are important for determining oxygen and iron content in blood vessels and brain tissue for use in noninvasive clinical diagnosis and treatment assessments. Induced magnetic fields, with amplitudes on the order of 100 nT, can be detected using MRI phase images. The induced field distributions can then be inverted to obtain quantitative susceptibility maps. The focus of this research was to determine the accuracy of MRI-based susceptibility measurements using simple phantom geometries and to compare the susceptibility measurements with magnetometry measurements, where SI-traceable standards are available. The susceptibilities of paramagnetic salt solutions in cylindrical containers were measured as a function of orientation relative to the static MRI field. The observed induced fields as a function of orientation of the cylinder were in good agreement with simple models. The MRI susceptibility measurements were compared with SQUID magnetometry using NIST-traceable standards. MRI can accurately measure relative magnetic susceptibilities, while SQUID magnetometry measures absolute magnetic susceptibility. Given the accuracy of moment measurements of tissue-mimicking samples, and the need to look at small differences in tissue properties, the use of existing NIST standard reference materials to calibrate MRI reference structures is problematic and better reference materials are required.
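The simple model for a long cylinder at angle θ to B0 is the standard result ΔB = Δχ·B0·(3cos²θ − 1)/6 for the field shift inside the cylinder; it vanishes at the magic angle and, for an illustrative 1 ppm solution at 3 T (values assumed here, not taken from the study), gives shifts on the nanotesla scale quoted above:

```python
import math

def induced_field_shift(delta_chi, b0, theta):
    """Field shift (tesla) inside an infinitely long cylinder with
    susceptibility difference delta_chi (SI, dimensionless), long axis
    at angle theta (radians) to the static field b0 (tesla)."""
    return delta_chi * b0 * (3.0 * math.cos(theta) ** 2 - 1.0) / 6.0

B0 = 3.0      # tesla -- typical clinical field strength (assumed)
dchi = 1e-6   # 1 ppm susceptibility difference (illustrative)

parallel = induced_field_shift(dchi, B0, 0.0)                        # 1000 nT
magic = induced_field_shift(dchi, B0, math.acos(1 / math.sqrt(3)))   # ~0 nT
```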

  1. Image based method for aberration measurement of lithographic tools

    Science.gov (United States)

    Xu, Shuang; Tao, Bo; Guo, Yongxing; Li, Gongfa

    2018-01-01

Information on the lens aberration of lithographic tools is important, as it directly affects the intensity distribution in the image plane. Zernike polynomials are commonly used for a mathematical description of lens aberrations. Due to the advantages of lower cost and easier implementation, image based measurement techniques have been widely used. Lithographic tools are typically partially coherent systems that can be described by a bilinear model, which entails time-consuming calculations and does not yield a simple and intuitive relationship between lens aberrations and the resulting images. Previous methods for retrieving lens aberrations in such partially coherent systems involve through-focus image measurements and time-consuming iterative algorithms. In this work, we propose a method for aberration measurement in lithographic tools which only requires measuring two images of the intensity distribution. Two linear formulations are derived in matrix form that directly relate the measured images to the unknown Zernike coefficients. Consequently, an efficient non-iterative solution is obtained.
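The appeal of a linear formulation is that retrieval collapses to one least-squares solve. The toy model below uses a synthetic sensitivity matrix and baseline as stand-ins (the paper's actual matrices come from the lithographic imaging model, which is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

n_pix, n_zern = 200, 9
A = rng.normal(size=(n_pix, n_zern))           # synthetic sensitivity matrix
baseline = rng.normal(size=n_pix)              # aberration-free intensity
z_true = rng.normal(scale=0.05, size=n_zern)   # "Zernike coefficients"

measured = A @ z_true + baseline               # simulated measured image

# Non-iterative retrieval: subtract the known baseline, solve least squares.
z_est, *_ = np.linalg.lstsq(A, measured - baseline, rcond=None)
```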

  2. A SVD Based Image Complexity Measure

    DEFF Research Database (Denmark)

    Gustafsson, David Karl John; Pedersen, Kim Steenstrup; Nielsen, Mads

    2009-01-01

Images are composed of geometric structures and texture, and different image processing tools - such as denoising, segmentation and registration - are suitable for different types of image contents. Characterization of the image content in terms of geometric structure and texture is an important problem that one is often faced with. We propose a patch based complexity measure, based on how well the patch can be approximated using singular value decomposition. As such the image complexity is determined by the complexity of the patches. The concept is demonstrated on sequences from the newly collected DIKU Multi-Scale image database.
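A minimal version of such a measure scores a patch by the singular-value energy not captured by a rank-k approximation; the exact normalization used in the paper may differ:

```python
import numpy as np

def patch_complexity(patch, k=1):
    """Fraction of singular-value energy NOT captured by a rank-k
    approximation: ~0 for flat/low-rank patches, larger for texture."""
    s = np.linalg.svd(patch, compute_uv=False)
    return float(1.0 - s[:k].sum() / s.sum())

rng = np.random.default_rng(0)
smooth = np.outer(np.linspace(0, 1, 16), np.linspace(1, 2, 16))  # rank-1 ramp
texture = rng.normal(size=(16, 16))                              # full-rank noise
# patch_complexity(smooth) is ~0; patch_complexity(texture) is much larger.
```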

  3. Ordinal-Measure Based Shape Correspondence

    Directory of Open Access Journals (Sweden)

    Faouzi Alaya Cheikh

    2002-04-01

We present a novel approach to shape similarity estimation based on distance transformation and ordinal correlation. The proposed method operates in three steps: object alignment, contour-to-multilevel image transformation, and similarity evaluation. This approach is suitable for use in shape classification, content-based image retrieval, and performance evaluation of segmentation algorithms. The latter two applications are addressed in this paper. Simulation results show that in both applications our proposed measure performs quite well in quantifying shape similarity. The scores obtained using this technique reflect well the correspondence between object contours as humans perceive it.

  4. Green maritime transportation: Market based measures

    DEFF Research Database (Denmark)

    Psaraftis, Harilaos N.

    2016-01-01

The purpose of this chapter is to introduce the concept of Market Based Measures (MBMs) to reduce Green House Gas (GHG) emissions from ships, and to review several distinct MBM proposals that have been under consideration by the International Maritime Organization (IMO). The chapter discusses the mechanisms used by MBMs and explores how the concept of the Marginal Abatement Cost (MAC) can be linked to MBMs. It also attempts to discuss the pros and cons of the submitted proposals.

  5. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
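The simplest member of the divergence-statistic family mentioned above is the likelihood-ratio (G) statistic, which is 2N times the Kullback-Leibler divergence between observed and expected frequencies and is compared to the same chi-square reference distribution as Pearson's statistic:

```python
import math

def g_statistic(observed, expected):
    """Likelihood-ratio statistic G = 2 * sum(O * ln(O/E)), a
    divergence-based alternative to Pearson's chi-square statistic."""
    return 2.0 * sum(o * math.log(o / e) for o, e in zip(observed, expected))

# 100 observations in two categories against a 50/50 null hypothesis;
# compare G to a chi-square distribution with 1 degree of freedom.
G = g_statistic([30, 70], [50, 50])   # ~16.46 (Pearson's X^2 gives 16.0)
```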

  6. Bioimpedance measurement based evaluation of wound healing.

    Science.gov (United States)

    Kekonen, Atte; Bergelin, Mikael; Eriksson, Jan-Erik; Vaalasti, Annikki; Ylänen, Heimo; Viik, Jari

    2017-06-22

Our group has developed a bipolar bioimpedance measurement-based method for determining the state of wound healing. The objective of this study was to assess the capability of the method. To assess its performance, we arranged a follow-up study of four acute wounds. The wounds were measured using the method and photographed throughout the healing process. Initially, the bioimpedance of the wounds was significantly lower than the impedance of the undamaged skin, used as a baseline. Gradually, as healing progressed, the wound impedance increased and finally reached the impedance of the undamaged skin. The clinical appearance of the wounds examined in this study corresponded well with the parameters derived from the bioimpedance data. Hard-to-heal wounds are a significant and growing socioeconomic burden, especially in the developed countries, due to aging populations and the increasing prevalence of various lifestyle related diseases. The assessment and monitoring of chronic wounds are mainly based on visual inspection by medical professionals. The dressings covering the wound must be removed before assessment; this may disturb the wound healing process and significantly increases the work effort of the medical staff. There is a need for an objective and quantitative method for determining the status of a wound without removing the wound dressings. This study provided evidence of the capability of the bioimpedance based method for assessing wound status. In the future, measurements with the method should be extended to hard-to-heal wounds.

  7. Sunburn and sun-protective behaviors among adults with and without previous nonmelanoma skin cancer (NMSC): A population-based study.

    Science.gov (United States)

    Fischer, Alexander H; Wang, Timothy S; Yenokyan, Gayane; Kang, Sewon; Chien, Anna L

    2016-08-01

Individuals with previous nonmelanoma skin cancer (NMSC) are at increased risk for subsequent skin cancer, and should therefore limit ultraviolet exposure. We sought to determine whether individuals with previous NMSC engage in better sun protection than those with no skin cancer history. We pooled self-reported data (2005 and 2010 National Health Interview Surveys) from US non-Hispanic white adults (758 with and 34,161 without previous NMSC). We calculated adjusted prevalence odds ratios (aPOR) and 95% confidence intervals (CI), taking into account the complex survey design. Individuals with previous NMSC, versus those with no history of NMSC, had higher rates of frequent use of shade (44.3% vs 27.0%; aPOR 1.41; 95% CI 1.16-1.71), long sleeves (20.5% vs 7.7%; aPOR 1.55; 95% CI 1.21-1.98), a wide-brimmed hat (26.1% vs 10.5%; aPOR 1.52; 95% CI 1.24-1.87), and sunscreen (53.7% vs 33.1%; aPOR 2.11; 95% CI 1.73-2.59), but did not have significantly lower odds of recent sunburn (29.7% vs 40.7%; aPOR 0.95; 95% CI 0.77-1.17). Among those with previous NMSC, recent sunburn was inversely associated with age, sun avoidance, and shade but not sunscreen. Limitations include self-reported cross-sectional data and the lack of information quantifying regular sun exposure. Physicians should emphasize sunburn prevention when counseling patients with previous NMSC, especially younger adults, focusing on shade and sun avoidance over sunscreen. Copyright © 2016 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  8. Measuring globalization-based acculturation in Ladakh

    DEFF Research Database (Denmark)

    Ozer, Simon; Schwartz, Seth

    2016-01-01

Theories and methodologies within acculturation psychology have been advanced in order to capture the complex process of intercultural contact in various contexts. Differentiating globalization-based acculturation from immigrant-based acculturation has broadened the field of acculturation psychology to include groups who are exposed to global cultural streams without international migration. The globalization-based acculturation process in the North Indian region of Ladakh appears to be a tricultural encounter, suggesting an addendum to the bidimensional acculturation model for this group (and perhaps for others as well). This study explores the development, usability, and validity of a tridimensional acculturation measure aiming to capture the multicultural orientations initiated by the process of globalization in Ladakh. The tridimensional acculturation scale was found to fit the data significantly better

  9. Property-Based Software Engineering Measurement

    Science.gov (United States)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
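The property-based idea can be made concrete by checking a candidate measure against a defining property. For instance, a size measure is expected to be additive over disjoint modules; the toy line-count measure below is purely illustrative, not one of the paper's definitions:

```python
def size(module):
    """Toy size measure: number of statements in a module."""
    return len(module)

mod_a = ["x = 1", "y = 2"]
mod_b = ["z = x + y"]

# Additivity property for size measures: when modules A and B are
# disjoint, size(A u B) must equal size(A) + size(B).
additive = size(mod_a + mod_b) == size(mod_a) + size(mod_b)
```

A candidate "size" metric that fails this check (for example, one that counts shared library code twice) would, under the framework, not qualify as a size measure at all.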

  10. Two new prediction rules for spontaneous pregnancy leading to live birth among subfertile couples, based on the synthesis of three previous models.

    NARCIS (Netherlands)

    C.C. Hunault; J.D.F. Habbema (Dik); M.J.C. Eijkemans (René); J.A. Collins (John); J.L.H. Evers (Johannes); E.R. te Velde (Egbert)

    2004-01-01

BACKGROUND: Several models have been published for the prediction of spontaneous pregnancy among subfertile patients. The aim of this study was to broaden the empirical basis for these predictions by making a synthesis of three previously published models. METHODS:

  11. Acceleration and Orientation Jumping Performance Differences Among Elite Professional Male Handball Players With or Without Previous ACL Reconstruction: An Inertial Sensor Unit-Based Study.

    Science.gov (United States)

    Setuain, Igor; González-Izal, Miriam; Alfaro, Jesús; Gorostiaga, Esteban; Izquierdo, Mikel

    2015-12-01

Handball is one of the most challenging sports for the knee joint. Persistent biomechanical and jumping-capacity alterations can be observed in athletes with an anterior cruciate ligament (ACL) injury. Commonly identified jumping biomechanical alterations have been described through the use of laboratory technologies. However, portable and easy-to-handle technologies that enable evaluation of jumping biomechanics at the training field are lacking. To analyze unilateral/bilateral acceleration and orientation jumping performance differences among elite male handball athletes with or without previous ACL reconstruction via a single inertial sensor unit device. Case-control descriptive study. At the athletes' usual training court. Twenty-two elite male handball players (6 ACL-reconstructed and 16 uninjured control players) were evaluated. The participants performed a vertical jump test battery that included a 50-cm vertical bilateral drop jump, a 20-cm vertical unilateral drop jump, and vertical unilateral countermovement jump maneuvers. Peak 3-dimensional (X, Y, Z) acceleration (m·s(-2)), jump phase duration, and 3-dimensional orientation values (°) were obtained from the inertial sensor unit device. Two-tailed t-tests and a one-way analysis of variance were performed to compare means. Elite male handball athletes with previous ACL reconstruction demonstrated a jumping biomechanical profile similar to that of control players, including similar jumping performance values in both bilateral and unilateral jumping maneuvers, several years after ACL reconstruction. These findings are in agreement with previous research showing full functional restoration of abilities in top-level male athletes after ACL reconstruction, rehabilitation, and subsequent return to sports at the previous level. Copyright © 2015 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  12. Effect of feeding different cereal-based diets on the performance and gut health of weaned piglets with or without previous access to creep feed during lactation.

    Science.gov (United States)

    Torrallardona, D; Andrés-Elias, N; López-Soria, S; Badiola, I; Cerdà-Cuéllar, M

    2012-12-01

A trial was conducted to evaluate the effect of different cereals on the performance, gut mucosa, and microbiota of weanling pigs with or without previous access to creep feed during lactation. A total of 108 newly weaned pigs (7.4 kg BW; 26 d of age; half with and half without creep feed) were used. Piglets were distributed by BW into 36 pens according to a 2 × 6 factorial arrangement of treatments with previous access to creep feed (with or without) and cereal source in the experimental diet [barley (Hordeum vulgare), rice (Oryza sativa)-wheat (Triticum aestivum) bran, corn (Zea mays), naked oats (Avena sativa), oats, or rice] as main factors. Pigs were offered the experimental diets for 21 d and performance was monitored. At day 21, 4 piglets from each treatment were killed and sampled for the histological evaluation of jejunal mucosa and the study of ileal and cecal microbiota by RFLP. The Manhattan distances between RFLP profiles were calculated and intragroup similarities (IGS) were estimated for each treatment. An interaction between cereal source and previous creep feeding was observed for ADFI (P < 0.05): whereas creep feeding increased ADFI for the rice-wheat bran diet, it reduced it for naked oats. No differences in mucosal morphology were observed except for deeper crypts in pigs that did not have previous access to creep feed (P < 0.05). An interaction between creep feeding and cereal was also observed for the IGS of the cecal microbiota at day 21 (P < 0.05): creep feed reduced IGS in the piglets fed oats or barley, but no differences were observed for the other cereal sources. It is concluded that the effect of creep feeding during lactation on the performance and the microbiota of piglets after weaning is dependent on the nature of the cereal in the postweaning diet.

  13. Korean Clinic Based Outcome Measure Studies

    Directory of Open Access Journals (Sweden)

    Jongbae Park

    2003-02-01

Full Text Available Background: Evidence based medicine has become a main tool for medical practice. However, conducting a study highly ranked in the evidence hierarchy pyramid is not easy or feasible at all times and places. There remains room for descriptive clinical outcome measure studies, with due acknowledgement of the limits of their interpretation. Aims: To present three Korean clinic based outcome measure studies with a view to encouraging Korean clinicians to conduct similar studies. Methods: Three studies are presented briefly here: 1) quality of life of liver cancer patients after 8 Constitutional acupuncture; 2) developing a Korean version of the Measure Yourself Medical Outcome Profile (MYMOP); and 3) a pilot survey on the 5 Shu points. In the first study, we included 4 primary or secondary liver cancer patients, collecting their diagnostic X-ray films and clinical data from their hospital, and asked them to fill in the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire before the commencement of the treatment. The acupuncture treatment follows a set format but has not yet been disclosed. In translating and developing a Korean version of an outcome measure that is friendly to Korean clinicians, MYMOP was judged one of the most appropriate. Permission was granted, the translation into Korean was done, and it was then back-translated into English, based solely on the Korean translation, by a researcher bilingual in both languages. The back-translation was compared with the original by the developer of MYMOP and confirmed usable. In order to test the existence of acupoints and meridians through popular forms of Korean acupuncture regimes, we aim to collect opinions from 101 Korean clinicians who have used those forms. The questions asked include the most effective symptoms, the 5 Shu points, the points least likely to be used owing to either adverse events or lack of effectiveness, the theoretical reasons for the above proposals, and proposed outcome measures

  14. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  15. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.
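
The Carleton metric itself is not reproduced in this record, but the general idea of quantifying how strongly dependencies concentrate within modules can be sketched with Newman's modularity Q over a file-dependency graph. This is a standard construct offered only as an illustration, not the article's metric; the toy graph and package assignment below are invented.

```python
import numpy as np

def newman_modularity(adj, communities):
    """Newman's modularity Q for an undirected dependency graph.

    adj: symmetric 0/1 adjacency matrix over source files;
    communities: package index assigned to each file.
    Q approaches 1 when most dependencies stay inside a package."""
    m = adj.sum() / 2.0                      # number of undirected edges
    k = adj.sum(axis=1)                      # node degrees
    q = 0.0
    for i in range(len(communities)):
        for j in range(len(communities)):
            if communities[i] == communities[j]:
                q += adj[i, j] - k[i] * k[j] / (2.0 * m)
    return q / (2.0 * m)

# Two tightly coupled "packages" joined by one cross-dependency
adj = np.zeros((6, 6))
for a, b in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    adj[a, b] = adj[b, a] = 1.0

print(newman_modularity(adj, [0, 0, 0, 1, 1, 1]))  # ≈ 0.357
```

Comparing Q across releases of the same code base gives a rough trajectory of modular evolution, which is the kind of longitudinal analysis the article applies to Apache Tomcat.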

  16. Augment clinical measurement using a constraint-based esophageal model

    Science.gov (United States)

    Kou, Wenjun; Acharya, Shashank; Kahrilas, Peter; Patankar, Neelesh; Pandolfino, John

    2017-11-01

Quantifying the mechanical properties of the esophageal wall is crucial to understanding the impairments of trans-esophageal flow characteristic of several esophageal diseases. However, these data are unavailable owing to technological limitations of current clinical diagnostic instruments, which instead display esophageal luminal cross-sectional area based on intraluminal impedance change. In this work, we developed an esophageal model to predict bolus flow and wall properties based on clinical measurements. The model uses the constraint-based immersed-boundary method developed previously by our group. Specifically, we first approximate the time-dependent wall geometry from impedance planimetry data on luminal cross-sectional area. We then feed these, along with pressure data, into the model and compute wall tension from the simulated pressure and flow fields, and the material property from the stress-strain relationship. As examples, we applied this model to augment FLIP (Functional Luminal Imaging Probe) measurements in three clinical cases: a normal subject, achalasia, and eosinophilic esophagitis (EoE). Our findings suggest that wall stiffness was greatest in the EoE case, followed by the achalasia case, and then the normal subject. This work is supported by NIH Grants R01 DK56033 and R01 DK079902.
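
The record gives no equations, and the actual study runs a full immersed-boundary simulation. For orientation only, the first-order relation commonly used to estimate circumferential wall tension in a pressurized thin-walled cylindrical tube (the law of Laplace) can be sketched as follows; the numbers are invented:

```python
def wall_tension(transmural_pressure_pa, radius_m):
    """Law of Laplace for a thin-walled cylinder: T = deltaP * r.

    Returns circumferential tension per unit length (N/m)."""
    return transmural_pressure_pa * radius_m

# e.g. 2 kPa transmural pressure across a lumen of radius 8 mm
t = wall_tension(2000.0, 0.008)   # 16 N/m
```

Pairing such a tension estimate with the measured luminal distension is the kind of stress-strain pairing from which a stiffness (slope) can be read off.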

  17. Link-Based Similarity Measures Using Reachability Vectors

    Directory of Open Access Journals (Sweden)

    Seok-Ho Yoon

    2014-01-01

Full Text Available We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link-based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to all the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures.
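
The construction described in this abstract, reachability vectors computed by Random Walk with Restart and compared by cosine similarity, can be sketched directly. A minimal sketch on an invented four-node link graph; the restart probability and iteration count are illustrative choices:

```python
import numpy as np

def rwr_vector(A, start, restart=0.15, iters=100):
    """Reachability vector for one node via Random Walk with Restart.

    A is a row-stochastic transition matrix over the link graph; the
    i-th entry of the result is the steady-state probability of being
    at node i when every walk restarts from `start`."""
    n = A.shape[0]
    e = np.zeros(n)
    e[start] = 1.0
    p = e.copy()
    for _ in range(iters):
        p = (1.0 - restart) * A.T @ p + restart * e
    return p

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy link graph: 0->1, 0->2, 1->2, 2->0, 3->2
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 0],
                [0, 0, 1, 0]], float)
A = adj / adj.sum(axis=1, keepdims=True)   # row-stochastic

vecs = [rwr_vector(A, i) for i in range(4)]
print(cosine(vecs[0], vecs[3]))            # similarity of nodes 0 and 3
```

Each object's vector sums to 1 (it is a probability distribution), and the cosine of two such vectors gives the link-based similarity the paper proposes.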

  18. Proportion of U.S. Civilian Population Ineligible for U.S. Air Force Enlistment Based on Current and Previous Weight Standards

    National Research Council Canada - National Science Library

    D'Mello, Tiffany A; Yamane, Grover K

    2007-01-01

    .... Until recently, gender-specific weight standards based on height were in place. However, in June 2006 the USAF implemented a new set of height-weight limits utilizing body mass index (BMI) criteria...

  19. Linear systems a measurement based approach

    CERN Document Server

    Bhattacharyya, S P; Mohsenizadeh, D N

    2014-01-01

    This brief presents recent results obtained on the analysis, synthesis and design of systems described by linear equations. It is well known that linear equations arise in most branches of science and engineering as well as social, biological and economic systems. The novelty of this approach is that no models of the system are assumed to be available, nor are they required. Instead, a few measurements made on the system can be processed strategically to directly extract design values that meet specifications without constructing a model of the system, implicitly or explicitly. These new concepts are illustrated by applying them to linear DC and AC circuits, mechanical, civil and hydraulic systems, signal flow block diagrams and control systems. These applications are preliminary and suggest many open problems. The results presented in this brief are the latest effort in this direction and the authors hope these will lead to attractive alternatives to model-based design of engineering and other systems.

  20. Heterogeneity Measurement Based on Distance Measure for Polarimetric SAR Data

    Science.gov (United States)

    Xing, Xiaoli; Chen, Qihao; Liu, Xiuguo

    2018-04-01

To effectively assess scene heterogeneity in polarimetric synthetic aperture radar (PolSAR) data, this paper introduces a distance measure that exploits the similarity between a sample and its surrounding pixels. Moreover, to account for the data distribution and to model texture, a K distance measure is derived from the Wishart distance measure. Specifically, the average of the pixels in a local window replaces the class-center coherency or covariance matrix, and the Wishart and K distances are computed between this average matrix and the individual pixels. The ratio of the standard deviation to the mean is then formed for the Wishart and K distances, and these two features are defined to reflect the complexity of the scene. The proposed heterogeneity measure integrates the two features using the Pauli basis. Experiments conducted on single-look and multilook PolSAR data demonstrate the effectiveness of the proposed method for detecting scene heterogeneity.
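
The coefficient-of-variation idea can be sketched with the Wishart distance d(C, Σ) = ln|Σ| + tr(Σ⁻¹C) between each pixel covariance and the local-window average. Real PolSAR pixels are 3×3 complex coherency matrices; the toy matrices below are invented, and the K-distance variant is omitted:

```python
import numpy as np

def wishart_distance(C, Sigma):
    """Wishart distance from a pixel covariance C to the window mean:
    d = ln|Sigma| + tr(Sigma^-1 C)."""
    return float(np.log(np.linalg.det(Sigma)).real
                 + np.trace(np.linalg.inv(Sigma) @ C).real)

def heterogeneity(window):
    """Ratio of standard deviation to mean of the Wishart distances
    between each pixel and the local-window average matrix."""
    Sigma = sum(window) / len(window)
    d = np.array([wishart_distance(C, Sigma) for C in window])
    return d.std() / d.mean()

homogeneous = [np.eye(3)] * 9                      # identical pixels
mixed = [np.eye(3)] * 5 + [4.0 * np.eye(3)] * 4    # two scattering regimes
print(heterogeneity(homogeneous))  # 0.0
print(heterogeneity(mixed))        # > 0
```

A homogeneous window yields identical distances (zero ratio), while a window mixing scattering regimes spreads the distances and raises the ratio, which is exactly what the proposed features are built to detect.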

  1. Using satellite-based measurements to explore ...

    Science.gov (United States)

New particle formation (NPF) can potentially alter regional climate by increasing aerosol particle (hereafter particle) number concentrations and ultimately cloud condensation nuclei. The large scales on which NPF is manifest indicate potential to use satellite-based (inherently spatially averaged) measurements of atmospheric conditions to diagnose the occurrence of NPF and NPF characteristics. We demonstrate the potential for using satellite measurements of insolation (UV), trace gas concentrations (sulfur dioxide (SO2), nitrogen dioxide (NO2), ammonia (NH3), formaldehyde (HCHO), ozone (O3)), aerosol optical properties (aerosol optical depth (AOD), Ångström exponent (AE)), and a proxy of biogenic volatile organic compound emissions (leaf area index (LAI), temperature (T)) as predictors for NPF characteristics: formation rates, growth rates, survival probabilities, and ultrafine particle (UFP) concentrations at five locations across North America. NPF at all sites is most frequent in spring, exhibits a one-day autocorrelation, and is associated with low condensational sink (AOD×AE) and HCHO concentrations, and high UV. However, there are important site-to-site variations in NPF frequency and characteristics, and in which of the predictor variables (particularly gas concentrations) significantly contribute to the explanatory power of regression models built to predict those characteristics. This finding may provide a partial explanation for the reported spatia

  2. Five-year efficacy and safety of tenofovir-based salvage therapy for patients with chronic hepatitis B who previously failed LAM/ADV therapy.

    Science.gov (United States)

    Lim, Lucy; Thompson, Alexander; Patterson, Scott; George, Jacob; Strasser, Simone; Lee, Alice; Sievert, William; Nicoll, Amanda; Desmond, Paul; Roberts, Stuart; Marion, Kaye; Bowden, Scott; Locarnini, Stephen; Angus, Peter

    2017-06-01

Multidrug-resistant HBV continues to be an important clinical problem. The TDF-109 study demonstrated that TDF±LAM is an effective salvage therapy through 96 weeks for LAM-resistant patients who previously failed ADV add-on or switch therapy. We evaluated the 5-year efficacy and safety outcomes in patients receiving long-term TDF±LAM in the TDF-109 study. A total of 59 patients completed the first phase of the TDF-109 study and 54/59 were rolled over into a long-term prospective open-label study of TDF±LAM 300 mg daily. Results are reported at the end of year 5 of treatment. At year 5, 75% (45/59) had achieved viral suppression by intent-to-treat analysis. Per-protocol assessment revealed 83% (45/54) were HBV DNA undetectable. Nine patients remained HBV DNA detectable; however, 8/9 had very low HBV DNA levels (<264 IU/mL) and did not meet virological criteria for virological breakthrough (VBT). One patient experienced VBT, but this was in the setting of documented non-compliance. The response was independent of baseline LAM therapy or mutations conferring ADV resistance. Four patients discontinued TDF, one patient was lost to follow-up and one died from hepatocellular carcinoma. Long-term TDF treatment appears to be safe and effective in patients with prior failure of LAM and a suboptimal response to ADV therapy. These findings confirm that TDF has a high genetic barrier to resistance, is active against multidrug-resistant HBV, and should be the preferred oral anti-HBV agent in CHB patients who fail treatment with LAM and ADV. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Multiparty correlation measure based on the cumulant

    International Nuclear Information System (INIS)

    Zhou, D. L.; Zeng, B.; Xu, Z.; You, L.

    2006-01-01

We propose a genuine multiparty correlation measure for a multiparty quantum system as the trace norm of the cumulant of the state. The legitimacy of our multiparty correlation measure is explicitly demonstrated by proving that it satisfies the five basic conditions required of a correlation measure. As an application, we construct an efficient algorithm for the calculation of our measure for all stabilizer states.

  4. Population-based Neisseria gonorrhoeae, Chlamydia trachomatis and Trichomonas vaginalis Prevalence Using Discarded, Deidentified Urine Specimens Previously Collected for Drug Testing (Open Access Publisher’s Version)

    Science.gov (United States)

    2017-10-24

Trichomonas vaginalis testing; Melinda Balansay-Ames, Chris Myers and Gary Brice for PCR-based sex determination testing; and Kimberly De Vera for…2017-053355. References: 1. Torrone E, Papp J, Weinstock H; Centers for Disease Control and Prevention (CDC). Prevalence of Chlamydia trachomatis genital infection among persons aged 14-39 years - United States, 2007-2012. MMWR Morb Mortal Wkly Rep 2014;63:834–7. 2. Rietmeijer CA, Hopkins E, Geisler WM

  5. Diagnostic Accuracy of Robot-Guided, Software Based Transperineal MRI/TRUS Fusion Biopsy of the Prostate in a High Risk Population of Previously Biopsy Negative Men

    Directory of Open Access Journals (Sweden)

    Malte Kroenig

    2016-01-01

Full Text Available Objective. In this study, we compared prostate cancer detection rates between MRI-TRUS fusion targeted and systematic biopsies using a robot-guided, software based transperineal approach. Methods and Patients. 52 patients received an MRI/TRUS fusion targeted biopsy followed by a systematic volume-adapted biopsy using the same robot-guided transperineal approach. The primary outcome was the detection rate of clinically significant disease (Gleason grade ≥ 4). Secondary outcomes were the detection rate of all cancers, sampling efficiency and utility, and the serious adverse event rate. Patients received no antibiotic prophylaxis. Results. From 52 patients, 519 targeted biopsies from 135 lesions and 1561 random biopsies were generated (total n=2080). The overall detection rate of clinically significant PCa was 44.2% (23/52) and 50.0% (26/52) for target and random biopsy, respectively. Sampling efficiency, as the median number of cores needed to detect clinically significant prostate cancer, was 9 for target (IQR: 6–14.0) and 32 (IQR: 24–32) for random biopsy. The utility, as the number of additionally detected clinically significant PCa cases by either strategy, was 0% (0/52) for target and 3.9% (2/52) for random biopsy. Conclusions. MRI/TRUS fusion based target biopsy did not show an advantage in the overall detection rate of clinically significant prostate cancer.

  6. New Diagnosis of AIDS Based on Salmonella enterica subsp. I (enterica) Enteritidis (A) Meningitis in a Previously Immunocompetent Adult in the United States

    Directory of Open Access Journals (Sweden)

    Andrew C. Elton

    2017-01-01

Full Text Available Salmonella meningitis is a rare manifestation of meningitis typically presenting in neonates and the elderly. This infection typically associates with foodborne outbreaks in developing nations and AIDS-endemic regions. We report a case of a 19-year-old male presenting with altered mental status after 3-day absence from work at a Wisconsin tourist area. He was febrile, tachycardic, and tachypneic with a GCS of 8. The patient was intubated and a presumptive diagnosis of meningitis was made. Treatment was initiated with ceftriaxone, vancomycin, acyclovir, dexamethasone, and fluid resuscitation. A lumbar puncture showed cloudy CSF with Gram negative rods. He was admitted to the ICU. CSF culture confirmed Salmonella enterica subsp. I (enterica) Enteritidis (A). Based on this finding, a 4th-generation HIV antibody/p24 antigen test was sent. When this returned positive, a CD4 count was obtained and showed 3 cells/mm3, confirming AIDS. The patient ultimately received 38 days of ceftriaxone, was placed on elvitegravir, cobicistat, emtricitabine, and tenofovir alafenamide (Genvoya) for HIV/AIDS, and was discharged neurologically intact after a 44-day admission.

  7. Cost-Effectiveness Model for Chemoimmunotherapy Options in Patients with Previously Untreated Chronic Lymphocytic Leukemia Unsuitable for Full-Dose Fludarabine-Based Therapy.

    Science.gov (United States)

    Becker, Ursula; Briggs, Andrew H; Moreno, Santiago G; Ray, Joshua A; Ngo, Phuong; Samanta, Kunal

    2016-06-01

    To evaluate the cost-effectiveness of treatment with anti-CD20 monoclonal antibody obinutuzumab plus chlorambucil (GClb) in untreated patients with chronic lymphocytic leukemia unsuitable for full-dose fludarabine-based therapy. A Markov model was used to assess the cost-effectiveness of GClb versus other chemoimmunotherapy options. The model comprised three mutually exclusive health states: "progression-free survival (with/without therapy)", "progression (refractory/relapsed lines)", and "death". Each state was assigned a health utility value representing patients' quality of life and a specific cost value. Comparisons between GClb and rituximab plus chlorambucil or only chlorambucil were performed using patient-level clinical trial data; other comparisons were performed via a network meta-analysis using information gathered in a systematic literature review. To support the model, a utility elicitation study was conducted from the perspective of the UK National Health Service. There was good agreement between the model-predicted progression-free and overall survival and that from the CLL11 trial. On incorporating data from the indirect treatment comparisons, it was found that GClb was cost-effective with a range of incremental cost-effectiveness ratios below a threshold of £30,000 per quality-adjusted life-year gained, and remained so during deterministic and probabilistic sensitivity analyses under various scenarios. GClb was estimated to increase both quality-adjusted life expectancy and treatment costs compared with several commonly used therapies, with incremental cost-effectiveness ratios below commonly referenced UK thresholds. This article offers a real example of how to combine direct and indirect evidence in a cost-effectiveness analysis of oncology drugs. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
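
The three-state structure described above lends itself to a compact cohort simulation. The sketch below is generic and illustrative only: the transition probabilities, per-cycle costs, and utilities are invented, not taken from the CLL11 trial or the published model.

```python
import numpy as np

def markov_ce(P, cost, utility, cycles=120, disc=0.035 / 12):
    """Run a cohort through a 3-state Markov model (progression-free,
    progressed, dead) and return discounted total cost and QALYs."""
    dist = np.array([1.0, 0.0, 0.0])          # everyone starts in PFS
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + disc) ** t          # per-cycle discount factor
        total_cost += df * dist @ cost
        total_qaly += df * dist @ utility
        dist = dist @ P                        # advance one monthly cycle
    return total_cost, total_qaly

# Strategy A (new chemoimmunotherapy) vs B (comparator): invented numbers
P_a = np.array([[0.97, 0.02, 0.01],
                [0.00, 0.95, 0.05],
                [0.00, 0.00, 1.00]])
P_b = np.array([[0.94, 0.04, 0.02],
                [0.00, 0.95, 0.05],
                [0.00, 0.00, 1.00]])
cost = np.array([2000.0, 1200.0, 0.0])        # £ per cycle in each state
util = np.array([0.80, 0.60, 0.0]) / 12.0     # QALYs per monthly cycle

c_a, q_a = markov_ce(P_a, cost, util)
c_b, q_b = markov_ce(P_b, cost, util)
icer = (c_a - c_b) / (q_a - q_b)              # £ per QALY gained
```

Comparing the resulting ICER against a willingness-to-pay threshold (e.g. £30,000 per QALY, as in the article) is the decision rule applied in analyses of this kind.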

  8. Assessing Therapist Competence: Development of a Performance-Based Measure and Its Comparison With a Web-Based Measure.

    Science.gov (United States)

    Cooper, Zafra; Doll, Helen; Bailey-Straebler, Suzanne; Bohn, Kristin; de Vries, Dian; Murphy, Rebecca; O'Connor, Marianne E; Fairburn, Christopher G

    2017-10-31

Recent research interest in how best to train therapists to deliver psychological treatments has highlighted the need for rigorous, but scalable, means of measuring therapist competence. There are at least two components involved in assessing therapist competence: the assessment of their knowledge of the treatment concerned, including how and when to use its strategies and procedures, and an evaluation of their ability to apply such knowledge skillfully in practice. While the assessment of therapists' knowledge has the potential to be completed efficiently on the Web, the assessment of skill has generally involved a labor-intensive process carried out by clinicians, and as such, may not be suitable for assessing training outcome in certain circumstances. The aims of this study were to develop and evaluate a role-play-based measure of skill suitable for assessing training outcome and to compare its performance with a highly scalable Web-based measure of applied knowledge. Using enhanced cognitive behavioral therapy (CBT-E) for eating disorders as an exemplar, clinical scenarios for role-play assessment were developed and piloted together with a rating scheme for assessing trainee therapists' performance. These scenarios were evaluated by examining the performance of 93 therapists from different professional backgrounds and at different levels of training in implementing CBT-E. These therapists also completed a previously developed Web-based measure of applied knowledge, and the ability of the Web-based measure to efficiently predict competence on the role-play measure was investigated. The role-play measure assessed performance at implementing a range of CBT-E procedures. The majority of the therapists rated their performance as moderately or closely resembling their usual clinical performance. Trained raters were able to achieve good-to-excellent reliability for averaged competence, with intraclass correlation coefficients ranging from .653 to .909. The measure was

  9. Orthogonality Measurement for Homogenous Projects-Bases

    Science.gov (United States)

    Ivan, Ion; Sandu, Andrei; Popa, Marius

    2009-01-01

The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which will then be used for analyzing projects. The indicators that are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…

  10. Laser-based measuring equipment controlled by microcomputer

    International Nuclear Information System (INIS)

    Miron, N.; Sporea, D.; Velculescu, V.G.; Petre, M.

    1988-03-01

Microcomputer-controlled, laser-based measuring equipment developed for industrial and scientific purposes is described. These instruments are intended for the verification of dial indicators, the measurement of graduated rules, and very accurate measurement of the gravitational constant. (authors)

  11. Electroencephalogram measurement using polymer-based dry microneedle electrode

    Science.gov (United States)

    Arai, Miyako; Nishinaka, Yuya; Miki, Norihisa

    2015-06-01

In this paper, we report a successful electroencephalogram (EEG) measurement using polymer-based dry microneedle electrodes. The electrodes consist of needle-shaped substrates of SU-8, a silver film, and a nanoporous parylene protective film. Unlike conventional wet electrodes, microneedle electrodes require neither skin preparation nor a conductive gel. As a structural material, SU-8 is harder than the poly(dimethylsiloxane) (PDMS; Dow Corning Toray Sylgard 184) used in our previous work, which facilitates the penetration of the needles through the stratum corneum. SU-8 microneedles can be inserted into the skin without breaking and maintain a sufficiently low skin-electrode contact impedance for EEG measurement. The electrodes successfully measured EEG from the frontal pole, and the quality of the acquired signals was verified to be as high as that obtained using commercially available wet electrodes, without any skin preparation or conductive gel. The electrodes are readily applicable to recording brain activity over long periods, with little of the stress that skin preparation imposes on users.

  12. Parkinson's disease detection based on dysphonia measurements

    Science.gov (United States)

    Lahmiri, Salim

    2017-04-01

Assessing dysphonic symptoms is a noninvasive and effective approach to detecting Parkinson's disease (PD) in patients. The main purpose of this study is to investigate the effect of different dysphonia measurements on PD detection by support vector machine (SVM). Seven categories of dysphonia measurements are considered. Experimental results from the ten-fold cross-validation technique demonstrate that vocal fundamental frequency statistics yield the highest accuracy of 88% ± 0.04. When all dysphonia measurements are employed, the SVM classifier achieves 94% ± 0.03 accuracy. A refinement of the original pattern space, removing dysphonia measurements with similar variation across healthy and PD subjects, allows achieving 97.03% ± 0.03 accuracy. The latter performance is higher than what is reported in the literature on the same dataset with the ten-fold cross-validation technique. Finally, it was found that measures of the ratio of noise to tonal components in the voice are the most suitable dysphonic symptoms for detecting PD subjects, as they achieve 99.64% ± 0.01 specificity. This finding is highly promising for understanding PD symptoms.
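
The validation protocol described here, ten-fold cross-validation of a classifier over dysphonia features, can be sketched in pure NumPy. The study used an SVM; as a stand-in, a nearest-centroid classifier and synthetic two-class "dysphonia" data are used below, so all numbers are illustrative only.

```python
import numpy as np

def ten_fold_cv(X, y, fit, predict, k=10, seed=0):
    """Generic k-fold cross-validation: returns per-fold accuracies."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    accs = []
    for f in folds:
        train = np.setdiff1d(idx, f)          # everything outside the fold
        model = fit(X[train], y[train])
        accs.append(np.mean(predict(model, X[f]) == y[f]))
    return np.array(accs)

def fit_centroid(X, y):
    """Stand-in classifier: one mean feature vector per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict_centroid(model, X):
    classes = np.array(sorted(model))
    D = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return classes[D.argmin(axis=0)]

# Synthetic healthy vs PD feature matrix (5 invented dysphonia features)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (60, 5)), rng.normal(2, 1, (60, 5))])
y = np.array([0] * 60 + [1] * 60)
scores = ten_fold_cv(X, y, fit_centroid, predict_centroid)
print(f"{scores.mean():.2f} ± {scores.std():.2f}")
```

Reporting the mean and spread of fold accuracies, as in the abstract's "88% ± 0.04" style figures, follows directly from the per-fold scores.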

  13. Ground-based measurements of ionospheric dynamics

    Science.gov (United States)

    Kouba, Daniel; Chum, Jaroslav

    2018-05-01

    Different methods are used to research and monitor the ionospheric dynamics using ground measurements: Digisonde Drift Measurements (DDM) and Continuous Doppler Sounding (CDS). For the first time, we present comparison between both methods on specific examples. Both methods provide information about the vertical drift velocity component. The DDM provides more information about the drift velocity vector and detected reflection points. However, the method is limited by the relatively low time resolution. In contrast, the strength of CDS is its high time resolution. The discussed methods can be used for real-time monitoring of medium scale travelling ionospheric disturbances. We conclude that it is advantageous to use both methods simultaneously if possible. The CDS is then applied for the disturbance detection and analysis, and the DDM is applied for the reflection height control.
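
The record gives no formulas, but the standard relation behind continuous Doppler sounding links the measured Doppler shift to the line-of-sight velocity of the reflecting layer. A minimal sketch, assuming mirror-like vertical-incidence reflection; the 3.59 MHz sounding frequency is just an example value:

```python
C = 299_792_458.0   # speed of light, m/s

def vertical_drift(doppler_shift_hz, sounding_freq_hz):
    """Line-of-sight velocity of the reflecting ionospheric layer from
    the Doppler shift of a continuous sounder: v = f_D * c / (2 * f0).
    The factor 2 accounts for the two-way propagation path."""
    return doppler_shift_hz * C / (2.0 * sounding_freq_hz)

# e.g. a 0.1 Hz shift measured on a 3.59 MHz sounding signal
v = vertical_drift(0.1, 3.59e6)   # ≈ 4.18 m/s
```

The sub-Hz shifts involved explain why CDS achieves the high time resolution the abstract emphasizes: the carrier phase is tracked continuously rather than sampled sounding-by-sounding.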

  14. Statistical Measures for Usage-Based Linguistics

    Science.gov (United States)

    Gries, Stefan Th.; Ellis, Nick C.

    2015-01-01

    The advent of usage-/exemplar-based approaches has resulted in a major change in the theoretical landscape of linguistics, but also in the range of methodologies that are brought to bear on the study of language acquisition/learning, structure, and use. In particular, methods from corpus linguistics are now frequently used to study distributional…

  15. Novel measurement-based indoor cellular radio system design

    OpenAIRE

    Aragón-Zavala, A

    2008-01-01

A scalable, measurement-based radio methodology has been created for the design, planning and optimisation of indoor cellular radio systems. The development of this measurement-based methodology was performed bearing in mind that measurements are often required to validate radio coverage in a building. Therefore, the concept of using carefully calibrated measurements to design and optimise a system is feasible, since these measurements can easily be obtained prior to system deployment ...

  16. Miniaturized diffraction based interferometric distance measurement sensor

    Science.gov (United States)

    Kim, Byungki

    In this thesis, new metrology hardware is designed, fabricated, and tested to provide improvements over current MEMS metrology. The metrology system is a micromachined scanning interferometer (muSI) with sub-nm resolution in a compact design. The proposed microinterferometer forms a phase-sensitive diffraction grating with interferometric sensitivity, while adding the capability of better lateral resolution by focusing the laser to a smaller spot size. A detailed diffraction model of the microinterferometer was developed to simulate device performance and to suggest the location of photodetectors for integrated optoelectronics. A particular device was fabricated on a fused silica substrate using aluminum to form the deformable diffraction grating fingers and AZ P4620 photoresist (PR) for the microlens. The details of the fabrication processes are presented. The structure also enables optoelectronics to be integrated, so that the interferometer with photodetectors can fit in an area of 1 mm x 1 mm. Scanning results using a fixed-grating muSI demonstrated that it could measure vibration profiles as well as the static vertical (less than a half wavelength) and lateral dimensions of MEMS. The muSI integrated with photodiodes demonstrated its operation by scanning a cMUT. PID control was tested and resulted in improved scanned images. The integrated muSI demonstrated that the deformable grating could be used to tune the measurement to keep the interferometer in quadrature for highest sensitivity.

  17. Development of microcontroller based water flow measurement

    Science.gov (United States)

    Munir, Muhammad Miftahul; Surachman, Arif; Fathonah, Indra Wahyudin; Billah, Muhammad Aziz; Khairurrijal, Mahfudz, Hernawan; Rimawan, Ririn; Lestari, Slamet

    2015-04-01

    A digital instrument for measuring water flow was developed using an AT89S52 microcontroller, a DS1302 real time clock (RTC), and an EEPROM for external memory. The sensor used for probing the current was a propeller that rotates when immersed in a water flow. For each rotation, the sensor sends one pulse, and the pulses are counted over a set time interval. The measurement data, i.e. the number of pulses per unit time, are converted into water flow velocity (m/s) through a mathematical formula. The microcontroller counts the pulses sent by the sensor, and the number of counted pulses is stored in the EEPROM memory. The time interval for counting is provided by the RTC and can be set by the operator. The instrument was tested under various time intervals ranging from 10 to 40 seconds and with several standard propellers owned by the Experimental Station for Hydraulic Structure and Geotechnics (BHGK), Research Institute for Water Resources (Pusair). Using the same propellers and water flows, it was shown that the water flow velocities obtained from the developed digital instrument and those found by the provided analog one are nearly identical.
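
The pulse-to-velocity conversion described above is typically a linear propeller calibration of the form v = a·n + b, with n the rotation rate. A minimal sketch; the calibration constants here are hypothetical placeholders, not values from the BHGK propellers:

```python
def flow_velocity(pulse_count, interval_s, a=0.25, b=0.02):
    """Convert a propeller pulse count over a counting interval to water
    flow velocity (m/s) using the linear calibration v = a*n + b, where
    n is rotations per second (1 pulse = 1 rotation).  The constants a
    (effective pitch, m) and b (friction offset, m/s) are hypothetical;
    real values come from calibrating each propeller individually."""
    if interval_s <= 0:
        raise ValueError("counting interval must be positive")
    n = pulse_count / interval_s   # rotations per second
    return a * n + b
```

For example, 40 pulses counted over a 10 s interval give n = 4 rev/s, hence v = 0.25·4 + 0.02 = 1.02 m/s under these placeholder constants.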

  18. Bridge continuous deformation measurement technology based on fiber optic gyro

    Science.gov (United States)

    Gan, Weibing; Hu, Wenbin; Liu, Fang; Tang, Jianguang; Li, Sheng; Yang, Yan

    2016-03-01

    Bridges are an important part of modern transportation systems, and deformation is a key index for a bridge's safety evaluation. To measure the curve of a long-span bridge rapidly and to locate the maximum deformation accurately and in a timely manner, a continuous deformation measurement system (CDMS) based on an inertial platform is presented and validated in this paper. Firstly, after a review of bridge deformation measurement methods, deformation measurement based on the fiber optic gyro (FOG) is introduced. Secondly, the basic FOG measurement principle is presented and the continuous curve trajectory is derived analytically. The measurement accuracy is then analyzed in theory and the relevant factors for ensuring it are presented. Finally, deformation measurement experiments are conducted on a bridge across the Yangtze River. Experimental results show that the presented deformation measurement method is feasible, practical, and reliable; the system can accurately and quickly locate the maximum deformation and has broad application prospects.
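
The curve derivation mentioned above can be illustrated by simple dead-reckoning: the gyro's angular rate is integrated to an inclination angle, and the inclination plus travel speed is integrated to a deflection curve along the track. A toy sketch, not the paper's actual formulation; all names are illustrative:

```python
import math

def trajectory(rates, speed, dt):
    """Dead-reckoning sketch of the FOG measurement principle: integrate
    angular rate (rad/s) to heading, then heading and travel speed
    (m/s) to a 2-D curve sampled every dt seconds."""
    theta, x, y = 0.0, 0.0, 0.0
    path = [(0.0, 0.0)]
    for w in rates:
        theta += w * dt                  # rate -> inclination angle
        x += speed * math.cos(theta) * dt
        y += speed * math.sin(theta) * dt
        path.append((x, y))
    return path
```

With zero angular rate the recovered curve is a straight line of length speed × time, as expected.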

  19. A Transdermal Measurement Platform Based on Microfluidics

    Directory of Open Access Journals (Sweden)

    Wen-Ying Huang

    2017-01-01

    Full Text Available The Franz diffusion cell is one of the most widely used devices to evaluate transdermal drug delivery. However, this static and nonflowing system has some limitations, such as a relatively large solution volume and skin area and the development of gas bubbles during sampling. To overcome these disadvantages, this study provides a proof of concept for miniaturizing models of transdermal delivery by using a microfluidic chip combined with a diffusion cell. The proposed diffusion microchip system requires only 80 μL of sample solution and provides flow circulation. Two model compounds, Coomassie Brilliant Blue G-250 and potassium ferricyanide, were successfully tested in transdermal delivery experiments. The diffusion rate is high for a high sample concentration or a large membrane pore size. The developed diffusion microchip system is feasible and can be applied to transdermal measurement in the future.

  20. Competency-Based Education: A Framework for Measuring Quality Courses

    Science.gov (United States)

    Krause, Jackie; Dias, Laura Portolese; Schedler, Chris

    2015-01-01

    The growth of competency-based education in an online environment requires the development and measurement of quality competency-based courses. While quality measures for online courses have been developed and standardized, they do not directly align with emerging best practices and principles in the design of quality competency-based online…

  1. Calibration Base Lines for Electronic Distance Measuring Instruments (EDMI)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A calibration base line (CBL) is a precisely measured, straight-line course of approximately 1,400 m used to calibrate Electronic Distance Measuring Instruments...

  2. Fourier transform based scalable image quality measure.

    Science.gov (United States)

    Narwaria, Manish; Lin, Weisi; McLoughlin, Ian; Emmanuel, Sabu; Chia, Liang-Tien

    2012-08-01

    We present a new image quality assessment (IQA) algorithm based on the phase and magnitude of the two-dimensional (2D) Discrete Fourier Transform (DFT). The basic idea is to compare the phase and magnitude of the reference and distorted images to compute the quality score. However, it is well known that the Human Visual System's (HVS) sensitivity to different frequency components is not the same. We accommodate this fact via a simple yet effective strategy of nonuniform binning of the frequency components. This process also leads to a reduced-space representation of the image, thereby enabling the reduced-reference (RR) prospects of the proposed scheme. We employ linear regression to integrate the effects of the changes in phase and magnitude; in this way, the required weights are determined via proper training and are hence more convincing and effective. Lastly, using the fact that phase usually conveys more information than magnitude, we use only the phase for RR quality assessment. This provides the crucial advantage of further reducing the required amount of reference image information, making the proposed method further scalable for RR scenarios. We report extensive experimental results using a total of 9 publicly available databases: 7 image databases (with a total of 3832 distorted images with diverse distortions) and 2 video databases (with a total of 228 distorted videos). These show that the proposed method is overall better than several of the existing full-reference (FR) algorithms and two RR algorithms. Additionally, there is a graceful degradation in prediction performance as the amount of reference image information is reduced, thereby confirming its scalability prospects. To enable comparisons and future study, a Matlab implementation of the proposed algorithm is available at http://www.ntu.edu.sg/home/wslin/reduced_phase.rar.
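
As a rough illustration of the binning idea (not the authors' exact algorithm; their bin edges, pooling, and trained regression weights are not reproduced here), one can group DFT coefficients into nonuniform radial frequency bins and compare per-bin phase statistics between reference and distorted images:

```python
import numpy as np

def radial_bins(shape, edges):
    """Assign each 2D DFT coefficient to a radial frequency bin.
    Nonuniform `edges` (normalized radius) give finer bins at low
    frequencies, where the HVS is most sensitive."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    return np.digitize(np.hypot(fy, fx), edges)

def phase_features(img, edges):
    """Reduced-reference feature vector: mean DFT phase per radial bin."""
    ph = np.angle(np.fft.fft2(img))
    bins = radial_bins(img.shape, edges)
    return np.array([ph[bins == b].mean() for b in range(len(edges) + 1)])

def rr_quality(ref, dist, edges=(0.05, 0.1, 0.2, 0.35)):
    """Toy RR quality score: negative mean absolute difference between
    binned phase features (0 = identical; more negative = worse).
    The bin edges are illustrative, not the paper's values."""
    d = phase_features(ref, edges) - phase_features(dist, edges)
    return -np.abs(d).mean()
```

Only the few binned features of the reference image need to be transmitted, which is what makes the scheme reduced-reference.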

  3. Highlights from the previous volumes

    Science.gov (United States)

    Iacovacci, Jacopo; Kohsokabe, Takahiro; Kaneko, Kunihiko; Lange, Steffen, et al.; Helden, Laurent, et al.; et al.

    2017-04-01

    Functional Multiplex PageRank: The centrality is a function; Pattern formation induced by fixed boundary condition; Power-law distributed Poincaré recurrences in higher-dimensional systems; Measurement of second-order response without perturbation

  4. A new method to measure necrotic core and calcium content in coronary plaques using intravascular ultrasound radiofrequency-based analysis

    NARCIS (Netherlands)

    E.S. Shin (Eun-Seok); H.M. Garcia-Garcia (Hector); P.W.J.C. Serruys (Patrick)

    2010-01-01

    Although previous intravascular ultrasound (IVUS) radiofrequency-based analysis data showed acceptable reproducibility for plaque composition, measurements are not easily obtained, particularly that of the lumen contour, because of the limited IVUS resolution. The purpose of this study was

  5. Triangulation-based edge measurement using polyview optics

    Science.gov (United States)

    Li, Yinan; Kästner, Markus; Reithmeier, Eduard

    2018-04-01

    Laser triangulation sensors are non-contact measurement devices widely used in industry and research for profile measurements and quantitative inspections. Some technical applications, e.g. edge measurements, usually require a configuration of a single sensor and a translation stage, or a configuration of multiple sensors, to cover a measurement range beyond the scope of a single sensor. However, the cost of both configurations is high, due to the additional rotational axis or additional sensor. This paper presents a special measurement system for the measurement of strongly curved surfaces based on a single-sensor configuration. Utilizing self-designed polyview optics and a calibration process, the proposed measurement system achieves a field of view (FOV) of over 180° with precise measurement accuracy as well as low cost. The capability of this measurement system is discussed in detail in this paper on the basis of experimental data.

  6. An Entropy-Based Measure for Assessing Fuzziness in Logistic Regression

    Science.gov (United States)

    Weiss, Brandi A.; Dardick, William

    2016-01-01

    This article introduces an entropy-based measure of data-model fit that can be used to assess the quality of logistic regression models. Entropy has previously been used in mixture-modeling to quantify how well individuals are classified into latent classes. The current study proposes the use of entropy for logistic regression models to quantify…
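
One common entropy-based formulation for binary models (the article's exact definition may differ) scores data-model fit as one minus the average binary entropy of the predicted probabilities, so that confident predictions near 0 or 1 yield values near 1 and maximally fuzzy predictions at 0.5 yield 0:

```python
import numpy as np

def entropy_fit(p, eps=1e-12):
    """Entropy-based fit for predicted probabilities of a binary model:
    E = 1 - mean(binary entropy) / ln(2).  Returns 1 when every
    probability is 0 or 1 (no fuzziness) and 0 when all are 0.5
    (maximal fuzziness).  A common formulation, offered here as an
    illustration rather than the article's exact measure."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    h = -(p * np.log(p) + (1 - p) * np.log(1 - p))  # binary entropy per case
    return 1.0 - h.mean() / np.log(2.0)
```

Applied to fitted logistic regression probabilities, low values flag models whose classifications are fuzzy even if overall accuracy looks acceptable.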

  7. History and measurement of the base and derived units

    CERN Document Server

    Treese, Steven A

    2018-01-01

    This book discusses how and why historical measurement units developed, and reviews useful methods for making conversions as well as situations in which dimensional analysis can be used. It starts from the history of length measurement, which is one of the oldest measures used by humans. It highlights the importance of area measurement, briefly discussing the methods for determining areas mathematically and by measurement. The book continues on to detail the development of measures for volume, mass, weight, time, temperature, angle, electrical units, amounts of substances, and light intensity. The seven SI/metric base units are highlighted, as well as a number of other units that have historically been used as base units. Providing a comprehensive reference for interconversion among the commonly measured quantities in the different measurement systems with engineering accuracy, it also examines the relationships among base units in fields such as mechanical/thermal, electromagnetic and physical flow rates and...

  8. The Reliability of Randomly Generated Math Curriculum-Based Measurements

    Science.gov (United States)

    Strait, Gerald G.; Smith, Bradley H.; Pender, Carolyn; Malone, Patrick S.; Roberts, Jarod; Hall, John D.

    2015-01-01

    "Curriculum-Based Measurement" (CBM) is a direct method of academic assessment used to screen and evaluate students' skills and monitor their responses to academic instruction and intervention. Interventioncentral.org offers a math worksheet generator at no cost that creates randomly generated "math curriculum-based measures"…

  9. Performance-Based Measurement: Action for Organizations and HPT Accountability

    Science.gov (United States)

    Larbi-Apau, Josephine A.; Moseley, James L.

    2010-01-01

    Basic measurements and applications of six selected general but critical operational performance-based indicators--effectiveness, efficiency, productivity, profitability, return on investment, and benefit-cost ratio--are presented. With each measurement, goals and potential impact are explored. Errors, risks, limitations to measurements, and a…

  10. Subcopula-based measure of asymmetric association for contingency tables.

    Science.gov (United States)

    Wei, Zheng; Kim, Daeyoung

    2017-10-30

    For the analysis of a two-way contingency table, a new asymmetric association measure is developed. The proposed method uses the subcopula-based regression between the discrete variables to measure the asymmetric predictive powers of the variables of interest. Unlike the existing measures of asymmetric association, the subcopula-based measure is insensitive to the number of categories in a variable, and thus, the magnitude of the proposed measure can be interpreted as the degree of asymmetric association in the contingency table. The theoretical properties of the proposed subcopula-based asymmetric association measure are investigated. We illustrate the performance and advantages of the proposed measure using simulation studies and real data examples. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Highlights from the previous volumes

    Science.gov (United States)

    Vergini, Eduardo G.; Pan, Y.; Vardi, R., et al.; Akkermans, Eric, et al.; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time; Superconductivity and magnetic order in the half-Heusler compound ErPdBi; An experimental evidence-based computational paradigm for new logic-gates in neuronal activity; Universality in the symmetric exclusion process and diffusive systems

  12. Associations of genetic risk scores based on adult adiposity pathways with childhood growth and adiposity measures

    OpenAIRE

    Monnereau, Claire; Vogelezang, Suzanne; Kruithof, Claudia J.; Jaddoe, Vincent W. V.; Felix, Janine F.

    2016-01-01

    Background Results from genome-wide association studies (GWAS) identified many loci and biological pathways that influence adult body mass index (BMI). We aimed to identify if biological pathways related to adult BMI also affect infant growth and childhood adiposity measures. Methods We used data from a population-based prospective cohort study among 3,975 children with a mean age of 6 years. Genetic risk scores were constructed based on the 97 SNPs associated with adult BMI previously identi...

  13. SEM based overlay measurement between resist and buried patterns

    Science.gov (United States)

    Inoue, Osamu; Okagawa, Yutaka; Hasumi, Kazuhisa; Shao, Chuanyu; Leray, Philippe; Lorusso, Gian; Baudemprez, Bart

    2016-03-01

    With the continuous shrink in pattern size and increased density, overlay control has become one of the most critical issues in semiconductor manufacturing. Recently, SEM-based overlay of AEI (After Etch Inspection) wafers has been used as a reference for the optimization of optical overlay (both Image Based Overlay (IBO) and Diffraction Based Overlay (DBO)). Overlay measurement at the AEI stage helps monitor and forecast the yield after etch and calibrate optical measurement tools; however, those overlay values are difficult to feed back directly to a scanner. Therefore, there is a clear need for SEM-based overlay measurements of ADI (After Develop Inspection) wafers to serve as a reference for optical overlay and to make the necessary corrections before wafers go to etch. Furthermore, to make the corrections as accurate as possible, actual device-like feature dimensions need to be measured post ADI. This device size measurement, which can be performed over a smaller area, is currently possible only with the CD-SEM. In this study, we assess SEM-based overlay measurement of ADI and AEI wafers using a sample from an N10 process flow. First, we demonstrate SEM-based overlay performance at AEI using a dual damascene process for the Via 0 (V0) and metal 1 (M1) layers. We also discuss overlay measurements between the litho-etch-litho stages of a triple-patterned M1 layer and a double-patterned V0. Second, to illustrate the complexities in image acquisition and measurement, we measure overlay between M1B resist and the buried M1A hard-mask trench. Finally, we show how a high accelerating voltage can detect buried pattern information via BSE (Back Scattered Electrons). We discuss the merits of this method versus standard optical metrology based corrections.

  14. Observational attachment theory-based parenting measures predict children's attachment narratives independently from social learning theory-based measures.

    Science.gov (United States)

    Matias, Carla; O'Connor, Thomas G; Futh, Annabel; Scott, Stephen

    2014-01-01

    Conceptually and methodologically distinct models exist for assessing the quality of parent-child relationships, but few studies contrast competing models or assess their overlap in predicting developmental outcomes. Using observational methodology, the current study examined the distinctiveness of attachment theory-based and social learning theory-based measures of parenting in predicting two key measures of child adjustment: security of attachment narratives and social acceptance in peer nominations. A total of 113 5-6-year-old children from ethnically diverse families participated. Parent-child relationships were rated using standard paradigms. Measures derived from attachment theory included sensitive responding and mutuality; measures derived from social learning theory included positive attending, directives, and criticism. Child outcomes were independently rated attachment narrative representations and peer nominations. Results indicated that attachment theory-based and social learning theory-based measures were modestly correlated; nonetheless, parent-child mutuality predicted secure child attachment narratives independently of social learning theory-based measures; in contrast, criticism predicted peer-nominated fighting independently of attachment theory-based measures. In young children, there is some evidence that attachment theory-based measures may be particularly predictive of attachment narratives; however, no single model of measuring parent-child relationships is likely to best predict multiple developmental outcomes. Assessment in research and applied settings may benefit from integration of different theoretical and methodological paradigms.

  15. A prospective evaluation of treatment with Selective Internal Radiation Therapy (SIR-spheres) in patients with unresectable liver metastases from colorectal cancer previously treated with 5-FU based chemotherapy

    International Nuclear Information System (INIS)

    Lim, L; Gibbs, P; Yip, D; Shapiro, JD; Dowling, R; Smith, D; Little, A; Bailey, W; Liechtenstein, M

    2005-01-01

    To prospectively evaluate the efficacy and safety of selective internal radiation (SIR) spheres in patients with inoperable liver metastases from colorectal cancer who have failed 5-FU based chemotherapy. Patients were prospectively enrolled at three Australian centres. All patients had previously received 5-FU based chemotherapy for metastatic colorectal cancer. Patients were ECOG 0–2 and had liver-dominant or liver-only disease. Concurrent 5-FU was given at investigator discretion. Thirty patients were treated between January 2002 and March 2004. As of July 2004 the median follow-up is 18.3 months. Median patient age was 61.7 years (range 36–77). Twenty-nine patients are evaluable for toxicity and response. There were 10 partial responses (33%), with a median duration of response of 8.3 months (range 2–18) and a median time to progression of 5.3 months. Response rates were lower (21%) and progression-free survival shorter (3.9 months) in patients who had received all standard chemotherapy options (n = 14). No responses were seen in patients with a poor performance status (n = 3) or extrahepatic disease (n = 6). Overall treatment-related toxicity was acceptable; however, significant late toxicity included 4 cases of gastric ulceration. In patients with metastatic colorectal cancer who have previously received 5-FU based chemotherapy, treatment with SIR-spheres has demonstrated encouraging activity. Further studies are required to better define the subsets of patients most likely to respond

  16. Fracture toughness measurements of WC-based hard metals

    International Nuclear Information System (INIS)

    Prakash, L.; Albert, B.

    1983-01-01

    The fracture toughness of WC-based cemented carbides was determined by different methods. The values obtained depend on the measurement procedure, so toughness values for hard metals obtained by different methods cannot be compared with each other directly. (orig.) [de

  17. Multivariate Methods Based Soft Measurement for Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Shen Yin

    2014-01-01

    a decision. However, since the physicochemical indexes of wine can to some extent reflect its quality, multivariate statistical methods-based soft measurement can help the oenologist in wine evaluation.

  18. Multi-beam synchronous measurement based on PSD phase detection using frequency-domain multiplexing

    Science.gov (United States)

    Duan, Ying; Qin, Lan; Xue, Lian; Xi, Feng; Mao, Jiubing

    2013-10-01

    According to the principle of centroid measurement, position-sensitive detectors (PSD) are commonly used for micro-displacement detection. However, single-beam detection cannot satisfy tasks such as multi-dimensional position measurement, three-dimensional vision reconstruction, and robot precision positioning, which require synchronous measurement of multiple light beams. Consequently, we designed a PSD phase detection method using frequency-domain multiplexing (FDM) for synchronous detection of multiple modulated light beams. Compared to the previous PSD amplitude detection method, the phase detection method using FDM has the advantages of a simplified measuring system, low cost, high resistance to light interference, and improved resolution. The feasibility of multi-beam synchronous measurement based on PSD phase detection using FDM was validated by multi-beam measurement experiments. The maximum non-linearity error of the multi-beam synchronous measurement is 6.62%.
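
The FDM idea can be sketched as lock-in demodulation of the two PSD electrode currents at each beam's carrier frequency, followed by the standard centroid formula per beam. A simplified simulation with idealized signals and illustrative names, not the authors' implementation:

```python
import numpy as np

def lockin(signal, f, fs):
    """Amplitude of the component of `signal` at carrier frequency f
    (Hz), recovered by synchronous (lock-in) demodulation at sample
    rate fs."""
    t = np.arange(len(signal)) / fs
    i = 2 / len(signal) * np.sum(signal * np.cos(2 * np.pi * f * t))
    q = 2 / len(signal) * np.sum(signal * np.sin(2 * np.pi * f * t))
    return np.hypot(i, q)

def beam_positions(i1, i2, freqs, fs, length=10.0):
    """Spot position of each beam on a 1-D PSD of active length
    `length` (mm), from the two electrode currents i1 and i2, with each
    beam intensity-modulated at its own carrier frequency (FDM).  The
    per-carrier amplitudes feed the usual PSD centroid formula."""
    pos = []
    for f in freqs:
        a1, a2 = lockin(i1, f, fs), lockin(i2, f, fs)
        pos.append((a2 - a1) / (a2 + a1) * length / 2)
    return pos
```

Choosing carriers with an integer number of cycles in the observation window keeps the demodulations orthogonal, so each beam's position is recovered without crosstalk.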

  19. Calibration of Smartphone-Based Weather Measurements Using Pairwise Gossip

    Directory of Open Access Journals (Sweden)

    Jane Louie Fresco Zamora

    2015-01-01

    Full Text Available Accurate and reliable daily global weather reports are necessary for weather forecasting and climate analysis. However, the availability of these reports continues to decline due to the lack of economic support and policies in maintaining ground weather measurement systems from where these reports are obtained. Thus, to mitigate data scarcity, it is required to utilize weather information from existing sensors and built-in smartphone sensors. However, as smartphone usage often varies according to human activity, it is difficult to obtain accurate measurement data. In this paper, we present a heuristic-based pairwise gossip algorithm that will calibrate smartphone-based pressure sensors with respect to fixed weather stations as our referential ground truth. Based on actual measurements, we have verified that smartphone-based readings are unstable when observed during movement. Using our calibration algorithm on actual smartphone-based pressure readings, the updated values were significantly closer to the ground truth values.

  20. Calibration of Smartphone-Based Weather Measurements Using Pairwise Gossip.

    Science.gov (United States)

    Zamora, Jane Louie Fresco; Kashihara, Shigeru; Yamaguchi, Suguru

    2015-01-01

    Accurate and reliable daily global weather reports are necessary for weather forecasting and climate analysis. However, the availability of these reports continues to decline due to the lack of economic support and policies in maintaining ground weather measurement systems from where these reports are obtained. Thus, to mitigate data scarcity, it is required to utilize weather information from existing sensors and built-in smartphone sensors. However, as smartphone usage often varies according to human activity, it is difficult to obtain accurate measurement data. In this paper, we present a heuristic-based pairwise gossip algorithm that will calibrate smartphone-based pressure sensors with respect to fixed weather stations as our referential ground truth. Based on actual measurements, we have verified that smartphone-based readings are unstable when observed during movement. Using our calibration algorithm on actual smartphone-based pressure readings, the updated values were significantly closer to the ground truth values.
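
The pairwise averaging at the heart of such gossip calibration can be sketched as follows. This is a minimal illustration of the idea, not the paper's heuristic; the anchor behaviour (nodes co-located with a station snapping back to the station-derived offset) is an assumption:

```python
import random

def gossip_calibrate(offsets, anchors, rounds=3000, seed=1):
    """Pairwise-gossip sketch: each node holds an estimated pressure
    offset; every round a random pair replaces both estimates with
    their average, while anchor nodes (keyed by index in `anchors`)
    reset to the station-derived ground-truth offset.  Repeated
    averaging pulls all estimates toward the anchors."""
    rng = random.Random(seed)
    est = list(offsets)
    for _ in range(rounds):
        i, j = rng.sample(range(len(est)), 2)
        est[i] = est[j] = (est[i] + est[j]) / 2   # pairwise average
        for k, truth in anchors.items():          # anchors snap back
            est[k] = truth
    return est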

  1. Measurement properties of performance-based measures to assess physical function in hip and knee osteoarthritis

    DEFF Research Database (Denmark)

    Dobson, F; Hinman, R S; Hall, M

    2012-01-01

    OBJECTIVES: To systematically review the measurement properties of performance-based measures to assess physical function in people with hip and/or knee osteoarthritis (OA). METHODS: Electronic searches were performed in MEDLINE, CINAHL, Embase, and PsycINFO up to the end of June 2012. Two… investigating measurement properties of performance measures, including responsiveness and interpretability in people with hip and/or knee OA, is needed. Consensus on which combination of measures will best assess physical function in people with hip and/or knee OA is urgently required…

  2. On-Line Voltage Stability Assessment based on PMU Measurements

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; P. Da Silva, Luiz C.; Nielsen, Arne Hejde

    2009-01-01

    This paper presents a method for on-line monitoring of the risk of voltage collapse based on synchronised phasor measurements. As there is no room for intensive computation and analysis in real time, the method is based on a combination of off-line computation and on-line monitoring, which are correlat...

  3. Dipole location using SQUID based measurements: Application to magnetocardiography

    Science.gov (United States)

    Mariyappa, N.; Parasakthi, C.; Sengottuvel, S.; Gireesan, K.; Patel, Rajesh; Janawadkar, M. P.; Sundar, C. S.; Radhakrishnan, T. S.

    2012-07-01

    We report a method of inferring the dipole location using iterative nonlinear least-squares optimization based on the Levenberg-Marquardt algorithm, wherein we use different sets of pseudo-random numbers as initial parameter values. The method has been applied to (i) simulated data representing the calculated magnetic field distribution produced by a point dipole placed at a known position, (ii) experimental data from SQUID-based measurements of the magnetic field distribution produced by a current-carrying source coil, and (iii) actual experimentally measured magnetocardiograms of human subjects using a SQUID-based system.
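
The multi-start Levenberg-Marquardt idea can be sketched with a simplified point-dipole forward model (field constants absorbed into the moment; this is neither the clinical MCG model nor the authors' code, and all parameter names are illustrative):

```python
import numpy as np

def bz(params, xs, ys):
    """z-component of a point dipole field at sensor positions (xs, ys)
    on the z = 0 plane.  params = (x0, y0, d, m): lateral position,
    depth and moment, with physical constants folded into m."""
    x0, y0, d, m = params
    r2 = (xs - x0)**2 + (ys - y0)**2 + d**2
    return m * (3 * d**2 - r2) / r2**2.5

def fit_dipole(xs, ys, data, n_starts=20, iters=60, seed=0):
    """Levenberg-Marquardt with a forward-difference Jacobian,
    restarted from several pseudo-random initial guesses; the fit with
    the lowest residual is returned."""
    rng = np.random.default_rng(seed)
    best, best_cost = None, np.inf
    for _ in range(n_starts):
        p = np.array([rng.uniform(-2, 2), rng.uniform(-2, 2),
                      rng.uniform(0.5, 3), rng.uniform(0.5, 2)])
        lam = 1e-3
        for _ in range(iters):
            r = bz(p, xs, ys) - data
            J = np.empty((len(r), 4))
            for k in range(4):                    # numeric Jacobian
                dp = np.zeros(4); dp[k] = 1e-6
                J[:, k] = (bz(p + dp, xs, ys) - bz(p, xs, ys)) / 1e-6
            A = J.T @ J + lam * np.eye(4)
            step = np.linalg.solve(A, -J.T @ r)
            if np.sum((bz(p + step, xs, ys) - data)**2) < np.sum(r**2):
                p, lam = p + step, lam * 0.5      # accept, relax damping
            else:
                lam *= 10                         # reject, damp harder
        cost = np.sum((bz(p, xs, ys) - data)**2)
        if cost < best_cost:
            best, best_cost = p, cost
    return best
```

The random restarts guard against local minima in the nonlinear cost surface, which is the role the pseudo-random initial parameter sets play in the abstract.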

  4. Volatility and correlation-based systemic risk measures in the US market

    Science.gov (United States)

    Civitarese, Jamil

    2016-10-01

    This paper deals with the problem of how to use simple systemic risk measures to assess portfolio risk characteristics. Using three simple measures taken from previous literature (one based on raw and partial correlations, one based on the eigenvalue decomposition of the covariance matrix, and one based on an eigenvalue entropy), a Granger-causation analysis revealed that some of them are not always good measures of risk in the S&P 500 and the VIX. The selected measures do not Granger-cause the VIX index in all windows selected; therefore, in the sense of risk as volatility, the indicators are not always suitable. Nevertheless, their results with respect to returns are similar to previous works that accept them. A deeper analysis showed that no symmetric measure based on the eigenvalue decomposition of correlation matrices is useful as a measure of "correlation" risk. The empirical counterpart of this proposition showed that negative correlations are usually small and therefore do not heavily distort the behavior of the indicator.
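
The eigenvalue-entropy indicator mentioned above can be sketched as the Shannon entropy of the normalized eigenvalue spectrum of the return correlation matrix. This is one plausible formulation for illustration, not necessarily the exact definition used in the paper:

```python
import numpy as np

def eigenvalue_entropy(returns):
    """Shannon entropy of the normalized eigenvalue spectrum of the
    correlation matrix of asset returns (columns = assets).  Low
    entropy means variance concentrated in few modes (strong
    co-movement, often read as elevated systemic risk); high entropy
    means risk is spread evenly across modes."""
    c = np.corrcoef(returns, rowvar=False)
    lam = np.clip(np.linalg.eigvalsh(c), 1e-12, None)  # guard zeros
    p = lam / lam.sum()                                # normalize spectrum
    return -np.sum(p * np.log(p))
```

For n assets the entropy ranges from near 0 (perfectly correlated market) up to log(n) (independent assets), so a rolling-window drop in entropy signals rising co-movement.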

  5. Variable screening and ranking using sampling-based sensitivity measures

    International Nuclear Information System (INIS)

    Wu, Y-T.; Mohanty, Sitakanta

    2006-01-01

    This paper presents a methodology for screening out insignificant random variables and ranking significant random variables using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from the test-of-hypothesis, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have large numbers of random variables but relatively few significant random variables
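
The screening step can be illustrated with a correlation-based stand-in for the paper's CDF- and mean-response-based measures: compute sensitivities from a single set of random samples, then classify each variable against a test-of-hypothesis acceptance limit for zero association:

```python
import numpy as np

def screen_variables(X, y, alpha_z=1.96):
    """Sampling-based screening sketch.  X is an (n, k) matrix of input
    samples, y the corresponding model responses.  Each input is ranked
    by |sample correlation| with the response, and flagged significant
    if it exceeds the approximate null acceptance limit z / sqrt(n).
    A simplified stand-in for the paper's sensitivity measures."""
    n, k = X.shape
    r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(k)])
    limit = alpha_z / np.sqrt(n)          # acceptance limit for rho = 0
    significant = np.abs(r) > limit
    ranking = np.argsort(-np.abs(r))      # most influential first
    return r, significant, ranking
```

Note the key property from the abstract: the sample size n, and hence the cost, does not grow with the number of variables k.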

  6. Local high precision 3D measurement based on line laser measuring instrument

    Science.gov (United States)

    Zhang, Renwei; Liu, Wei; Lu, Yongkang; Zhang, Yang; Ma, Jianwei; Jia, Zhenyuan

    2018-03-01

    In order to realize the precision machining and assembly of parts, the geometrical dimensions of local assembly surfaces must be strictly guaranteed. In this paper, a local high-precision three-dimensional measurement method based on a line laser measuring instrument is proposed to achieve highly accurate three-dimensional reconstruction of the surface. To address the two-dimensional line laser measuring instrument's lack of high-precision information in the third dimension, a local three-dimensional profile measuring system based on an accurate single-axis controller is proposed. First of all, a three-dimensional data compensation method based on a spatial multi-angle line laser measuring instrument is proposed to achieve high-precision measurement along the missing axis. Through pre-processing of the 3D point cloud, the measurement points can be restored accurately. Finally, local three-dimensional scanning measurements of a target spherical surface are performed for accuracy verification. The experimental results show that this scheme obtains local three-dimensional information of the target quickly and accurately, compensates the error in the laser scanner data, and improves the local measurement accuracy.

  7. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Validity and correctness testing of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision is taken as the virtual workpiece and a universal collision detection model is established. The whole workpiece measurement process is simulated with the VGMI in place of the GMI, and the measuring software is tested in the proposed virtual environment. Taking an involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master on a GMI. The experimental results indicate consistency between the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, including its measurement procedures. It is shown that the VGMI can be applied to the validation of measuring software, providing a new platform for testing complex-workpiece measuring software without calibrated artifacts.
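    A virtual workpiece made of triangular patches needs a basic probe/patch intersection primitive. The abstract does not specify the VGMI's actual collision model, so the sketch below assumes the classic Möller-Trumbore ray-triangle test as a plausible building block:

```python
def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle intersection.

    Returns the distance t along the ray to the hit point, or None on a miss.
    A primitive like this could form the core of a collision test between a
    simulated probe trajectory and a triangular-patch virtual workpiece."""
    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b): return (a[1] * b[2] - a[2] * b[1],
                             a[2] * b[0] - a[0] * b[2],
                             a[0] * b[1] - a[1] * b[0])
    def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:          # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det
    return t if t > eps else None

# Ray along +z hits the unit triangle in the z=0 plane one unit away.
t_hit = ray_triangle_intersect((0.2, 0.2, -1.0), (0, 0, 1),
                               (0, 0, 0), (1, 0, 0), (0, 1, 0))
t_miss = ray_triangle_intersect((2.0, 2.0, -1.0), (0, 0, 1),
                                (0, 0, 0), (1, 0, 0), (0, 1, 0))
```

    In a full simulation this test would be run against every patch (typically accelerated by a bounding-volume hierarchy) for each step of the simulated probe path.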

  8. Active load reduction using individual pitch, based on local blade flow measurements

    DEFF Research Database (Denmark)

    Larsen, Torben J.; Aagaard Madsen, H.; Thomsen, K.

    2005-01-01

    A new load-reducing control strategy for individual blade control of large pitch-controlled wind turbines is presented. This control concept is based on local blade inflow measurements and offers the possibility of larger load reductions, without loss of power production, than seen in other state-of-the-art load-reducing concepts. Since the new flow-based concept deviates significantly from previously published load-reducing strategies, a comparison of the performance based on aeroelastic simulations is included. Advantages and drawbacks of the systems are discussed. Copyright (C) 2004 John Wiley & Sons, Ltd.

  9. pH measurements of FET-based (bio)chemical sensors using portable measurement system.

    Science.gov (United States)

    Voitsekhivska, T; Zorgiebel, F; Suthau, E; Wolter, K-J; Bock, K; Cuniberti, G

    2015-01-01

    In this study we demonstrate the sensing capabilities of a portable multiplex measurement system for FET-based (bio)chemical sensors with an integrated microfluidic interface. We conducted pH measurements with silicon nanoribbon FET-based sensors using different measurement procedures suitable for various applications. We have shown multiplexed measurements in aqueous medium for three different modes that are respectively specialized in fast data acquisition (constant drain current), calibration-less sensing (constant gate voltage) and providing full information content (sweeping mode). Our system therefore allows surface charge sensing for a wide range of applications and is easily adaptable for multiplexed sensing with novel FET-based (bio)chemical sensors.

  10. Analysis of rocket flight stability based on optical image measurement

    Science.gov (United States)

    Cui, Shuhua; Liu, Junhu; Shen, Si; Wang, Min; Liu, Jun

    2018-02-01

    Based on the abundant optical image measurement data, this paper puts forward a method for evaluating rocket flight stability using measurements of the imaged characteristics of the carrier rocket. The attitude parameters of the rocket body in the coordinate system are calculated from the measurement data of multiple high-speed television cameras, and these parameters are then converted into the rocket body's angle of attack, from which it is assessed whether the rocket flies stably with a small angle of attack. The measurement method and the mathematical algorithm were validated through a data processing test, in which the rocket flight stability state can be observed intuitively and guidance system faults can be identified visually.

  11. Three-dimensional hindfoot alignment measurements based on biplanar radiographs: comparison with standard radiographic measurements

    International Nuclear Information System (INIS)

    Sutter, Reto; Pfirrmann, Christian W.A.; Buck, Florian M.; Espinosa, Norman

    2013-01-01

    To establish a hindfoot alignment measurement technique based on low-dose biplanar radiographs and compare it with hindfoot alignment measurements on long axial view radiographs, the current reference standard. Long axial view radiographs and low-dose biplanar radiographs of a phantom consisting of a human foot skeleton embedded in acrylic glass (phantom A) and of a plastic model of a human foot in three different hindfoot positions (phantoms B1-B3) were acquired in different foot positions (20° internal to 20° external rotation). Two independent readers measured hindfoot alignment on long axial view radiographs and performed 3D hindfoot alignment measurements based on biplanar radiographs on two different occasions. The time for the three-dimensional (3D) measurements was recorded. Intraclass correlation coefficients (ICC) were calculated. Hindfoot alignment measurements on long axial view radiographs were characterized by a large positional variation, with a range of 14°/13° valgus to 22°/27° varus (reader 1/2 for phantom A), whereas the range of 3D hindfoot alignment measurements was 7.3°/6.0° to 9.0°/10.5° varus (reader 1/2 for phantom A), with a mean and standard deviation of 8.1° ± 0.6°/8.7° ± 1.4°, respectively. Interobserver agreement was high (ICC = 0.926 for phantom A, and ICC = 0.886 for phantoms B1-B3), and agreement between different readouts was high (ICC = 0.895-0.995 for reader 1, and ICC = 0.987-0.994 for reader 2) for 3D measurements. The mean duration of the 3D measurements was 84 ± 15/113 ± 15 s for reader 1/2. Three-dimensional hindfoot alignment measurements based on biplanar radiographs were independent of foot positioning during image acquisition and of the reader. In this phantom study, the 3D measurements were substantially more precise than the standard radiographic measurements. (orig.)

  12. A New Laser Based Approach for Measuring Atmospheric Greenhouse Gases

    Directory of Open Access Journals (Sweden)

    Jeremy Dobler

    2013-11-01

    In 2012, we developed a proof-of-concept system for a new open-path laser absorption spectrometer concept for measuring atmospheric CO2. The measurement approach utilizes high-reliability all-fiber-based, continuous-wave laser technology, along with a unique all-digital lock-in amplifier method that, together, enable simultaneous transmission and reception of multiple fixed wavelengths of light. This new technique, which utilizes very little transmitted energy relative to conventional lidar systems, provides high signal-to-noise ratio (SNR) measurements, even in the presence of a large background signal. The proof-of-concept system, tested in both a laboratory environment and a limited number of field experiments over path lengths of 680 m and 1,600 m, demonstrated SNR values >1,000 for received signals of ~18 picowatts averaged over 60 s. An SNR of 1,000 is equivalent to a measurement precision of ±0.001, or ~0.4 ppmv. The measurement method is expected to provide new capability for automated monitoring of greenhouse gases at fixed sites, such as carbon sequestration facilities and volcanoes, for short- and long-term assessment of urban plumes, and for other similar applications. In addition, this concept enables active measurements of column amounts from geosynchronous orbit for a network of ground-based receivers/stations that would complement other current and planned space-based measurement capabilities.
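    The all-digital lock-in idea can be illustrated in a few lines: mix the sampled signal with quadrature references at the transmitted frequency and average, so a small tone is recovered from a much larger background. This is a generic digital lock-in sketch, not the instrument's actual processing chain; all names and values below are illustrative.

```python
import math
import random

def lockin_amplitude(signal, fs, f_ref):
    """All-digital lock-in: multiply by quadrature references at f_ref and
    average, which rejects background and noise at other frequencies and
    returns the amplitude of the f_ref component."""
    n = len(signal)
    x = sum(s * math.cos(2 * math.pi * f_ref * k / fs)
            for k, s in enumerate(signal)) / n
    y = sum(s * math.sin(2 * math.pi * f_ref * k / fs)
            for k, s in enumerate(signal)) / n
    return 2.0 * math.hypot(x, y)

# A 0.01-amplitude tone at 1 kHz buried under a large DC background plus noise.
rng = random.Random(0)
fs, f0, n = 48000, 1000, 48000
sig = [5.0 + 0.01 * math.sin(2 * math.pi * f0 * k / fs) + 0.1 * rng.gauss(0, 1)
       for k in range(n)]
amp = lockin_amplitude(sig, fs, f0)
```

    Averaging over an integer number of reference periods makes the large constant background cancel exactly, which is the property the abstract exploits when detecting picowatt-level signals against a bright background.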

  13. Measures of Competitive Intensity – Analysis Based on Literature Review

    Directory of Open Access Journals (Sweden)

    Dariusz Kwieciński

    2017-03-01

    Purpose: To systematize the existing approaches and tools used for measuring competitive intensity. Methodology: Systematic literature review along with critical literature review. Findings: Identification of two main approaches to measuring competition intensity: the first pertains to research based on experts' opinions and involves the use of questionnaires (primary sources), while the second is based on structural variables used with a variety of indexes (secondary sources). In addition, variables applied for the purpose of measuring the intensity of competition are divided into structural and behavioural. Research implications: Research implications are two-fold. Firstly, a distinction is made between various types of existing approaches to measuring competitive intensity. Secondly, research is carried out, inter alia, with regard to the actual object of certain measures, as opposed to their object stemming from commonly accepted definitions. Practical implications: The issue of measuring competition intensity occupies a prominent place in the discussion on the effectiveness of inter-organizational relationships. The findings outlined in this paper may help managers to develop/adopt the right approach supporting their strategic decisions. Originality: The paper provides a complex review of the existing methods and measures of competitive intensity. It systematizes recent knowledge about competitive intensity measurements.

  14. Empirical wind retrieval model based on SAR spectrum measurements

    Science.gov (United States)

    Panfilova, Maria; Karaev, Vladimir; Balandina, Galina; Kanevsky, Mikhail; Portabella, Marcos; Stoffelen, Ad

    ... ambiguity from polarimetric SAR. A criterion based on the sign of the complex correlation coefficient between the VV and VH signals is applied to select the wind direction. An additional quality control is applied to the wind speed value retrieved with the spectral method. Here, we use the direction obtained with the spectral method and the backscattered signal for a CMOD wind speed estimate. The algorithm described above may be refined using numerous SAR data and wind measurements. In the present preliminary work, the first results of processing SAR images combined with in situ data are presented. Our results are compared to those obtained using the previously developed models CMOD and C-2PO for VH polarization and statistical wind retrieval approaches [1]. Acknowledgments: this work is supported by the Russian Foundation for Basic Research (grant 13-05-00852-a). [1] M. Portabella, A. Stoffelen, J. A. Johannessen, Toward an optimal inversion method for synthetic aperture radar wind retrieval, Journal of Geophysical Research, V. 107, N C8, 2002.

  15. Biometric identification based on novel frequency domain facial asymmetry measures

    Science.gov (United States)

    Mitra, Sinjini; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-03-01

    In the modern world, the ever-growing need to ensure a system's security has spurred the growth of the newly emerging technology of biometric identification. The present paper introduces a novel set of facial biometrics based on quantified facial asymmetry measures in the frequency domain. In particular, we show that these biometrics work well for face images showing expression variations and have the potential to do so in the presence of illumination variations as well. A comparison of the recognition rates with those obtained from spatial domain asymmetry measures based on raw intensity values suggests that the frequency domain representation is more robust to intra-personal distortions and is a novel approach for performing biometric identification. In addition, some feature analysis based on statistical methods, comparing the asymmetry measures across different individuals and across different expressions, is presented.

  16. Phase Difference Measurement Method Based on Progressive Phase Shift

    Directory of Open Access Journals (Sweden)

    Min Zhang

    2018-06-01

    This paper proposes a method for phase difference measurement based on the principle of progressive phase shift (PPS). A phase difference measurement system based on PPS and implemented in an FPGA chip is proposed and tested. In the realized system, a fully programmable delay line (PDL) is constructed, which provides accurate and stable delay, benefiting from the feedback structure of the control module. The control module calibrates the delay against process, voltage and temperature (PVT) variations. Furthermore, a modified method based on double PPS is incorporated to improve the resolution; the obtained resolution is 25 ps. Moreover, the proposed method is implemented on the 20 nm Xilinx Kintex UltraScale platform, and test results indicate that the measurement error and clock synchronization error are within the range of ±5 ps.
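    The principle of progressive phase shift can be imitated in software: step a reference clock through a programmable delay in fixed increments and take the delay with the best agreement against the input as the phase-difference estimate. This is a behavioural sketch only, assuming idealized square waves; the paper's FPGA delay line, PVT calibration and double-PPS refinement are not modelled, and all names and values are illustrative.

```python
def square(t, period, phase_deg):
    """Ideal digital square wave: 1 during the first half of each
    (phase-shifted) period, 0 otherwise."""
    frac = (t / period - phase_deg / 360.0) % 1.0
    return 1 if frac < 0.5 else 0

def estimate_phase_difference(period, true_phase_deg, delay_step, n_steps,
                              n_samples=2000):
    """PPS-style estimate: progressively delay the reference in fixed steps
    and keep the delay that best aligns it with the input signal; that delay,
    converted to degrees, is the phase-difference estimate."""
    times = [k * period / n_samples for k in range(n_samples)]
    sig = [square(t, period, true_phase_deg) for t in times]
    best_delay, best_agreement = 0.0, -1.0
    for step in range(n_steps):
        delay = step * delay_step
        ref = [square(t - delay, period, 0.0) for t in times]
        agreement = sum(a == b for a, b in zip(sig, ref)) / n_samples
        if agreement > best_agreement:
            best_delay, best_agreement = delay, agreement
    return 360.0 * best_delay / period

# 1 MHz-style clock (period 1000 ns), 72 degree offset, 25 ns delay steps.
est = estimate_phase_difference(1000.0, 72.0, 25.0, 40)
```

    As in the hardware system, the resolution is set by the delay step (here 25 ns, i.e. 9 degrees at this period), which is why the paper refines the basic scheme with a double-PPS variant.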

  17. Modern gas-based temperature and pressure measurements

    CERN Document Server

    Pavese, Franco

    2013-01-01

    This 2nd edition of Modern Gas-Based Temperature and Pressure Measurements follows the first, published in 1992. It collects a much larger set of information, reference data and bibliography in the temperature and pressure metrology of gaseous substances, including the related physical-chemical issues. The book provides solutions to practical applications where gases are used in different thermodynamic conditions. Modern Gas-Based Temperature and Pressure Measurements, 2nd edition, is the only comprehensive survey of methods for pressure measurement in gaseous media in the medium-to-low pressure range closely connected with thermometry. It assembles current information on thermometry and manometry involving the use of gaseous substances that are likely to remain valid methods for the future. As such, it is an important resource for the researcher. This edition is updated through the very latest scientific and technical developments of gas-based temperature and pressure measurements...

  18. Generalized phase retrieval algorithm based on information measures

    OpenAIRE

    Shioya, Hiroyuki; Gohara, Kazutoshi

    2006-01-01

    An iterative phase retrieval algorithm based on the maximum entropy method (MEM) is presented. Introducing a new generalized information measure, we derive a novel class of algorithms which includes the conventionally used error reduction algorithm and a MEM-type iterative algorithm which is presented for the first time. These different phase retrieval methods are unified on the basis of the framework of information measures used in information theory.

  19. A web-based tool for ranking landslide mitigation measures

    Science.gov (United States)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand, "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether they address the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing the hazard and non-structural mitigation measures reducing either the hazard or the consequences (or the vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying the surface water regime (surface drainage); measures modifying the groundwater regime (deep drainage); measures modifying the mechanical characteristics of the unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to a competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, reducing either the hazard or the consequences, include: early warning systems; restricting or discouraging construction activities; increasing the resistance or coping capacity of elements at risk; relocation of elements at risk; and sharing of risk through insurance. The measures are described in the toolbox with fact sheets providing a ...

  20. Predictive Software Measures based on Z Specifications - A Case Study

    Directory of Open Access Journals (Sweden)

    Andreas Bollin

    2012-07-01

    Estimating the effort and quality of a system is a critical step at the beginning of every software project. It is necessary to have reliable ways of calculating these measures, and it is even better when the calculation can be done as early as possible in the development life-cycle. With this in mind, metrics for formal specifications are examined with a view to their correlation with complexity- and quality-based code measures. A case study, based on a Z specification and its implementation in Ada, analyzes the practicability of these metrics as predictors.

  1. Portable audio electronics for impedance-based measurements in microfluidics

    International Nuclear Information System (INIS)

    Wood, Paul; Sinton, David

    2010-01-01

    We demonstrate the use of audio electronics-based signals to perform on-chip electrochemical measurements. Cell phones and portable music players are examples of consumer electronics that are easily operated and are ubiquitous worldwide. Audio output (play) and input (record) signals are voltage based and contain frequency and amplitude information. A cell phone, laptop soundcard and two compact audio players are compared with respect to frequency response; the laptop soundcard provides the most uniform frequency response, while the cell phone performance is found to be insufficient. The audio signals in the common portable music players and laptop soundcard operate in the range of 20 Hz to 20 kHz and are found to be applicable, as voltage input and output signals, to impedance-based electrochemical measurements in microfluidic systems. Validated impedance-based measurements of concentration (0.1–50 mM), flow rate (2–120 µL min⁻¹) and particle detection (32 µm diameter) are demonstrated. The prevailing, lossless, wave audio file format is found to be suitable for data transmission to and from external sources, such as a centralized lab, and the cost of all hardware (in addition to audio devices) is ~10 USD. The utility demonstrated here, in combination with the ubiquitous nature of portable audio electronics, presents new opportunities for impedance-based measurements in portable microfluidic systems. (technical note)

  2. Principal Component Analysis Based Measure of Structural Holes

    Science.gov (United States)

    Deng, Shiguo; Zhang, Wenqing; Yang, Huijie

    2013-02-01

    Based upon principal component analysis, a new measure called the compressibility coefficient is proposed to evaluate structural holes in networks. This measure incorporates a new effect from identical patterns in networks. It is found that the compressibility coefficient for Watts-Strogatz small-world networks increases monotonically with the rewiring probability and saturates to that for the corresponding shuffled networks, while the compressibility coefficient for extended Barabasi-Albert scale-free networks decreases monotonically with the preferential effect and is significantly larger than that for the corresponding shuffled networks. This measure is helpful in diverse research fields for evaluating the global efficiency of networks.
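    The abstract does not give the exact definition of the compressibility coefficient, so the sketch below only illustrates the PCA idea with an assumed variant: the share of total variance captured by the leading principal component of the centered adjacency rows, which is larger when many rows share identical patterns. A star graph (identical leaf rows) then compresses more than a ring.

```python
import random

def compressibility(adj, iters=500, seed=0):
    """Illustrative 'compressibility' of a network: the variance share of the
    leading principal component of its centered adjacency rows (power
    iteration on the covariance matrix; pure-stdlib PCA)."""
    n = len(adj)
    means = [sum(adj[i][j] for i in range(n)) / n for j in range(n)]
    rows = [[adj[i][j] - means[j] for j in range(n)] for i in range(n)]
    # Covariance matrix C = X^T X / n of the centered rows.
    cov = [[sum(rows[k][i] * rows[k][j] for k in range(n)) / n
            for j in range(n)] for i in range(n)]
    total = sum(cov[i][i] for i in range(n))
    rng = random.Random(seed)
    v = [rng.random() + 0.1 for _ in range(n)]
    for _ in range(iters):               # power iteration -> top eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        if norm == 0.0:
            return 0.0
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(cov[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam / total if total else 1.0

# A 5-node star (hub + 4 identical leaves) versus a 5-node ring.
star = [[0, 1, 1, 1, 1]] + [[1, 0, 0, 0, 0] for _ in range(4)]
ring = [[1 if (i - j) % 5 in (1, 4) else 0 for j in range(5)] for i in range(5)]
c_star = compressibility(star)
c_ring = compressibility(ring)
```

    The star's centered rows are all multiples of one vector, so a single component explains everything, consistent with the abstract's observation that identical patterns raise compressibility.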

  3. Patch near field acoustic holography based on particle velocity measurements

    DEFF Research Database (Denmark)

    Zhang, Yong-Bin; Jacobsen, Finn; Bi, Chuan-Xing

    2009-01-01

    Patch near field acoustic holography (PNAH) based on sound pressure measurements makes it possible to reconstruct the source field near a source by measuring the sound pressure at positions on a surface that is comparable in size to the source region of concern. Particle velocity is an alternative input quantity; this paper examines the use of particle velocity as the input of PNAH. Because the particle velocity decays faster toward the edges of the measurement aperture than the pressure does, and because the wave number ratio that enters into the inverse propagator from pressure to velocity amplifies high spatial frequencies ...

  4. Confidence bounds of recurrence-based complexity measures

    International Nuclear Information System (INIS)

    Schinkel, Stefan; Marwan, N.; Dimigen, O.; Kurths, J.

    2009-01-01

    In the recent past, recurrence quantification analysis (RQA) has attracted increasing interest in various research areas. The complexity measures that RQA provides have been useful in describing and analysing a broad range of data, and RQA is known to be rather robust to noise and nonstationarities. Yet one key question in empirical research concerns the confidence bounds of measured data. In the present Letter we suggest a method for estimating the confidence bounds of recurrence-based complexity measures. We study the applicability of the suggested method with model and real-life data.
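    As a simple illustration of confidence bounds for a recurrence-based measure, one can bootstrap the series and take empirical quantiles of the recomputed statistic. The Letter's actual method operates on recurrence structures directly, so this plain bootstrap on the recurrence rate is only a stand-in under that stated assumption.

```python
import random

def recurrence_rate(series, eps):
    """Recurrence rate: fraction of point pairs closer than eps, the simplest
    recurrence-based complexity measure."""
    n = len(series)
    hits = sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(series[i] - series[j]) < eps)
    return 2.0 * hits / (n * (n - 1))

def bootstrap_bounds(series, eps, n_boot=200, level=0.95, seed=0):
    """Confidence bounds by resampling the series with replacement and taking
    empirical quantiles of the recomputed measure."""
    rng = random.Random(seed)
    n = len(series)
    stats = sorted(
        recurrence_rate([series[rng.randrange(n)] for _ in range(n)], eps)
        for _ in range(n_boot))
    return (stats[int(n_boot * (1 - level) / 2)],
            stats[int(n_boot * (1 + level) / 2) - 1])

rng = random.Random(42)
data = [rng.random() for _ in range(100)]          # toy noise series
rr = recurrence_rate(data, eps=0.1)
lo, hi = bootstrap_bounds(data, eps=0.1)
```

    Resampling with replacement duplicates some points (which always count as recurrences), so this naive bootstrap slightly inflates the statistic; structure-aware resampling avoids that bias.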

  5. Bread Water Content Measurement Based on Hyperspectral Imaging

    DEFF Research Database (Denmark)

    Liu, Zhi; Møller, Flemming

    2011-01-01

    Water content is one of the most important properties of bread for taste assessment or storage monitoring. Traditional bread water content measurement methods are mostly manual, destructive and time consuming. This paper proposes an automated water content measurement for bread quality based on near-infrared hyperspectral imaging, evaluated against the conventional manual loss-in-weight method. For this purpose, hyperspectral component unmixing is used to measure the water content quantitatively, and a definition of a bread water content index is presented ...

  6. Tethered balloon-based measurements of meteorological variables and aerosols

    Science.gov (United States)

    Sentell, R. J.; Storey, R. W.; Chang, J. J. C.; Jacobsen, S. J.

    1976-01-01

    Tethered-balloon-based measurements of the vertical distributions of temperature, humidity, wind speed, and aerosol concentrations were taken over a 4-hour period beginning at sunrise on June 29, 1976, at Wallops Island, Virginia. Twelve consecutive profiles of each variable were obtained from the ground to about 500 meters. These measurements were made in conjunction with a noise propagation study on the remotely arrayed acoustic range (ROMAAR) at Wallops Flight Center. An organized listing of these vertical soundings is presented, and the tethered balloon system configuration used for these measurements is described.

  7. A complex network-based importance measure for mechatronics systems

    Science.gov (United States)

    Wang, Yanhui; Bi, Lifeng; Lin, Shuai; Li, Man; Shi, Hao

    2017-01-01

    In view of the negative impact of functional dependency, this paper provides an alternative importance measure, called Improved-PageRank (IPR), for measuring the importance of components in mechatronic systems. IPR is a meaningful extension of the centrality measures of complex networks that considers the usage reliability of components and the functional dependency between components to make the importance measure more useful. Our work makes two important contributions. First, the paper integrates the literature on mechatronic architecture and complex network theory to define a component network. Second, based on the notion of the component network, a meaningful IPR is introduced for identifying important components. In addition, the IPR component importance measure, together with an algorithm that performs stochastic ordering of components to account for the time-varying nature of component usage reliability and of functional dependency between components, is illustrated with a component network of a bogie system consisting of 27 components.

  8. Drone based measurement system for radiofrequency exposure assessment.

    Science.gov (United States)

    Joseph, Wout; Aerts, Sam; Vandenbossche, Matthias; Thielens, Arno; Martens, Luc

    2016-03-10

    For the first time, a method to assess radiofrequency (RF) electromagnetic field (EMF) exposure of the general public in real environments with a true free-space antenna system is presented. Using lightweight electronics and multiple antennas placed on a drone, it is possible to perform exposure measurements. This technique will enable researchers to measure three-dimensional RF-EMF exposure patterns accurately in the future and at locations currently difficult to access. A measurement procedure and appropriate measurement settings have been developed. As an application, outdoor measurements are performed as a function of height up to 60 m for Global System for Mobile Communications (GSM) 900 MHz base station exposure. Bioelectromagnetics. © 2016 Wiley Periodicals, Inc.

  9. Forecasting method in multilateration accuracy based on laser tracker measurement

    International Nuclear Information System (INIS)

    Aguado, Sergio; Santolaria, Jorge; Samper, David; José Aguilar, Juan

    2017-01-01

    Multilateration based on a laser tracker (LT) requires the measurement of a set of points from three or more positions. Although the LTs' angular information is not used, multilateration produces a volume of measurement uncertainty. This paper presents two new coefficients from which to determine, before performing the necessary measurements, whether the measurement of a set of points will improve or worsen the accuracy of the multilateration results, avoiding unnecessary measurement and reducing the time and economic cost required. The first, the measurement coefficient MCLT, is unique to each laser tracker: it determines the relationship between the radial and angular measurement noise of the laser tracker. The second coefficient, β, is related to the specific measurement conditions: the spatial angle α between the laser tracker positions and its effect on error reduction. Both parameters, MCLT and β, are linked in the error reduction limits. Besides these, a new methodology is presented to determine the multilateration error-reduction limit for an ideal laser tracker distribution and for a random one. It provides general rules and advice derived from synthetic tests, validated through a real test carried out on a coordinate measuring machine. (paper)
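    The multilateration step itself, position from LT ranges only with no angular information, can be sketched as follows: with four stations, differencing the squared-range equations against station 0 leaves an exact 3x3 linear system in the coordinates. The station layout and units below are hypothetical.

```python
import math

def multilaterate(stations, distances):
    """Position of a point from its measured distances to four known stations.

    From |p_i - x|^2 = d_i^2, subtracting the station-0 equation eliminates
    |x|^2 and gives 2 (p_i - p_0) . x = (|p_i|^2 - d_i^2) - (|p_0|^2 - d_0^2),
    a linear system solved here by Gaussian elimination with partial pivoting."""
    (x0, y0, z0), d0 = stations[0], distances[0]
    k0 = x0 * x0 + y0 * y0 + z0 * z0 - d0 * d0
    a, b = [], []
    for (x, y, z), d in zip(stations[1:], distances[1:]):
        a.append([2 * (x - x0), 2 * (y - y0), 2 * (z - z0)])
        b.append(x * x + y * y + z * z - d * d - k0)
    for col in range(3):                      # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = a[r][col] / a[col][col]
            for c in range(col, 3):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    pos = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                       # back substitution
        pos[r] = (b[r] - sum(a[r][c] * pos[c] for c in range(r + 1, 3))) / a[r][r]
    return tuple(pos)

# Four hypothetical station positions (metres) and exact ranges to (2, 3, 4).
stations = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
target = (2.0, 3.0, 4.0)
ranges = [math.dist(s, target) for s in stations]
est = multilaterate(stations, ranges)
```

    With noisy ranges and more than four stations the same linearized system is solved in a least-squares sense, and the geometry of the station positions (the angle β discusses above) governs how range noise maps into position uncertainty.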

  10. Measurement channel of neutron flow based on software

    International Nuclear Information System (INIS)

    Rivero G, T.; Benitez R, J. S.

    2008-01-01

    The measurement of thermal power in nuclear reactors is based mainly on the measurement of the neutron flux. The neutrons present in the reactor core are those released by the fission of uranium-235; once moderated, these neutrons are precursors of new fissions, a process known as the chain reaction. Thus, the power at which a nuclear reactor operates is proportional to the number of fissions produced and, since these depend on the released neutrons, also to the number of neutrons present. The thermal power of a reactor is measured with instruments called nuclear channels. At low power (source level), these channels count individual detected neutrons, whereas at medium and high power they measure the electrical current, or its fluctuations, generated by fission neutrons in ionization chambers specially designed to detect neutrons. In the case of TRIGA reactors, the neutron flux measurement channels have used discrete digital electronics for some decades. Recently, new technological tools have emerged that allow new versions of nuclear channels to be developed in a simple and compact form. The present work consists of the development of a nuclear channel for TRIGA reactors based on the correlated signal of a wide-range fission chamber. This new measurement channel uses a high-speed data acquisition card and software-based data processing; installed in a computer, it creates a virtual instrument that displays in real time, in a graphical form understandable to the operator, the power at which the nuclear reactor operates. Being software-based, the system offers greater versatility for changing the signal processing and power monitoring algorithms. Experimental tests of neutron power measurement show reliable performance over seven decades of power, with a ...
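    One software-friendly way to cover the medium/high-power range from a fission chamber signal is Campbelling (fluctuation) mode, where the variance of the sampled signal scales linearly with the fission rate and hence with power. The abstract does not state that this is the channel's exact processing, so the sketch below is an assumption, simulating pulse counts per sampling window as Poisson events.

```python
import math
import random
import statistics

def poisson(rng, lam):
    """Knuth's algorithm for Poisson-distributed counts (fine for modest lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def campbell_variance(rng, rate, n_windows=20000):
    """Variance of per-window pulse counts.  In Campbelling mode the variance
    grows linearly with the fission rate, so a calibrated variance reading
    tracks reactor power without resolving individual pulses."""
    return statistics.variance(poisson(rng, rate) for _ in range(n_windows))

# Two hypothetical power levels a factor of 10 apart.
rng = random.Random(7)
var_low = campbell_variance(rng, rate=10.0)
var_high = campbell_variance(rng, rate=100.0)
power_ratio = var_high / var_low
```

    A software channel of the kind described can switch between counting, Campbelling and current readings purely by changing the processing applied to the same acquired samples, which is the versatility argument made in the abstract.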

  11. Integrated method for the measurement of trace nitrogenous atmospheric bases

    Directory of Open Access Journals (Sweden)

    D. Key

    2011-12-01

    Nitrogenous atmospheric bases are thought to play a key role in the global nitrogen cycle, but their sources, transport, and sinks remain poorly understood. Of the many methods available to measure such compounds in ambient air, few are applicable to the complete range of potential analytes, and fewer still are convenient to implement using instrumentation that is standard to most laboratories. In this work, an integrated approach to measuring trace, atmospheric, gaseous nitrogenous bases has been developed and validated. The method uses a simple acid scrubbing step to capture and concentrate the bases as their phosphite salts, which are then derivatized and analyzed using GC/MS and/or LC/MS. The advantages of both techniques in the context of the present measurements are discussed. The approach is sensitive, selective, reproducible, and convenient to implement, and it has been validated for different sampling strategies. The limits of detection for the families of tested compounds are suitable for ambient measurement applications (e.g., methylamine, 1 pptv; ethylamine, 2 pptv; morpholine, 1 pptv; aniline, 1 pptv; hydrazine, 0.1 pptv; methylhydrazine, 2 pptv), as supported by field measurements in an urban park and in the exhaust of on-road vehicles.

  12. Resource management in Diffserv measurement-based admission control PHR

    NARCIS (Netherlands)

    Westberg, L.; Heijenk, Geert; Karagiannis, Georgios; Oosthoek, S.; Partain, D.; Rexhepi, Vlora; Szabo, R.; Wallentin, P.; El Allali, H.

    2002-01-01

    The purpose of this draft is to present the Resource Management in Diffserv (RMD) Measurement-Based Admission Control (RIMA) Per Hop Reservation (PHR) protocol. The RIMA PHR protocol is used on a per-hop basis in a Differentiated Services (Diffserv) domain and extends the Diffserv Per Hop Behavior

  13. Assessing Children's Writing Products: The Role of Curriculum Based Measures

    Science.gov (United States)

    Dockrell, Julie E.; Connelly, Vincent; Walter, Kirsty; Critten, Sarah

    2015-01-01

The assessment of children's writing raises technical and practical challenges. In this paper we examine the potential use of a curriculum-based measure for writing (CBM-W) to assess the written texts of pupils in Key Stage 2 (M age 107 months, range 88 to 125). Two hundred and thirty-six Year 3, 5 and 6 pupils completed a standardized…

  14. Hydrogel-based sensor for CO2 measurements

    NARCIS (Netherlands)

    Herber, S.; Olthuis, Wouter; Bergveld, Piet; van den Berg, Albert

    2004-01-01

A hydrogel-based sensor is presented for CO2 measurements. The sensor consists of a pressure sensor and a porous silicon cover, with a pH-sensitive hydrogel confined between the two parts. Furthermore, the porous cover contains a bicarbonate solution and a gas-permeable membrane. CO2 reacts with the

  15. Functional Size Measurement applied to UML-based user requirements

    NARCIS (Netherlands)

    van den Berg, Klaas; Dekkers, Ton; Oudshoorn, Rogier; Dekkers, T.

    There is a growing interest in applying standardized methods for Functional Size Measurement (FSM) to Functional User Requirements (FUR) based on models in the Unified Modelling Language (UML). No consensus exists on this issue. We analyzed the demands that FSM places on FURs. We propose a

  16. Noninvasive microbubble-based pressure measurements: a simulation study

    NARCIS (Netherlands)

    Postema, Michiel; Postema, M.A.B.; Bouakaz, Ayache; de Jong, N.

    2004-01-01

    This paper describes a noninvasive method to measure local hydrostatic pressures in fluid filled cavities. The method is based on the disappearance time of a gas bubble, as the disappearance time is related to the hydrostatic pressure. When a bubble shrinks, its response to ultrasound changes. From

  17. Metrology of human-based and other qualitative measurements

    Science.gov (United States)

    Pendrill, Leslie; Petersson, Niclas

    2016-09-01

The metrology of human-based and other qualitative measurements is in its infancy—concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors, such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if relations between distances on different portions of scales are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory with the perceptive identification and choice paradigms of psychophysics. The Rasch invariant measure psychometric GLM approach in particular enables a proper treatment of ordinal data; a clear separation of probe and item attribute estimates; simple expressions for instrument sensitivity; etc. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities of establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for a knowledge challenge, for example a particular concept in understanding physics, or for the product quality of a certain health care service. Multivariate methods, such as Principal Component

  18. Metrology of human-based and other qualitative measurements

    International Nuclear Information System (INIS)

    Pendrill, Leslie; Petersson, Niclas

    2016-01-01

The metrology of human-based and other qualitative measurements is in its infancy—concepts such as traceability and uncertainty are as yet poorly developed. This paper reviews how a measurement system analysis approach, particularly invoking as performance metric the ability of a probe (such as a human being) acting as a measurement instrument to make a successful decision, can enable a more general metrological treatment of qualitative observations. Measures based on human observations are typically qualitative, not only in sectors, such as health care, services and safety, where the human factor is obvious, but also in customer perception of traditional products of all kinds. A principal challenge is that the usual tools of statistics normally employed for expressing measurement accuracy and uncertainty will probably not work reliably if relations between distances on different portions of scales are not fully known, as is typical of ordinal or other qualitative measurements. A key enabling insight is to connect the treatment of decision risks associated with measurement uncertainty to generalized linear modelling (GLM). Handling qualitative observations in this way unites information theory with the perceptive identification and choice paradigms of psychophysics. The Rasch invariant measure psychometric GLM approach in particular enables a proper treatment of ordinal data; a clear separation of probe and item attribute estimates; simple expressions for instrument sensitivity; etc. Examples include two aspects of the care of breast cancer patients, from diagnosis to rehabilitation. The Rasch approach leads in turn to opportunities of establishing metrological references for quality assurance of qualitative measurements. In psychometrics, one could imagine a certified reference for a knowledge challenge, for example a particular concept in understanding physics, or for the product quality of a certain health care service. Multivariate methods, such as Principal Component

  19. Depth Measurement Based on Infrared Coded Structured Light

    Directory of Open Access Journals (Sweden)

    Tong Jia

    2014-01-01

Full Text Available Depth measurement is a challenging problem in computer vision research. In this study, we first design a new grid pattern and develop a sequence coding and decoding algorithm to process the pattern. Second, we propose a linear fitting algorithm to derive the linear relationship between the object depth and pixel shift. Third, we obtain depth information on an object based on this linear relationship. Moreover, 3D reconstruction is implemented based on the Delaunay triangulation algorithm. Finally, we utilize the regularity of the error curves to correct the system errors and improve the measurement accuracy. The experimental results show that the accuracy of depth measurement is related to the step length of the moving object.
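The linear fitting step described above can be sketched with ordinary least squares; the calibration pairs below are illustrative, not taken from the paper:

```python
import numpy as np

# Hypothetical calibration pairs: measured pixel shift (px) at known depths (mm).
pixel_shift = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
depth_mm    = np.array([1000.0, 800.0, 600.0, 400.0, 200.0])

# Fit the linear relationship depth = a * shift + b by least squares.
a, b = np.polyfit(pixel_shift, depth_mm, 1)

def depth_from_shift(shift_px):
    """Depth predicted from a measured pixel shift via the fitted line."""
    return a * shift_px + b

print(round(depth_from_shift(25.0)))  # 700, the midpoint of this exactly linear calibration
```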

  20. Visual Peoplemeter: A Vision-based Television Audience Measurement System

    Directory of Open Access Journals (Sweden)

    SKELIN, A. K.

    2014-11-01

Full Text Available The visual peoplemeter is a vision-based measurement system that objectively evaluates attentive behavior for TV audience rating, thus offering a solution to some of the drawbacks of current manual-logging peoplemeters. In this paper, some limitations of current audience measurement systems are reviewed and a novel vision-based system aiming at passive metering of viewers is prototyped. The system uses a camera mounted on a television as a sensing modality and applies advanced computer vision algorithms to detect and track a person, and to recognize attentional states. The feasibility of the system is evaluated on a secondary dataset. The results show that the proposed system can analyze a viewer's attentive behavior, therefore enabling passive estimates of relevant audience measurement categories.

  1. Observer-based Coal Mill Control using Oxygen Measurements

    DEFF Research Database (Denmark)

    Andersen, Palle; Bendtsen, Jan Dimon; S., Tom

    2006-01-01

This paper proposes a novel approach to coal flow estimation in pulverized coal mills, which utilizes measurements of oxygen content in the flue gas. Pulverized coal mills are typically not equipped with sensors that detect the amount of coal injected into the furnace. This makes control of the coal flow difficult, causing stability problems and limiting the plant's load-following capabilities. To alleviate this problem without having to rely on expensive flow measurement equipment, a novel observer-based approach is investigated. A Kalman filter based on measurements of the combustion air flow led into the furnace and the oxygen concentration in the flue gas is designed to estimate the actual coal flow injected into the furnace. With this estimate, it becomes possible to close an inner loop around the coal mill itself, thus giving a better disturbance rejection capability. The approach is validated against...
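A minimal scalar sketch of the observer idea, assuming a random-walk model for the coal flow and treating an oxygen-derived flow value as the Kalman filter's measurement (all noise values and numbers are illustrative, not from the paper):

```python
import numpy as np

def kalman_step(x_est, P, z, q=0.01, r=0.5):
    """One predict/update step of a scalar Kalman filter.

    x_est : previous coal-flow estimate (kg/s)
    P     : estimate variance
    z     : coal flow inferred from air flow and flue-gas O2 (measurement)
    q, r  : process and measurement noise variances (illustrative values)
    """
    P = P + q                      # predict: random-walk model for the flow
    K = P / (P + r)                # Kalman gain
    x_est = x_est + K * (z - x_est)  # update with the oxygen-based measurement
    P = (1.0 - K) * P
    return x_est, P

rng = np.random.default_rng(0)
true_flow = 8.0                    # kg/s, held constant in this toy run
x, P = 0.0, 1.0                    # deliberately poor initial guess
for z in true_flow + 0.3 * rng.standard_normal(200):
    x, P = kalman_step(x, P, z)
print(abs(x - true_flow) < 0.5)    # estimate converges near the true flow
```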

  2. Microcontroller Power Consumption Measurement Based on PSoC

    Directory of Open Access Journals (Sweden)

    S. P. Janković

    2016-06-01

Full Text Available Microcontrollers are often used as central processing elements in embedded systems. Because of the different sleep and performance modes that microcontrollers support, their power consumption may have a high dynamic range, over 100 dB. In this paper, a data acquisition (DAQ) system for measuring and analyzing the power consumption of microcontrollers is presented. The DAQ system consists of a current measurement circuit using the potentiostat technique, a DAQ device based on the PSoC 5LP system on chip, and a Python PC program for the analysis, storage and visualization of measured data. Both the Successive Approximation Register (SAR) and Delta-Sigma (DS) ADCs contained in the PSoC 5LP are used for measuring the voltage drop across the shunt resistor. The SAR ADC samples data at a 10 times higher rate than the DS ADC, so the input range of the DS ADC can be adjusted based on data measured by the SAR ADC, thus enabling the extension of the current measuring range by 28%. The implemented DAQ device is connected to a computer through a USB port and tested with the developed Python PC program.
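The range-extension idea, using the fast coarse SAR reading to choose the DS ADC input range, can be sketched as follows (the range table is hypothetical; the real PSoC firmware would switch analog gain settings rather than return a tuple):

```python
def select_range(sar_reading_mA, ranges=((0.0, 1.0), (1.0, 10.0), (10.0, 100.0))):
    """Pick the Delta-Sigma ADC input range that brackets the coarse SAR reading.

    ranges is a hypothetical table of (low, high) current bounds in mA.
    """
    for lo, hi in ranges:
        if lo <= sar_reading_mA < hi:
            return (lo, hi)
    return ranges[-1]  # clamp out-of-table readings to the widest range

print(select_range(0.4))   # (0.0, 1.0): sleep-mode current, finest range
print(select_range(42.0))  # (10.0, 100.0): active-mode current, coarsest range
```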

  3. ICF-based classification and measurement of functioning.

    Science.gov (United States)

    Stucki, G; Kostanjsek, N; Ustün, B; Cieza, A

    2008-09-01

If we aim towards a comprehensive understanding of human functioning and the development of comprehensive programs to optimize the functioning of individuals and populations, we need to develop suitable measures. The approval of the International Classification of Functioning, Disability and Health (ICF) in 2001 by the 54th World Health Assembly as the first universally shared model and classification of functioning, disability and health therefore marks an important step in the development of measurement instruments and ultimately in our understanding of functioning, disability and health. The acceptance and use of the ICF as a reference framework and classification has been facilitated by its development in a worldwide, comprehensive consensus process and the increasing evidence regarding its validity. However, the broad acceptance and use of the ICF as a reference framework and classification will also depend on the resolution of conceptual and methodological challenges relevant for the classification and measurement of functioning. This paper therefore describes first how the ICF categories can serve as building blocks for the measurement of functioning, and then the current state of the development of ICF-based practical tools and international standards such as the ICF Core Sets. Finally, it illustrates how to map the world of measures to the ICF and vice versa, and the methodological principles relevant for the transformation of information obtained with a clinical test or a patient-oriented instrument to the ICF, as well as the development of ICF-based clinical and self-reported measurement instruments.

  4. Computer Vision Based Measurement of Wildfire Smoke Dynamics

    Directory of Open Access Journals (Sweden)

    BUGARIC, M.

    2015-02-01

Full Text Available This article presents a novel method for measurement of wildfire smoke dynamics based on computer vision and augmented reality techniques. Smoke dynamics is an important feature in video smoke detection that can distinguish smoke from visually similar phenomena. However, most of the existing smoke detection systems are not capable of measuring the real-world size of the detected smoke regions. Using computer vision and GIS-based augmented reality, we measure the real dimensions of smoke plumes and observe the change in size over time. The measurements are performed on offline video data with known camera parameters and location. The observed data are analyzed in order to create a classifier that could be used to eliminate certain categories of false alarms induced by phenomena with different dynamics than smoke. We carried out an offline evaluation in which we measured the improvement in the detection process achieved using the proposed smoke dynamics characteristics. The results show a significant increase in algorithm performance, especially in terms of reducing the false-alarm rate. From this it follows that the proposed method for measurement of smoke dynamics could be used to improve existing smoke detection algorithms, or taken into account when designing new ones.

  5. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors in interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). To provide a review related to reproducibility of event-based and time-interval measurement, and to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement; in addition, to determine if it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed values of inter- and intra-judge agreement greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion), intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
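Interval-based codings from two judges can be compared with the kind of percentage-agreement computation the reviewed studies report. The codings below are invented for illustration, with 5-s intervals assumed:

```python
def interval_agreement(judge_a, judge_b):
    """Percentage agreement between two judges' interval codings.

    Each list holds one boolean per fixed-length interval (e.g. 5 s):
    True = stuttering observed in that interval.
    """
    if len(judge_a) != len(judge_b):
        raise ValueError("judges must code the same number of intervals")
    matches = sum(a == b for a, b in zip(judge_a, judge_b))
    return 100.0 * matches / len(judge_a)

# Hypothetical codings of the same 40-s speech sample by two judges.
a = [True, False, True, True, False, False, True, False]
b = [True, False, False, True, False, False, True, True]
print(interval_agreement(a, b))  # 6 of 8 intervals match -> 75.0
```

Real studies typically go beyond raw percentage agreement (e.g. chance-corrected indices); this sketch shows only the simplest form.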

  6. A measurement-based performability model for a multiprocessor system

    Science.gov (United States)

Hsueh, M. C.; Iyer, Ravi K.; Trivedi, K. S.

    1987-01-01

A measurement-based performability model based on real error data collected on a multiprocessor system is described. Model development from the raw error data to the estimation of cumulative reward is described. Both normal and failure behavior of the system are characterized. The measured data show that the holding times in key operational and failure states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different failure types and recovery procedures.
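The reward estimation described above can be sketched for a small semi-Markov model: per-state reward rates are weighted by the fraction of real time spent in each state, which combines the embedded chain's stationary distribution with the mean holding times. All numbers below are illustrative, not the paper's measured data:

```python
import numpy as np

# Embedded-chain transition matrix over three states: normal, degraded, failed.
P = np.array([[0.0, 0.7, 0.3],
              [0.8, 0.0, 0.2],
              [1.0, 0.0, 0.0]])
holding = np.array([100.0, 20.0, 5.0])   # mean holding time per state (illustrative)
reward  = np.array([1.0, 0.5, 0.0])      # service-rate-based reward rate per state

# Stationary distribution of the embedded chain: solve pi = pi P
# via the eigenvector of P^T for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Weight by holding times to get the fraction of real time spent in each state.
time_frac = pi * holding / np.sum(pi * holding)
steady_state_reward = float(time_frac @ reward)
print(round(steady_state_reward, 4))  # 107/116.2, about 0.9208
```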

  7. An improved visualization-based force-measurement technique for short-duration hypersonic facilities

    Energy Technology Data Exchange (ETDEWEB)

    Laurence, Stuart J.; Karl, Sebastian [Institute of Aerodynamics and Flow Technology, Spacecraft Section, German Aerospace Center (DLR), Goettingen (Germany)

    2010-06-15

This article is concerned with describing and exploring the limitations of an improved version of a recently proposed visualization-based technique for the measurement of forces and moments in short-duration hypersonic wind tunnels. The technique is based on tracking the motion of a free-flying body over a sequence of high-speed visualizations; while this idea is not new in itself, the use of high-speed digital cinematography combined with a highly accurate least-squares tracking algorithm allows improved results over what has previously been possible with such techniques. The precision of the technique is estimated through the analysis of artificially constructed and experimental test images, and the resulting error in acceleration measurements is characterized. For wind-tunnel scale models, position measurements to within a few microns are shown to be readily attainable. Image data from two previous experimental studies in the T5 hypervelocity shock tunnel are then reanalyzed with the improved technique: the uncertainty in the mean drag acceleration is shown to be reduced to the order of the flow unsteadiness, 2-3%, and time-resolved acceleration measurements are also shown to be possible. The response time of the technique for the configurations studied is estimated to be ∼0.5 ms. Comparisons with computations using the DLR TAU code also yield agreement to within the overall experimental uncertainty. Measurement of the pitching moment for blunt geometries still appears challenging, however. (orig.)
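Extracting a mean acceleration from tracked positions reduces to a least-squares fit of a quadratic trajectory; a noise-free sketch with hypothetical frame data (not the paper's tracking algorithm, which fits image templates rather than coordinates):

```python
import numpy as np

# Hypothetical tracked model positions (mm) at 10 kHz frame intervals.
t = np.arange(0.0, 0.01, 1e-4)               # 100 frames over 10 ms
a_true = 500.0                               # mm/s^2, constant acceleration
x = 2.0 + 30.0 * t + 0.5 * a_true * t**2     # free-flight trajectory, no noise

# Least-squares quadratic fit: x(t) = x0 + v0*t + 0.5*a*t^2.
coeffs = np.polyfit(t, x, 2)
a_est = 2.0 * coeffs[0]                      # leading coefficient is a/2
print(round(a_est, 3))                       # recovers a_true = 500.0
```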

  8. An Energy-Based Similarity Measure for Time Series

    Directory of Open Access Journals (Sweden)

    Pierre Brunagel

    2007-11-01

Full Text Available A new similarity measure for time series analysis, called SimilB, based on the cross-ΨB-energy operator (2004), is introduced. ΨB is a nonlinear measure which quantifies the interaction between two time series. Compared to the Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series by using their first and second derivatives. SimilB is well suited for both nonstationary and stationary time series, particularly those presenting discontinuities. Some new properties of ΨB are presented. In particular, we show that ΨB as a similarity measure is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset and compared to the CC and the ED measures.
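A rough sketch of the idea, assuming a gradient-based discretization of the cross-energy operator and a correlation-style normalization; the paper's exact definitions and discretization may differ:

```python
import numpy as np

def cross_psi_b(x, y):
    """Discrete sketch of a cross energy operator:
    Psi_B(x, y) ~ x' y' - 0.5 * (x y'' + x'' y),
    with derivatives approximated by np.gradient (an assumption)."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return dx * dy - 0.5 * (x * ddy + ddx * y)

def similb(x, y):
    """Normalized similarity built from the cross operator, by analogy with a
    correlation coefficient (illustrative normalization)."""
    num = np.sum(cross_psi_b(x, y))
    den = np.sqrt(np.sum(cross_psi_b(x, x)) * np.sum(cross_psi_b(y, y)))
    return num / den

t = np.linspace(0.0, 1.0, 400)
s1 = np.sin(2 * np.pi * 5 * t)
s2 = 3.0 * s1                      # scaled copy: similarity should stay ~1
print(round(similb(s1, s2), 3))    # scale robustness: prints 1.0
```

Note how the amplitude factor cancels in the normalization, which is the scale-robustness property the abstract mentions.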

  9. Temperature measuring system based on ADuC812 MCU

    International Nuclear Information System (INIS)

    Zhou Dongmei; Ge Liangquan; Cheng Feng; Li Jinfeng

    2009-01-01

This paper introduces a temperature measuring system based on the I²C bus, composed of an ADuC812 single-chip microcomputer, the new-type digital temperature sensor TMP100 and an LED display circuit. The I²C bus, invented by Philips, needs only two signal lines (SDA, SCL) and realizes bidirectional synchronous data transmission. Setting the device address in hardware completely avoids the disadvantages of device-selection addressing, making the hardware system simpler and its extension more flexible. The key part of the system is the ADuC812 single-chip microcomputer, which is compatible with the MCS-51 and made by the American company Analog Devices. The software is written in 8051 assembly language. The I²C-based single-chip data acquisition and measurement system fully shows the features of flexibility, precision and high integration, and the proposed high-accuracy measurement method realizes environment temperature measurement. (authors)
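The sensor-side arithmetic can be illustrated without hardware: per the TMP100 datasheet, the temperature register is a left-justified 12-bit two's-complement value with 0.0625 °C per LSB (I²C wiring and bus access are omitted here; only the byte conversion is shown):

```python
def tmp100_raw_to_celsius(msb, lsb):
    """Convert the two TMP100 temperature-register bytes to degrees Celsius.

    The device returns a left-justified 12-bit two's-complement value at a
    resolution of 0.0625 C/LSB (datasheet figures).
    """
    raw = (msb << 4) | (lsb >> 4)   # assemble the 12-bit value
    if raw & 0x800:                 # sign bit set: negative temperature
        raw -= 1 << 12
    return raw * 0.0625

print(tmp100_raw_to_celsius(0x19, 0x00))   # 25.0
print(tmp100_raw_to_celsius(0xE7, 0x00))   # -25.0
```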

  10. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for the experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  11. High Precision Infrared Temperature Measurement System Based on Distance Compensation

    Directory of Open Access Journals (Sweden)

    Chen Jing

    2017-01-01

Full Text Available To meet the need of real-time remote monitoring of human body surface temperature for optical rehabilitation therapy, a non-contact high-precision real-time temperature measurement method based on distance compensation was proposed, and the system design was carried out. The microcontroller controls the infrared temperature measurement module and the laser ranging module to collect temperature and distance data. The compensation formula of temperature with distance was fitted using the least-squares method. Testing was performed on different individuals to verify the accuracy of the system. The results indicate that the designed non-contact infrared temperature measurement system has a residual error of less than 0.2°C and a response time of less than 0.1 s in the range of 0 to 60 cm. This provides a reference for developing long-distance temperature measurement equipment for optical rehabilitation therapy.
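The distance-compensation fit can be sketched as a least-squares line inverted into a correction term. The calibration values below are invented for illustration; the paper's actual compensation formula is not given in the abstract:

```python
import numpy as np

# Hypothetical calibration: apparent IR reading (C) of a 36.5 C target vs distance.
distance_cm = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
reading_c   = np.array([36.5, 36.3, 36.1, 35.9, 35.7, 35.5, 35.3])

# Least-squares line: reading = c1 * distance + c0.
c1, c0 = np.polyfit(distance_cm, reading_c, 1)

def compensate(reading, distance):
    """Add back the distance-dependent drop estimated by the fitted slope."""
    return reading - c1 * distance

print(round(compensate(35.5, 50.0), 2))  # recovers 36.5, the reading at 0 cm
```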

  12. Coherent lidar wind measurements from the Space Station base using 1.5 m all-reflective optics

    Science.gov (United States)

    Bilbro, J. W.; Beranek, R. G.

    1987-01-01

    This paper discusses the space-based measurement of atmospheric winds from the point of view of the requirements of the optical system of a coherent CO2 lidar. A brief description of the measurement technique is given and a discussion of previous study results provided. The telescope requirements for a Space Station based lidar are arrived at through discussions of the desired system sensitivity and the need for lag angle compensation.

  13. Measurable realistic image-based 3D mapping

    Science.gov (United States)

    Liu, W.; Wang, J.; Wang, J. J.; Ding, W.; Almagbile, A.

    2011-12-01

Maps with 3D visual models are becoming a remarkable feature of 3D map services. High-resolution image data are obtained for the construction of 3D visualized models. The 3D map not only provides the capabilities of 3D measurements and knowledge mining, but also provides the virtual experience of places of interest, as demonstrated in Google Earth. Applications of 3D maps are expanding into the areas of architecture, property management, and urban environment monitoring. However, the reconstruction of high-quality 3D models is time consuming, and requires robust hardware and powerful software to handle the enormous amount of data. This is especially true for the automatic implementation of 3D models and the representation of complicated surfaces, which still need improvements in visualisation techniques. The shortcoming of 3D model-based maps is their limited detail coverage, since a user can only view and measure objects that are already modelled in the virtual environment. This paper proposes and demonstrates a 3D map concept that is realistic and image-based, and that enables geometric measurements and geo-location services. Additionally, image-based 3D maps provide more detailed information about the real world than 3D model-based maps. The image-based 3D maps use geo-referenced stereo images or panoramic images. The geometric relationships between objects in the images can be resolved from the geometric model of the stereo images. The panoramic function makes 3D maps more interactive with users and also creates an interesting immersive circumstance. Actually, unmeasurable image-based 3D maps already exist, such as Google Street View, but they only provide virtual experiences in terms of photos. The topographic and terrain attributes, such as shapes and heights, though, are omitted. This paper also discusses the potential for using a low-cost land Mobile Mapping System (MMS) to implement realistic image-based 3D mapping, and evaluates the positioning accuracy that a measurable
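For geo-referenced stereo images, the basic geometric measurement rests on standard rectified pinhole stereo geometry, Z = f·B/d; a minimal sketch with illustrative camera values (not parameters from the paper):

```python
def stereo_depth(disparity_px, focal_px, baseline_m):
    """Depth from a rectified stereo pair via the standard relation Z = f*B/d.

    disparity_px : horizontal pixel offset of a point between the two images
    focal_px     : focal length expressed in pixels
    baseline_m   : distance between the two camera centers in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 1000 px focal length, 0.5 m baseline, 25 px disparity -> 20 m depth
print(stereo_depth(25.0, 1000.0, 0.5))
```

Note the inverse relation: halving the disparity doubles the estimated depth, which is why distant objects carry larger depth uncertainty.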

  14. Measurement errors in polymerase chain reaction are a confounding factor for a correct interpretation of 5-HTTLPR polymorphism effects on lifelong premature ejaculation: a critical analysis of a previously published meta-analysis of six studies.

    Science.gov (United States)

    Janssen, Paddy K C; Olivier, Berend; Zwinderman, Aeilko H; Waldinger, Marcel D

    2014-01-01

To analyze a recently published meta-analysis of six studies on 5-HTTLPR polymorphism and lifelong premature ejaculation (PE). Calculation of fraction observed and expected genotype frequencies and Hardy-Weinberg equilibrium (HWE) of cases and controls. LL, SL and SS genotype frequencies of patients were subtracted from genotype frequencies of an ideal population (LL 25%, SL 50%, SS 25%, p = 1 for HWE). Analysis of the PCRs of the six studies and re-analysis of the analysis and odds ratios (ORs) reported in the recently published meta-analysis. Three studies deviated from HWE in patients and one study deviated from HWE in controls. In the three studies in-HWE the mean deviation of genotype frequencies from a theoretical population not deviating from HWE was small: LL (1.7%), SL (-2.3%), SS (0.6%). In the three studies not-in-HWE the mean deviation of genotype frequencies was high: LL (-3.3%), SL (-18.5%) and SS (21.8%), with a very low percentage of the SL genotype concurrent with a very high percentage of the SS genotype. The most serious PCR deviations were reported in the three not-in-HWE studies. The three in-HWE studies had normal ORs. In contrast, the three not-in-HWE studies had low ORs. In the three studies not-in-HWE and with very low ORs, inadequate PCR analysis and/or inadequate interpretation of its gel electrophoresis resulted in very low SL and a resulting shift to very high SS genotype frequency outcomes. Consequently, the PCRs of these three studies are not reliable. Failure to note the inadequacy of PCR tests makes such PCRs a confounding factor in the clinical interpretation of genetic studies. Currently, a meta-analysis can only be performed on the three studies in-HWE. However, based on the three studies in-HWE with ORs of about 1, there is no indication that in men with lifelong PE the frequency of the LL, SL and SS genotypes deviates from the general male population and/or that the SL or SS genotype is in any way associated with lifelong PE.
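The Hardy-Weinberg calculation underlying this kind of re-analysis can be reproduced directly: expected genotype counts follow from the allele frequencies implied by the observed counts (the sample below is invented to match the "ideal population" of the abstract, not taken from the six studies):

```python
def hwe_expected(n_LL, n_SL, n_SS):
    """Expected LL/SL/SS counts under Hardy-Weinberg equilibrium.

    Allele frequencies are estimated from the observed genotype counts:
    p(L) = (2*LL + SL) / (2*N), q(S) = 1 - p.
    """
    n = n_LL + n_SL + n_SS
    p = (2 * n_LL + n_SL) / (2 * n)   # L-allele frequency
    q = 1.0 - p                        # S-allele frequency
    return p * p * n, 2 * p * q * n, q * q * n

# An "ideal" sample of 200: 25% LL, 50% SL, 25% SS is exactly in HWE,
# so observed and expected counts coincide.
exp_LL, exp_SL, exp_SS = hwe_expected(50, 100, 50)
print(exp_LL, exp_SL, exp_SS)  # 50.0 100.0 50.0
```

A sample whose observed counts deviate from these expectations (e.g. a deficit of SL with an excess of SS, as in the three criticized studies) would fail a goodness-of-fit test against them.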

  15. Measurement errors in polymerase chain reaction are a confounding factor for a correct interpretation of 5-HTTLPR polymorphism effects on lifelong premature ejaculation: a critical analysis of a previously published meta-analysis of six studies.

    Directory of Open Access Journals (Sweden)

    Paddy K C Janssen

Full Text Available OBJECTIVE: To analyze a recently published meta-analysis of six studies on 5-HTTLPR polymorphism and lifelong premature ejaculation (PE). METHODS: Calculation of fraction observed and expected genotype frequencies and Hardy-Weinberg equilibrium (HWE) of cases and controls. LL, SL and SS genotype frequencies of patients were subtracted from genotype frequencies of an ideal population (LL 25%, SL 50%, SS 25%, p = 1 for HWE). Analysis of the PCRs of the six studies and re-analysis of the analysis and odds ratios (ORs) reported in the recently published meta-analysis. RESULTS: Three studies deviated from HWE in patients and one study deviated from HWE in controls. In the three studies in-HWE the mean deviation of genotype frequencies from a theoretical population not deviating from HWE was small: LL (1.7%), SL (-2.3%), SS (0.6%). In the three studies not-in-HWE the mean deviation of genotype frequencies was high: LL (-3.3%), SL (-18.5%) and SS (21.8%), with a very low percentage of the SL genotype concurrent with a very high percentage of the SS genotype. The most serious PCR deviations were reported in the three not-in-HWE studies. The three in-HWE studies had normal ORs. In contrast, the three not-in-HWE studies had low ORs. CONCLUSIONS: In the three studies not-in-HWE and with very low ORs, inadequate PCR analysis and/or inadequate interpretation of its gel electrophoresis resulted in very low SL and a resulting shift to very high SS genotype frequency outcomes. Consequently, the PCRs of these three studies are not reliable. Failure to note the inadequacy of PCR tests makes such PCRs a confounding factor in the clinical interpretation of genetic studies. Currently, a meta-analysis can only be performed on the three studies in-HWE. However, based on the three studies in-HWE with ORs of about 1, there is no indication that in men with lifelong PE the frequency of the LL, SL and SS genotypes deviates from the general male population and/or that the SL or SS genotype is in any way associated with lifelong PE.

  16. Measurement of energy efficiency based on economic foundations

    International Nuclear Information System (INIS)

    Filippini, Massimo; Hunt, Lester C.

    2015-01-01

Energy efficiency policy is seen as a very important activity by almost all policy makers. In practical energy policy analysis, the typical indicator used as a proxy for energy efficiency is energy intensity. However, this simple indicator is not necessarily an accurate measure, given that changes in energy intensity are a function of changes in several factors as well as ‘true’ energy efficiency; hence, it is difficult to draw conclusions for energy policy based upon simple energy intensity measures. Related to this, some published academic papers over the last few years have attempted to use empirical methods to measure the efficient use of energy based on the economic theory of production. However, these studies do not generally provide a systematic discussion of either the theoretical basis or the possible parametric empirical approaches that are available for estimating the level of energy efficiency. The objective of this paper, therefore, is to sketch out and explain from an economic perspective the theoretical framework as well as the empirical methods for measuring the level of energy efficiency. Additionally, in the second part of the paper, some of the empirical studies that have attempted to measure energy efficiency using such an economics approach are summarized and discussed.

  17. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted-for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we need to worry about such uncertainties. To this end, we propose a self-guaranteed protocol for the verification of quantum computation under the scheme of measurement-based quantum computation, where no prior trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  18. Evaluating airline energy efficiency: An integrated approach with Network Epsilon-based Measure and Network Slacks-based Measure

    International Nuclear Information System (INIS)

    Xu, Xin; Cui, Qiang

    2017-01-01

    This paper focuses on evaluating airline energy efficiency, which is first divided into four stages: the Operations Stage, Fleet Maintenance Stage, Services Stage, and Sales Stage. The new four-stage network structure of airline energy efficiency is a modification of existing models. A new approach, integrating the Network Epsilon-based Measure with the Network Slacks-based Measure, is applied to assess the overall energy efficiency and divisional efficiency of 19 international airlines from 2008 to 2014. The influencing factors of airline energy efficiency are analyzed through regression analysis. The results indicate the following: 1. The integrated model can identify the benchmark airlines in the overall system and in each stage. 2. Most airlines' energy efficiencies remained steady during the period, apart from some sharp fluctuations; the efficiency decreases were concentrated mainly in 2008–2011, reflecting the financial crisis in the USA. 3. The average fleet age is positively correlated with overall energy efficiency, and each divisional efficiency has different significant influencing factors. - Highlights: • An integrated approach with Network Epsilon-based Measure and Network Slacks-based Measure is developed. • 19 airlines' energy efficiencies are evaluated. • Garuda Indonesia has the highest overall energy efficiency.

  19. Real cell overlay measurement through design based metrology

    Science.gov (United States)

    Yoo, Gyun; Kim, Jungchan; Park, Chanha; Lee, Taehyeong; Ji, Sunkeun; Jo, Gyoyeon; Yang, Hyunjo; Yim, Donggyu; Yamamoto, Masahiro; Maruyama, Kotaro; Park, Byungjun

    2014-04-01

    Until recent device nodes, lithography has struggled to improve its resolution limit. Even though next-generation lithography technology is now facing various difficulties, several innovative resolution enhancement technologies based on the 193 nm wavelength were introduced and implemented to keep up the trend of device scaling. Scanner makers keep developing state-of-the-art exposure systems which guarantee higher productivity and meet ever more aggressive overlay specifications. "The scaling reduction of the overlay error has been a simple matter of the capability of exposure tools. However, it is clear that the scanner contributions may no longer be the majority component in total overlay performance. The ability to control correctable overlay components is paramount to achieve the desired performance.(2)" In a manufacturing fab, the overlay error determined by conventional overlay measurement, using an overlay mark based on IBO or DBO, often does not represent the physical placement error in the cell area of a memory device. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion caused by etching or CMP can also be a source of the mismatch. Therefore, the requirement for direct overlay measurement in the cell pattern is gradually increasing in the manufacturing field, and also at the development level. In order to overcome the mismatch between conventional overlay measurement and the real layer-to-layer placement error in the cell area of a memory device, we suggest an alternative overlay measurement method utilizing a design-based metrology (DBM) tool. A basic concept of this method is shown in Figure 1. A CD-SEM measurement of the overlay error between layers 1 and 2 would be ideal, but it takes too long to extract a large amount of data at wafer level. An E-beam-based DBM tool provides the speed to cover the whole wafer with high repeatability.
It is enabled by using the design as a

  20. Indirect measurement of molten steel level in tundish based on laser triangulation

    Energy Technology Data Exchange (ETDEWEB)

    Su, Zhiqi; He, Qing, E-mail: heqing@ise.neu.edu.cn; Xie, Zhi [State Key Laboratory of Synthetical Automation for Process Industries, School of Information Science and Engineering, Northeastern University, Shenyang 110819 (China)

    2016-03-15

    For real-time and precise measurement of the molten steel level in the tundish during continuous casting, both the slag level and the slag thickness are needed. Of these, the problem of slag thickness measurement was solved in our previous work. In this paper, a systematic solution for slag level measurement based on laser triangulation is proposed. Unlike traditional laser triangulation, several measures are taken to improve precision and robustness. First, a laser line is adopted for multi-position measurement to overcome the deficiency of a single-point laser range finder caused by the uneven surface of the slag. Second, the key parameters, such as the installation angle and the minimum required laser power, are analyzed and determined based on gray-body radiation theory to fulfill the rigorous accuracy requirement. Third, two kinds of severe noise in the acquired images, caused respectively by heat radiation and by electromagnetic interference (EMI), are removed via the morphological characteristics of the liquid slag and the color difference between the EMI and the laser signals. Fourth, as false targets created by stationary slag usually disrupt the measurement, valid slag signals are distinguished from false ones when calculating the slag level. The molten steel level is then obtained by subtracting the slag thickness from the slag level. The measuring error of this solution has been verified in steel-plant applications: ±2.5 mm during steady casting and ±3.2 mm at the end of casting.
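    The final step above is simple arithmetic, and the underlying triangulation geometry reduces, in the ideal pinhole case, to a similar-triangles relation. A minimal sketch (the numbers, function names, and simplified geometry are illustrative assumptions, not taken from the paper):

```python
def triangulation_range_mm(baseline_mm, focal_mm, image_offset_mm):
    """Idealized pinhole laser triangulation: a camera offset from the
    laser by a known baseline sees the laser spot shifted in the image
    plane by an amount inversely proportional to the target range.
    (The paper's analysis of installation angle and laser power is more
    involved; this is the similar-triangles core only.)"""
    return baseline_mm * focal_mm / image_offset_mm

def molten_steel_level_mm(slag_level_mm, slag_thickness_mm):
    # Steel level = slag surface level minus slag thickness
    return slag_level_mm - slag_thickness_mm

# Hypothetical numbers, chosen only to exercise the formulas
rng = triangulation_range_mm(baseline_mm=500.0, focal_mm=50.0, image_offset_mm=25.0)
print(rng)                                   # 1000.0
print(molten_steel_level_mm(1000.0, 120.0))  # 880.0
```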

  1. Spherical aberration compensation method for long focal-length measurement based on Talbot interferometry

    Science.gov (United States)

    Luo, Yupeng; Huang, Xiao; Bai, Jian; Du, Juan; Liu, Qun; Luo, Yujie; Luo, Jia

    2017-08-01

    Large-aperture, long-focal-length lenses are widely used in high-energy laser systems. The method based on Talbot interferometry is a reliable way to measure the focal length of such elements. By employing a divergent beam and two gratings of different periods, this method can realize full-aperture measurement with higher accuracy and better repeatability. However, it does not take into account the spherical aberration of the measured lens, which bends the moiré fringes and introduces measurement error. Furthermore, in long-focal-length measurement with a divergent beam, this error is an important factor affecting the measurement accuracy. In this paper, we propose a new spherical aberration compensation method which significantly reduces the measurement error. Characterized by a centrally symmetric scanning window, the proposed method is based on the relationship between spherical aberration and the lens aperture. The angle of the moiré fringes in each scanning window is retrieved by Fourier analysis and statistically fitted to estimate a globally optimal value for spherical-aberration-free focal length calculation. Simulations and experiments have been carried out. Compared to previous work, the proposed method reduces the relative measurement error by 50%. The effects of the scanning window size and the shift step length on the results are also discussed.

  2. Indirect measurement of molten steel level in tundish based on laser triangulation

    Science.gov (United States)

    Su, Zhiqi; He, Qing; Xie, Zhi

    2016-03-01

    For real-time and precise measurement of the molten steel level in the tundish during continuous casting, both the slag level and the slag thickness are needed. Of these, the problem of slag thickness measurement was solved in our previous work. In this paper, a systematic solution for slag level measurement based on laser triangulation is proposed. Unlike traditional laser triangulation, several measures are taken to improve precision and robustness. First, a laser line is adopted for multi-position measurement to overcome the deficiency of a single-point laser range finder caused by the uneven surface of the slag. Second, the key parameters, such as the installation angle and the minimum required laser power, are analyzed and determined based on gray-body radiation theory to fulfill the rigorous accuracy requirement. Third, two kinds of severe noise in the acquired images, caused respectively by heat radiation and by electromagnetic interference (EMI), are removed via the morphological characteristics of the liquid slag and the color difference between the EMI and the laser signals. Fourth, as false targets created by stationary slag usually disrupt the measurement, valid slag signals are distinguished from false ones when calculating the slag level. The molten steel level is then obtained by subtracting the slag thickness from the slag level. The measuring error of this solution has been verified in steel-plant applications: ±2.5 mm during steady casting and ±3.2 mm at the end of casting.

  3. Graph-Based Cooperative Localization Using Symmetric Measurement Equations.

    Science.gov (United States)

    Gulati, Dhiraj; Zhang, Feihu; Clarke, Daniel; Knoll, Alois

    2017-06-17

    Precise localization is a key requirement for the success of highly assisted or autonomous vehicles. The diminishing cost of hardware has resulted in a proliferation of the number of sensors in the environment. Cooperative localization (CL) presents itself as a feasible and effective solution for localizing the ego-vehicle and its neighboring vehicles. However, one of the major challenges to fully realize the effective use of infrastructure sensors for jointly estimating the state of a vehicle in cooperative vehicle-infrastructure localization is an effective data association. In this paper, we propose a method which implements symmetric measurement equations within factor graphs in order to overcome the data association challenge with a reduced bandwidth overhead. Simulated results demonstrate the benefits of the proposed approach in comparison with our previously proposed approach of topology factors.

  4. A Game Map Complexity Measure Based on Hamming Distance

    Science.gov (United States)

    Li, Yan; Su, Pan; Li, Wenliang

    With the booming PC game market, game AI has attracted more and more research. The interest and difficulty of a game are related to the map used in its scenarios. In addition, the path-finding efficiency in a game is affected by the complexity of the map. In this paper, a novel complexity measure based on Hamming distance, called the Hamming complexity, is introduced. This measure estimates the complexity of a binary tileworld. We experimentally demonstrate that the Hamming complexity is strongly correlated with the efficiency of the A* algorithm, and it is therefore a useful reference for designers when developing a game map.
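    The abstract does not spell out the measure's formula. As a hedged sketch, one plausible Hamming-distance-based complexity for a binary tile map averages the Hamming distance between adjacent rows and adjacent columns (the exact definition in the paper may differ):

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary sequences."""
    return sum(x != y for x, y in zip(a, b))

def hamming_complexity(grid):
    """Average per-cell Hamming distance between adjacent rows and
    adjacent columns of a binary tile map (1 = obstacle, 0 = free).
    Higher values indicate a more fragmented map; this is an
    illustrative definition, not necessarily the paper's exact formula."""
    rows, cols = len(grid), len(grid[0])
    row_d = sum(hamming(grid[i], grid[i + 1]) for i in range(rows - 1))
    col_d = sum(hamming([grid[i][j] for i in range(rows)],
                        [grid[i][j + 1] for i in range(rows)])
                for j in range(cols - 1))
    pairs = (rows - 1) * cols + (cols - 1) * rows   # number of comparisons
    return (row_d + col_d) / pairs

print(hamming_complexity([[0, 0], [0, 0]]))  # 0.0 -- uniform, easy map
print(hamming_complexity([[0, 1], [1, 0]]))  # 1.0 -- checkerboard, maximal
```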

  5. A microprocessor based picture analysis system for automatic track measurements

    International Nuclear Information System (INIS)

    Heinrich, W.; Trakowski, W.; Beer, J.; Schucht, R.

    1982-01-01

    In the last few years picture analysis became a powerful technique for measurements of nuclear tracks in plastic detectors. For this purpose rather expensive commercial systems are available. Two inexpensive microprocessor based systems with different resolution were developed. The video pictures of particles seen through a microscope are digitized in real time and the picture analysis is done by software. The microscopes are equipped with stages driven by stepping motors, which are controlled by separate microprocessors. A PDP 11/03 supervises the operation of all microprocessors and stores the measured data on its mass storage devices. (author)

  6. Atmospheric profiles from active space-based radio measurements

    Science.gov (United States)

    Hardy, Kenneth R.; Hinson, David P.; Tyler, G. L.; Kursinski, E. R.

    1992-01-01

    The paper describes determinations of atmospheric profiles from space-based radio measurements and the retrieval methodology used, with special attention given to the measurement procedure and the characteristics of the soundings. It is speculated that reliable profiles of the terrestrial atmosphere can be obtained by the occultation technique from the surface to a height of about 60 km. With the full complement of 21 Global Positioning System (GPS) satellites and one GPS receiver in a sun-synchronous polar orbit, a maximum of 42 soundings could be obtained per complete orbit, or about 670 per day, providing almost uniform global coverage.

  7. Analysis of earth albedo effect on sun sensor measurements based on theoretical model and mission experience

    Science.gov (United States)

    Brasoveanu, Dan; Sedlak, Joseph

    1998-01-01

    Analysis of flight data from previous missions indicates that anomalous Sun sensor readings could be caused by Earth albedo interference. A previous Sun sensor study presented a detailed mathematical model of this effect. The model can be used to study the effect of both diffuse and specular reflections and to improve Sun angle determination based on perturbed Sun sensor measurements, satellite position, and an approximate knowledge of attitude. The model predicts that diffusely reflected light can cause errors of up to 10 degrees in Coarse Sun Sensor (CSS) measurements and 5 to 10 arc sec in Fine Sun Sensor (FSS) measurements, depending on spacecraft orbit and attitude. The accuracy of these sensors is affected as long as part of the illuminated Earth surface is present in the sensor field of view. Digital Sun Sensors (DSS) respond in a different manner to the Earth albedo interference. Most of the time DSS measurements are not affected, but for brief periods of time the Earth albedo can cause errors which are a multiple of the sensor least significant bit and may exceed one degree. This paper compares model predictions with Tropical Rainfall Measuring Mission (TRMM) CSS measurements in order to validate and refine the model. Methods of reducing and mitigating the impact of Earth albedo are discussed. The CSS sensor errors are roughly proportional to the Earth albedo coefficient. Photocells that are sensitive only to ultraviolet emissions would reduce the effective Earth albedo by up to a thousand times, virtually eliminating all errors caused by Earth albedo interference.

  8. The correction of vibration in frequency scanning interferometry based absolute distance measurement system for dynamic measurements

    Science.gov (United States)

    Lu, Cheng; Liu, Guodong; Liu, Bingguo; Chen, Fengdong; Zhuang, Zhitao; Xu, Xinke; Gan, Yu

    2015-10-01

    Absolute distance measurement systems are of significant interest in the field of metrology; they could improve the manufacturing efficiency and accuracy of large assemblies in fields such as aircraft construction, automotive engineering, and the production of modern windmill blades. Frequency scanning interferometry demonstrates noticeable advantages as an absolute distance measurement technique: it has high precision and does not depend on a cooperative target. In this paper, the influence of inevitable vibration on the frequency-scanning-interferometry-based absolute distance measurement system is analyzed. The distance spectrum is broadened by the Doppler effect caused by vibration, which introduces a measurement error more than 10³ times larger than the change in optical path difference. To reduce the influence of vibration, the changes in the optical path difference are monitored by a frequency-stabilized laser which runs parallel to the frequency scanning interferometry. An experiment has verified the effectiveness of this method.

  9. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
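    The two quantities being correlated are easy to state concretely: the economic misery index is simply inflation plus unemployment, and the comparison series is its trailing moving average. A small illustrative sketch (the data and variable names are invented):

```python
def misery_index(inflation, unemployment):
    """Economic misery index: inflation rate plus unemployment rate,
    year by year (both in percent)."""
    return [i + u for i, u in zip(inflation, unemployment)]

def trailing_mean(series, window):
    """Moving average over the previous `window` values, inclusive of
    the current one -- the paper's best fit used an 11-year window."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

m = misery_index([2.0, 3.0, 4.0], [5.0, 6.0, 7.0])
print(m)                    # [7.0, 9.0, 11.0]
print(trailing_mean(m, 2))  # [8.0, 10.0]
```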

  10. Books Average Previous Decade of Economic Misery

    Science.gov (United States)

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159

  11. Developing a community-based flood resilience measurement standard

    Science.gov (United States)

    Keating, Adriana; Szoenyi, Michael; Chaplowe, Scott; McQuistan, Colin; Campbell, Karen

    2015-04-01

    Given the increased attention to resilience-strengthening in international humanitarian and development work, there has been concurrent interest in its measurement and in the overall accountability of "resilience strengthening" initiatives. The literature is reaching beyond the polemic of defining resilience to its measurement. Similarly, donors increasingly expect organizations to go beyond claiming resilience programming to measuring and showing it. However, key questions must be asked, in particular "Resilience of whom and to what?". There is no one-size-fits-all solution: the approach to measuring resilience depends on the audience and the purpose of the measurement exercise. A resilience measurement system must be derived from the question it seeks to answer and must be specific. This session highlights key lessons from the Zurich Flood Resilience Alliance approach to developing a flood resilience measurement standard to measure and assess the impact of community-based flood resilience interventions, and to inform decision-making to enhance the effectiveness of these interventions. We draw on experience in methodology development to date, together with lessons from application in two case study sites in Latin America. Attention is given to the use of a consistent measurement methodology for community resilience to floods over time and place; the challenges of measuring a complex and dynamic phenomenon such as community resilience; the methodological implications of measuring community resilience versus impact on and contribution to this goal; and the use of measurement and tools such as cost-benefit analysis to prioritize and inform strategic decision-making for resilience interventions. The measurement tool follows the five categories of the Sustainable Livelihoods Framework and the 4Rs of complex adaptive systems (robustness, rapidity, redundancy, and resourcefulness): 5C-4R. A recent white paper by the Zurich Flood Resilience Alliance traces the

  12. Observer-Based Fuel Control Using Oxygen Measurement

    DEFF Research Database (Denmark)

    Andersen, Palle; Bendtsen, Jan Dimon; Mortensen, Jan Henrik

    This report describes an attempt to improve the existing control of coal mills used at the Danish power plant Nordjyllandsværket Unit 3. The coal mills are not equipped with coal flow sensors; thus an observer-based approach is investigated. A nonlinear differential equation model of the boiler is constructed and validated against data obtained at the plant. A Kalman filter based on measurements of the combustion air flow led into the furnace and the oxygen concentration in the flue gas is designed to estimate the actual coal flow. With this estimate, it becomes possible to close an inner loop around the coal flow.
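    The report's abstract gives no equations. As a hedged illustration of the estimator type involved, here is a minimal scalar Kalman filter with a random-walk state model, treating a coal-flow value inferred from air flow and flue-gas oxygen as the noisy measurement (a hypothetical reduction of the real multivariable design):

```python
def kalman_step(x, p, z, q, r):
    """One predict/update cycle of a scalar Kalman filter.
    x, p -- previous state estimate (coal flow, kg/s) and its variance
    z    -- new noisy measurement of the coal flow (here, hypothetically
            inferred from combustion air flow and flue-gas oxygen)
    q, r -- process and measurement noise variances"""
    p = p + q                 # predict: random-walk state model
    k = p / (p + r)           # Kalman gain
    x = x + k * (z - x)       # update state with the innovation
    p = (1 - k) * p           # update variance
    return x, p

x, p = 0.0, 1.0               # deliberately poor initial guess
for z in [10.2, 9.8, 10.1, 10.0]:   # synthetic measurements, kg/s
    x, p = kalman_step(x, p, z, q=0.01, r=0.5)
print(x, p)   # estimate moves toward ~10 kg/s as the variance shrinks
```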

  13. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Science.gov (United States)

    Al-Mohammed, A. H.; Abido, M. A.

    2014-01-01

    This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research. PMID:24701191
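    For the simplest case surveyed, two-end synchronized measurements on a short (lumped-impedance) line, the fault distance follows in closed form from the two voltage/current phasor pairs. A textbook sketch (the survey's algorithms additionally handle distributed parameters, series compensation, and multiterminal lines):

```python
def fault_distance(Vs, Is, Vr, Ir, Z):
    """Per-unit fault distance m from synchronized phasors at both ends
    of a lumped-impedance (short) line: Vs - m*Z*Is = Vr - (1-m)*Z*Ir.
    Textbook two-terminal formula; not any specific algorithm from the
    survey."""
    m = (Vs - Vr + Z * Ir) / (Z * (Is + Ir))
    return m.real

# Synthetic consistency check: place a fault at 30% of the line and
# build both end measurements from the same fault-point voltage.
Z = complex(2.0, 20.0)             # total line impedance, ohms
Vf = complex(60e3, 0.0)            # fault-point voltage
Is, Ir = complex(400, -80), complex(250, -50)
Vs = Vf + 0.3 * Z * Is
Vr = Vf + 0.7 * Z * Ir
print(round(fault_distance(Vs, Is, Vr, Ir, Z), 6))  # 0.3
```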

  14. Fault Location Based on Synchronized Measurements: A Comprehensive Survey

    Directory of Open Access Journals (Sweden)

    A. H. Al-Mohammed

    2014-01-01

    Full Text Available This paper presents a comprehensive survey on transmission and distribution fault location algorithms that utilize synchronized measurements. Algorithms based on two-end synchronized measurements and fault location algorithms on three-terminal and multiterminal lines are reviewed. Series capacitors equipped with metal oxide varistors (MOVs), when set on a transmission line, create certain problems for line fault locators and, therefore, fault location on series-compensated lines is discussed. The paper reports the work carried out on adaptive fault location algorithms aiming at achieving better fault location accuracy. Work associated with fault location on power system networks, although limited, is also summarized. Additionally, the nonstandard high-frequency-related fault location techniques based on wavelet transform are discussed. Finally, the paper highlights the area for future research.

  15. Animal-Based Measures to Assess the Welfare of Extensively Managed Ewes

    Science.gov (United States)

    Hemsworth, Paul; Doyle, Rebecca

    2017-01-01

    Simple Summary The aim of this study was to assess the reliability and practicality of 10 animal-based welfare measures for extensively managed ewes, which were derived from the scientific literature, previous welfare protocols and through consultation with veterinarians and animal welfare scientists. Measures were examined on 100 Merino ewes, which were individually identified and repeatedly examined at mid-pregnancy, mid-lactation and weaning. Body condition score, fleece condition, skin lesions, tail length, dag score and lameness are proposed for on-farm use in welfare assessments of extensive sheep production systems. These six welfare measures, which address the main welfare concerns for extensively managed ewes, can be reliably and feasibly measured in the field. Abstract The reliability and feasibility of 10 animal-based measures of ewe welfare were examined for use in extensive sheep production systems. Measures were: Body condition score (BCS), rumen fill, fleece cleanliness, fleece condition, skin lesions, tail length, dag score, foot-wall integrity, hoof overgrowth and lameness, and all were examined on 100 Merino ewes (aged 2–4 years) during mid-pregnancy, mid-lactation and weaning by a pool of nine trained observers. The measures of BCS, fleece condition, skin lesions, tail length, dag score and lameness were deemed to be reliable and feasible. All had good observer agreement, as determined by the percentage of agreement, Kendall’s coefficient of concordance (W) and Kappa (k) values. When combined, these nutritional and health measures provide a snapshot of the current welfare status of ewes, as well as evidencing previous or potential welfare issues. PMID:29295551

  16. Automated pavement horizontal curve measurement methods based on inertial measurement unit and 3D profiling data

    Directory of Open Access Journals (Sweden)

    Wenting Luo

    2016-04-01

    Full Text Available A pavement horizontal curve is designed to serve as a transition between straight segments, and its presence may cause a series of driving-related safety issues for motorists. As traditional methods for curve geometry investigation are recognized to be time consuming, labor intensive, and inaccurate, this study develops a method that can automatically conduct horizontal curve identification and measurement at the network level. The digital highway data vehicle (DHDV) was utilized for data collection, in which the three Euler angles, driving speed, and acceleration of the survey vehicle were measured with an inertial measurement unit (IMU). The 3D profiling data used for cross-slope calibration were obtained with PaveVision3D Ultra technology at 1 mm resolution. In this study, curve identification was based on the variation of the heading angle, and the curve radius was calculated with the kinematic method, the geometry method, and the lateral acceleration method. In order to verify the accuracy of the three methods, an analysis of variance (ANOVA) test was applied using the control variable of curve radius measured by field test. Based on the measured curve radius, a curve safety analysis model was used to predict the crash rates and safe driving speeds at horizontal curves. Finally, a case study on a 4.35 km road segment demonstrated that the proposed method could efficiently conduct network-level analysis.
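    Of the three radius calculations mentioned, the kinematic method is the most direct: at a given speed, the radius is the speed divided by the yaw (heading) rate. A minimal sketch using IMU-style heading samples (the numbers and function name are illustrative, not from the paper):

```python
import math

def curve_radius_kinematic(speed_mps, heading_deg, times_s):
    """Kinematic horizontal-curve radius: R = v / (d(psi)/dt), where psi
    is the vehicle heading in radians.  Endpoint differencing of IMU
    heading samples only; the paper also applies geometry and
    lateral-acceleration methods plus a 3D cross-slope calibration."""
    dpsi = math.radians(heading_deg[-1] - heading_deg[0])
    dt = times_s[-1] - times_s[0]
    yaw_rate = dpsi / dt                   # rad/s
    return abs(speed_mps / yaw_rate)

# 20 m/s with a 0.1 rad/s yaw rate should give a 200 m radius
r = curve_radius_kinematic(20.0,
                           heading_deg=[0.0, math.degrees(0.2)],
                           times_s=[0.0, 2.0])
print(round(r, 6))  # 200.0
```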

  17. An Improved Dissonance Measure Based on Auditory Memory

    DEFF Research Database (Denmark)

    Jensen, Kristoffer; Hjortkjær, Jens

    2012-01-01

    Dissonance is an important feature in music audio analysis. We present here a dissonance model that accounts for the temporal integration of dissonant events in auditory short-term memory. We compare the memory-based dissonance extracted from musical audio sequences to the responses of human listeners. In a number of tests, the memory model predicts listeners' responses better than traditional dissonance measures.

  18. Biomass burning aerosols characterization from ground based and profiling measurements

    Science.gov (United States)

    Marin, Cristina; Vasilescu, Jeni; Marmureanu, Luminita; Ene, Dragos; Preda, Liliana; Mihailescu, Mona

    2018-04-01

    The study goal is to assess the chemical and optical properties of aerosols present in lofted layers and at the ground. The biomass burning aerosols in low-level layers were evaluated from multi-wavelength lidar measurements, while the chemical composition at the ground was assessed using an Aerosol Chemical Speciation Monitor (ACSM) and an Aethalometer. Classification of aerosol type and specific organic markers were used to explore the potential to sense particles of the same origin at the ground and along profiles.

  19. Pulsed electric field sensor based on original waveform measurement

    International Nuclear Information System (INIS)

    Ma Liang; Wu Wei; Cheng Yinhui; Zhou Hui; Li Baozhong; Li Jinxi; Zhu Meng

    2010-01-01

    The paper introduces the differential and original-waveform measurement principles for pulsed E-fields and develops a pulsed E-field sensor based on original waveform measurement, along with its theoretical correction model. The sensor consists of an antenna, an integrator, an amplifier and driver, an optic-electric/electric-optic conversion module, and a transmission module. Time-domain calibration in a TEM cell indicates that its risetime response is shorter than 1.0 ns and that the output pulse width at 90% of the maximum amplitude is wider than 10.0 μs. The output amplitude of the sensor is linear in the electric field intensity over a dynamic range of 20 dB. The measurement capability can be extended to 10 V/m or 50 kV/m by changing the system's antenna and other related modules. (authors)

  20. Arrester Resistive Current Measuring System Based on Heterogeneous Network

    Science.gov (United States)

    Zhang, Yun Hua; Li, Zai Lin; Yuan, Feng; Hou Pan, Feng; Guo, Zhan Nan; Han, Yue

    2018-03-01

    A Metal Oxide Arrester (MOA) suffers from aging and poor insulation due to long-term impulse voltages and environmental impact, and the value and variation tendency of the resistive current can reflect the health condition of the MOA. Common wired MOA detection requires long cables, which are complicated to operate, while wireless measurement methods face problems of poor data synchronization and instability. Therefore, a novel synchronous measurement system for arrester resistive current based on a heterogeneous network is proposed, which simplifies the calculation process and improves the synchronization, accuracy, and stability of the measuring system. The system combines a LoRa wireless network, a high-speed wireless personal area network, and process-layer communication, and realizes detection of the arrester's working condition. Field test data show that the system has high accuracy, strong anti-interference ability, and good synchronization, playing an important role in ensuring stable operation of the power grid.

  1. Nano-displacement measurement based on virtual pinhole confocal method

    International Nuclear Information System (INIS)

    Li, Long; Kuang, Cuifang; Xue, Yi; Liu, Xu

    2013-01-01

    A virtual pinhole confocal system based on charge-coupled device (CCD) detection and image processing techniques is built to measure axial displacement with 10 nm resolution, preeminent flexibility and excellent robustness when facing spot drifting. Axial displacement of the sample surface is determined by capturing the confocal laser spot using a CCD detector and quantifying the energy collected by programmable virtual pinholes. Experiments indicate an applicable measuring range of 1000 nm (Gaussian fitting r = 0.9902) with a highly linear range of 500 nm (linear fitting r = 0.9993). A concentric subtraction algorithm is introduced to further enhance resolution. Factors affecting measuring precision, sensitivity and signal-to-noise ratio are discussed using theoretical deductions and diffraction simulations. The virtual pinhole technique has promising applications in surface profiling and confocal imaging applications which require easily-customizable pinhole configurations. (paper)

  2. Neurally based measurement and evaluation of environmental noise

    CERN Document Server

    Soeta, Yoshiharu

    2015-01-01

    This book deals with methods of measurement and evaluation of environmental noise based on an auditory neural and brain-oriented model. The model consists of the autocorrelation function (ACF) and the interaural cross-correlation function (IACF) mechanisms for signals arriving at the two ear entrances. Even when the sound pressure level of a noise is only about 35 dBA, people may feel annoyed due to the aspects of sound quality. These aspects can be formulated by the factors extracted from the ACF and IACF. Several examples of measuring environmental noise—from outdoor noise such as that of aircraft, traffic, and trains, and indoor noise such as caused by floor impact, toilets, and air-conditioning—are demonstrated. According to the noise measurement and evaluation, applications for sound design are discussed. This book provides an excellent resource for students, researchers, and practitioners in a wide range of fields, such as the automotive, railway, and electronics industries, and soundscape, architec...

  3. Experimental nonlocality-based randomness generation with nonprojective measurements

    Science.gov (United States)

    Gómez, S.; Mattar, A.; Gómez, E. S.; Cavalcanti, D.; Farías, O. Jiménez; Acín, A.; Lima, G.

    2018-04-01

    We report on an optical setup generating more than one bit of randomness from one entangled bit (i.e., a maximally entangled state of two qubits). The amount of randomness is certified through the observation of Bell nonlocal correlations. To attain this result we implemented a high-purity entanglement source and a nonprojective three-outcome measurement. Our implementation achieves a gain of 27% of randomness as compared with the standard methods using projective measurements. Additionally, we estimate the amount of randomness certified in a one-sided device-independent scenario, through the observation of Einstein-Podolsky-Rosen steering. Our results prove that nonprojective quantum measurements allow extending the limits for nonlocality-based certified randomness generation using current technology.

  4. Measurement of unattached radon progeny based on the electrostatic deposition method

    International Nuclear Information System (INIS)

    Canoba, A.C.; Lopez, F.O.

    1999-01-01

    A method for the measurement of unattached radon progeny, based on its electrostatic deposition onto wire screens and using only one pump, has been implemented and calibrated. The ability to use this method is important because of the special radiological significance of the unattached fraction of the short-lived radon progeny: with it, the assessment of exposure can be related to dose with far greater accuracy than before. The advantages of this method are its simplicity, both in the tools needed for sample collection and in the measurement instruments used. Its suitability is further enhanced by the fact that it can be combined with a simple measuring procedure such as the Kusnetz method. (author)

  5. Measurement system for nitrous oxide based on amperometric gas sensor

    Science.gov (United States)

    Siswoyo, S.; Persaud, K. C.; Phillips, V. R.; Sneath, R.

    2017-03-01

    It is well known that nitrous oxide is an important greenhouse gas, so monitoring and controlling its concentration and emission are very important. In this work a nitrous oxide measurement system has been developed, consisting of an amperometric sensor and an appropriate lab-made potentiostat capable of measuring currents in the picoampere range. The sensor was constructed using a gold microelectrode as the working electrode surrounded by a silver wire as a quasi-reference electrode, with tetraethylammonium perchlorate as the supporting electrolyte and dimethylsulphoxide as the solvent. The lab-made potentiostat was built around a transimpedance amplifier capable of picoampere measurements, and incorporated a microcontroller-based data acquisition system controlled by a host personal computer running a dedicated program. The system was capable of detecting N2O concentrations down to 0.07 % v/v.

  6. Evidence conflict measure based on OWA operator in open world.

    Directory of Open Access Journals (Sweden)

    Wen Jiang

    Full Text Available Dempster-Shafer evidence theory has been extensively used in many information fusion systems since it was proposed by Dempster and extended by Shafer. Many studies have been conducted on conflict management in Dempster-Shafer evidence theory in past decades. However, how to determine a potent parameter to measure evidence conflict when the given environment is an open world, namely when the frame of discernment is incomplete, is still an open issue. In this paper, a new method is presented which combines the generalized conflict coefficient, the generalized evidence distance, and the generalized interval correlation coefficient based on the ordered weighted averaging (OWA) operator to measure the conflict of evidence. Through the ordered weighted average of these three parameters, the combined coefficient can still measure the conflict effectively when one or two parameters are not valid. Several numerical examples demonstrate the effectiveness of the proposed method.
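
    The core of the combination step is the ordered weighted averaging operator: weights attach to ranks rather than to individual parameters, so a single invalid parameter cannot dominate the result. A minimal sketch follows; the parameter values and the weight vector are made-up illustrations, not those of the paper.

```python
def owa(values, weights):
    """Ordered weighted averaging: weights attach to ranks, not to sources,
    so they are applied after sorting the arguments in descending order."""
    assert len(values) == len(weights)
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Three illustrative conflict parameters in [0, 1]; the third is invalid (zero),
# yet the OWA combination still yields a usable overall conflict degree.
conflict_coeff, evidence_distance, interval_corr = 0.8, 0.6, 0.0
combined = owa([conflict_coeff, evidence_distance, interval_corr],
               [0.5, 0.3, 0.2])  # 0.5*0.8 + 0.3*0.6 + 0.2*0.0 = 0.58
```

    Choosing the weight vector [1, 0, ..., 0] recovers the maximum operator, and uniform weights recover the plain average, which is why OWA interpolates between "or-like" and "and-like" aggregation.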

  7. Beam based measurement of beam position monitor electrode gains

    Directory of Open Access Journals (Sweden)

    D. L. Rubin

    2010-09-01

    Full Text Available Low emittance tuning at the Cornell Electron Storage Ring (CESR) test accelerator depends on precision measurement of vertical dispersion and transverse coupling. The CESR beam position monitors (BPMs) consist of four button electrodes, instrumented with electronics that allow acquisition of turn-by-turn data. The response to the beam will vary among the four electrodes due to differences in electronic gain and/or misalignment. This variation in the response of the BPM electrodes will couple real horizontal offset to apparent vertical position, and introduce spurious measurements of coupling and vertical dispersion. To alleviate this systematic effect, a beam based technique to measure the relative response of the four electrodes has been developed. With typical CESR parameters, simulations show that turn-by-turn BPM data can be used to determine electrode gains to within ∼0.1%.

  8. Beam based measurement of beam position monitor electrode gains

    Science.gov (United States)

    Rubin, D. L.; Billing, M.; Meller, R.; Palmer, M.; Rendina, M.; Rider, N.; Sagan, D.; Shanks, J.; Strohman, C.

    2010-09-01

    Low emittance tuning at the Cornell Electron Storage Ring (CESR) test accelerator depends on precision measurement of vertical dispersion and transverse coupling. The CESR beam position monitors (BPMs) consist of four button electrodes, instrumented with electronics that allow acquisition of turn-by-turn data. The response to the beam will vary among the four electrodes due to differences in electronic gain and/or misalignment. This variation in the response of the BPM electrodes will couple real horizontal offset to apparent vertical position, and introduce spurious measurements of coupling and vertical dispersion. To alleviate this systematic effect, a beam based technique to measure the relative response of the four electrodes has been developed. With typical CESR parameters, simulations show that turn-by-turn BPM data can be used to determine electrode gains to within ˜0.1%.

  9. Soil-Carbon Measurement System Based on Inelastic Neutron Scattering

    International Nuclear Information System (INIS)

    Orion, I.; Wielopolski, L.

    2002-01-01

    Increase in the atmospheric CO2 is associated with a concurrent increase in the amount of carbon sequestered in the soil. For better understanding of the carbon cycle it is imperative to establish a better and more extensive database of the carbon concentrations in various soil types, in order to develop improved models for changes in the global climate. Non-invasive soil carbon measurement is based on Inelastic Neutron Scattering (INS). This method has been used successfully to measure total body carbon in human beings. The system consists of a pulsed neutron generator based on the D-T reaction, which produces 14 MeV neutrons, a neutron flux monitoring detector and a couple of large NaI(Tl) spectrometers, 6 in. diameter by 6 in. high [4]. The threshold energy for the INS reaction in carbon is 4.8 MeV. Following INS of 14 MeV neutrons in carbon, 4.44 MeV photons are emitted and counted during a gate pulse period of 10 μs. The repetition rate of the neutron generator is 10⁴ pulses per second. The gamma spectra are acquired only during the neutron generator gate pulses. The INS method for soil carbon content measurements provides a non-destructive, non-invasive tool, which can be optimized in order to develop a system for in-field measurements

  10. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian Pearl is the most valuable export product of French Polynesia, contributing over 61 million Euros and more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl deemed for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyze X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely the large shape variety and the occurrence of cavities, have so far not been considered. The presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a purpose-built heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the two-dimensional nacre thickness profile can be calculated. A certainty measure that accounts for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to automatically evaluate the nacre thickness of Tahitian pearls with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.

  11. Measurements of Electromagnetic Fields Emitted from Cellular Base Stations in

    Directory of Open Access Journals (Sweden)

    K. J. Ali

    2013-05-01

    Full Text Available With the increasing use of mobile communication devices and Internet services, private telecommunications companies have been entering Iraq since 2003. These companies began building cellular towers to carry out telecommunication work, but often ignored the safety conditions imposed to protect health and the environment, siting the towers in a haphazard way; the radiation emitted may pose a health risk to living beings and pollute the environment. The aim of this work is to determine the safe and unsafe ranges around Asia Cell base stations in Shirqat city, to assess the damage caused by their emitted radiation, and to discuss the best ways to minimize exposure levels and avoid negative health effects. Practical measurements of power density around base stations were accomplished using a radiation survey meter (Radio Frequency EMF Strength Meter 480846) in two ways. The first set of measurements was made at a height of 2 meters above ground at distances from 0 to 300 meters from the station. The second was made at a distance of 150 meters at heights from 2 to 15 meters above ground level. The maximum measured power density was about 3 mW/m2. Results indicate that the power density levels are far below the RF radiation exposure limits of the USSR safety standards, which means these cellular base stations do not cause negative health effects for living beings provided exposure stays within the accepted international standard levels.

  12. Smartphone based hemispherical photography for canopy structure measurement

    Science.gov (United States)

    Wan, Xuefen; Cui, Jian; Jiang, Xueqin; Zhang, Jingwen; Yang, Yi; Zheng, Tao

    2018-01-01

    The canopy is the most direct and active interface layer of the interaction between plant and environment, and has an important influence on energy exchange, biodiversity, ecosystem matter and climate change. Measurement of plant canopy structure is an important foundation for analyzing the pattern, process and operation mechanism of forest ecosystems. Through the study of plant canopy structure, forest environmental climate characteristics such as solar radiation, ambient wind speed, air temperature and humidity, soil evaporation and soil temperature can be evaluated. Because of its accuracy and effectiveness, canopy structure measurement based on hemispherical photography has been widely studied. However, traditional hemispherical photogrammetry of canopy structure is based on an SLR camera and fisheye lens, which is expensive and difficult to use in low-cost settings. In recent years, smartphone technology has been developing rapidly. The smartphone not only has excellent image acquisition ability, but also considerable computational processing ability. In addition, the gyroscope and positioning functions of the smartphone also help to measure the structure of the canopy. In this paper, we present a smartphone based hemispherical photography system. The system consists of a smartphone, a low-cost fisheye lens and a PMMA adapter. We designed an Android based App to obtain canopy hemisphere images through the low-cost fisheye lens and provide horizontal collimation information. In addition, after acquisition the App adds to the hemisphere image information an acquisition location tag obtained by GPS and auxiliary positioning methods. The system was tested in an urban forest after it was completed. The test results show that the smartphone based hemispherical photography system can effectively collect high-resolution canopy structure images of plants.
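
    A basic quantity derived from a hemispherical canopy image is the gap fraction (canopy openness): the share of sky pixels inside the circular fisheye footprint. The thresholding sketch below is a generic illustration of that computation, not the App's actual algorithm.

```python
import numpy as np

def gap_fraction(gray, threshold=0.5):
    """Fraction of sky (bright) pixels inside the circular fisheye footprint
    of a hemispherical canopy image; gray values are assumed in [0, 1]."""
    h, w = gray.shape
    yy, xx = np.indices((h, w))
    cy, cx, r = (h - 1) / 2, (w - 1) / 2, min(h, w) / 2
    inside = (yy - cy)**2 + (xx - cx)**2 <= r**2
    return (gray[inside] > threshold).mean()

# Synthetic test image: bright (sky) top half, dark (canopy) bottom half.
img = np.zeros((101, 101))
img[:50, :] = 1.0
openness = gap_fraction(img)
```

    Real pipelines add an exposure-robust threshold (e.g. Otsu) and split the footprint into zenith-angle rings to estimate leaf area index, but the footprint masking step stays the same.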

  13. A Feature-Based Structural Measure: An Image Similarity Measure for Face Recognition

    Directory of Open Access Journals (Sweden)

    Noor Abdalrazak Shnain

    2017-08-01

    Full Text Available Facial recognition is one of the most challenging and interesting problems within the field of computer vision and pattern recognition. During the last few years, it has gained special attention due to its importance in relation to current issues such as security, surveillance systems and forensics analysis. Despite this high level of attention to facial recognition, the success is still limited by certain conditions; there is no method which gives reliable results in all situations. In this paper, we propose an efficient similarity index that resolves the shortcomings of the existing measures of feature and structural similarity. This measure, called the Feature-Based Structural Measure (FSM), combines the best features of the well-known SSIM (structural similarity index measure) and FSIM (feature similarity index measure) approaches, striking a balance between performance for similar and dissimilar images of human faces. In addition to the statistical structural properties provided by SSIM, edge detection is incorporated in FSM as a distinctive structural feature. Its performance is tested for a wide range of PSNR (peak signal-to-noise ratio), using ORL (Olivetti Research Laboratory, now AT&T Laboratory Cambridge) and FEI (Faculty of Industrial Engineering, São Bernardo do Campo, São Paulo, Brazil) databases. The proposed measure is tested under conditions of Gaussian noise; simulation results show that the proposed FSM outperforms the well-known SSIM and FSIM approaches in its efficiency of similarity detection and recognition of human faces.
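
    The abstract describes FSM as a blend of SSIM-style structural statistics with an explicit edge feature. The sketch below illustrates only that blending idea: it uses a single-window SSIM and a gradient-magnitude similarity as stand-ins, not the published FSM formulas.

```python
import numpy as np

def global_ssim(x, y, c1=1e-4, c2=9e-4):
    """SSIM computed from global image statistics (the published index
    averages local windows; one global window keeps the sketch short)."""
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (x.var() + y.var() + c2))

def edge_similarity(x, y, eps=1e-12):
    """Similarity of gradient-magnitude maps, a crude edge/structure feature."""
    gx = np.hypot(*np.gradient(x))
    gy = np.hypot(*np.gradient(y))
    return (2 * gx * gy + eps).sum() / (gx**2 + gy**2 + eps).sum()

def fsm_like(x, y, alpha=0.5):
    """Weighted blend of a structural term and an edge term."""
    return alpha * global_ssim(x, y) + (1 - alpha) * edge_similarity(x, y)

rng = np.random.default_rng(0)
face = rng.random((32, 32))
noisy = np.clip(face + rng.normal(0, 0.2, face.shape), 0, 1)
```

    By construction the blend equals 1 for identical images and decreases as either the statistics or the edges diverge, which is the balance between similar and dissimilar faces the abstract aims at.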

  14. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics, based on finitely correlated or projected entangled pair states, to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  15. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics, based on finitely correlated or projected entangled pair states, to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  16. Developing safety performance functions incorporating reliability-based risk measures.

    Science.gov (United States)

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
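
    The probability of non-compliance can be estimated from the limit state g = supply sight distance − demand (stopping) sight distance, with P(nc) = P(g < 0). The paper evaluates this with FORM; the Monte Carlo sketch below, with invented parameter distributions, only illustrates the limit-state idea.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Illustrative input distributions (assumptions, not taken from the paper):
speed = rng.normal(90, 10, n)              # operating speed, km/h
t_pr = rng.lognormal(np.log(1.5), 0.3, n)  # perception-reaction time, s
decel = rng.normal(3.4, 0.6, n)            # deceleration rate, m/s^2
v = speed / 3.6                            # convert to m/s

demand = v * t_pr + v**2 / (2 * decel)     # stopping (demand) sight distance, m
supply = 160.0                             # available (supply) sight distance, m

# Probability of non-compliance: demand exceeds supply on this curve.
p_nc = np.mean(demand > supply)
```

    A value of P(nc) per curve can then enter a negative binomial safety performance function as an additional covariate, which is the link the paper establishes.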

  17. Assessment of Eye Fatigue Caused by 3D Displays Based on Multimodal Measurements

    Science.gov (United States)

    Bang, Jae Won; Heo, Hwan; Choi, Jong-Suk; Park, Kang Ryoung

    2014-01-01

    With the development of 3D displays, user's eye fatigue has been an important issue when viewing these displays. There have been previous studies conducted on eye fatigue related to 3D display use; however, most of these have employed a limited number of modalities for measurements, such as electroencephalograms (EEGs), biomedical signals, and eye responses. In this paper, we propose a new assessment of eye fatigue related to 3D display use based on multimodal measurements. Compared to previous works, our research is novel in the following four ways: first, to enhance the accuracy of assessment of eye fatigue, we measure EEG signals, eye blinking rate (BR), facial temperature (FT), and a subjective evaluation (SE) score before and after a user watches a 3D display; second, in order to accurately measure BR in a manner that is convenient for the user, we implement a remote gaze-tracking system using a high speed (mega-pixel) camera that measures eye blinks of both eyes; third, changes in the FT are measured using a remote thermal camera, which can enhance the measurement of eye fatigue; and fourth, we perform various statistical analyses to evaluate the correlation between the EEG signal, eye BR, FT, and the SE score based on the T-test, correlation matrix, and effect size. Results show that the correlation of the SE with the other data (FT, BR, and EEG) is the highest, while those of the FT, BR, and EEG with other data are second, third, and fourth highest, respectively. PMID:25192315

  18. Assessment of Eye Fatigue Caused by 3D Displays Based on Multimodal Measurements

    Directory of Open Access Journals (Sweden)

    Jae Won Bang

    2014-09-01

    Full Text Available With the development of 3D displays, user’s eye fatigue has been an important issue when viewing these displays. There have been previous studies conducted on eye fatigue related to 3D display use; however, most of these have employed a limited number of modalities for measurements, such as electroencephalograms (EEGs), biomedical signals, and eye responses. In this paper, we propose a new assessment of eye fatigue related to 3D display use based on multimodal measurements. Compared to previous works, our research is novel in the following four ways: first, to enhance the accuracy of assessment of eye fatigue, we measure EEG signals, eye blinking rate (BR), facial temperature (FT), and a subjective evaluation (SE) score before and after a user watches a 3D display; second, in order to accurately measure BR in a manner that is convenient for the user, we implement a remote gaze-tracking system using a high speed (mega-pixel) camera that measures eye blinks of both eyes; third, changes in the FT are measured using a remote thermal camera, which can enhance the measurement of eye fatigue; and fourth, we perform various statistical analyses to evaluate the correlation between the EEG signal, eye BR, FT, and the SE score based on the T-test, correlation matrix, and effect size. Results show that the correlation of the SE with the other data (FT, BR, and EEG) is the highest, while those of the FT, BR, and EEG with other data are second, third, and fourth highest, respectively.

  19. New Genome Similarity Measures based on Conserved Gene Adjacencies.

    Science.gov (United States)

    Doerr, Daniel; Kowada, Luis Antonio B; Araujo, Eloi; Deshpande, Shachi; Dantas, Simone; Moret, Bernard M E; Stoye, Jens

    2017-06-01

    Many important questions in molecular biology, evolution, and biomedicine can be addressed by comparative genomic approaches. One of the basic tasks when comparing genomes is the definition of measures of similarity (or dissimilarity) between two genomes, for example, to elucidate the phylogenetic relationships between species. The power of different genome comparison methods varies with the underlying formal model of a genome. The simplest models impose the strong restriction that each genome under study must contain the same genes, each in exactly one copy. More realistic models allow several copies of a gene in a genome. One speaks of gene families, and comparative genomic methods that allow this kind of input are called gene family-based. The most powerful, but also most complex, models avoid this preprocessing of the input data and instead integrate the family assignment within the comparative analysis. Such methods are called gene family-free. In this article, we study an intermediate approach between family-based and family-free genomic similarity measures. Introducing this simpler model, called gene connections, we focus on the combinatorial aspects of gene family-free genome comparison. While in most cases the computational costs are the same as in the general family-free case, we also find an instance where the gene connections model has lower complexity. Within the gene connections model, we define three variants of genomic similarity measures that have different expression powers. We give polynomial-time algorithms for two of them, while we show NP-hardness for the third, most powerful one. We also generalize the measures and algorithms to make them more robust against recent local disruptions in gene order. Our theoretical findings are supported by experimental results, proving the applicability and performance of our newly defined similarity measures.
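
    In the simplest single-copy model, a natural adjacency-based similarity counts the unordered neighbouring gene pairs two genomes share. The sketch below illustrates this baseline idea; the normalisation choice is an assumption for illustration, not one of the paper's three gene-connections measures.

```python
def conserved_adjacencies(genome_a, genome_b):
    """Count unordered gene adjacencies shared by two genomes, assuming the
    simple model where every gene occurs in exactly one copy."""
    adjacencies = lambda g: {frozenset(pair) for pair in zip(g, g[1:])}
    return len(adjacencies(genome_a) & adjacencies(genome_b))

def adjacency_similarity(genome_a, genome_b):
    """Normalise by the adjacency count of the shorter genome."""
    denom = min(len(genome_a), len(genome_b)) - 1
    return conserved_adjacencies(genome_a, genome_b) / denom

a = ["g1", "g2", "g3", "g4", "g5"]
b = ["g3", "g2", "g1", "g4", "g5"]  # inversion of the segment g1..g3
score = adjacency_similarity(a, b)
```

    Note that the inversion preserves the internal adjacencies g1-g2 and g2-g3 (adjacencies are unordered), breaking only g3-g4, which is why short rearrangements change such a measure gracefully.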

  20. Wind turbine transformer admittance characterization based on online time-domain measurements and preliminary results from measurements done in two transformers using a SFRA

    DEFF Research Database (Denmark)

    Arana Aristi, Iván; Holbøll, Joachim; Nielsen, Arne Hejde

    2009-01-01

    This paper presents the analysis of online time-domain measurements on the primary and secondary side of a wind turbine transformer in an Offshore Wind Farm (OWF) during one switching operation realized in the collection grid. The frequency characteristics up to 10 kHz of the current and voltage signals of each phase were compared, and the transformer's admittance characteristic was estimated based on these measurements. Based on the results of this analysis, it was decided to acquire a Sweep Frequency Response Analyzer (SFRA) to realize detailed transformer measurements. First, the results from measurements in a small dry-type transformer under laboratory conditions are presented, and finally the results from a large transformer measured in an industrial setting are shown.

  1. Covariance-Based Measurement Selection Criterion for Gaussian-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Fernando A. Auat Cheein

    2013-01-01

    Full Text Available Process modeling by means of Gaussian-based algorithms often suffers from redundant information which usually increases the estimation computational complexity without significantly improving the estimation performance. In this article, a non-arbitrary measurement selection criterion for Gaussian-based algorithms is proposed. The measurement selection criterion is based on the determination of the most significant measurement from both an estimation convergence perspective and the covariance matrix associated with the measurement. The selection criterion is independent from the nature of the measured variable. This criterion is used in conjunction with three Gaussian-based algorithms: the EIF (Extended Information Filter), the EKF (Extended Kalman Filter) and the UKF (Unscented Kalman Filter). Nevertheless, the measurement selection criterion shown herein can also be applied to other Gaussian-based algorithms. Although this work is focused on environment modeling, the results shown herein can be applied to other Gaussian-based algorithm implementations. Mathematical descriptions and implementation results that validate the proposal are also included in this work.
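
    The selection idea can be illustrated with a standard Kalman update: for each candidate measurement model (H, R), evaluate the trace of the posterior covariance and keep the candidate that shrinks it most. This is a generic sketch of covariance-based selection, not the authors' exact criterion (which also weighs estimation convergence).

```python
import numpy as np

def posterior_trace(P, H, R):
    """Trace of the state covariance after a Kalman update with model (H, R)."""
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    return np.trace((np.eye(P.shape[0]) - K @ H) @ P)

def select_measurement(P, candidates):
    """Index of the candidate (H, R) yielding the smallest posterior trace."""
    return min(range(len(candidates)),
               key=lambda i: posterior_trace(P, *candidates[i]))

P = np.diag([4.0, 1.0])  # prior covariance: state 0 is the uncertain one
candidates = [
    (np.array([[1.0, 0.0]]), np.array([[0.5]])),  # sensor observing state 0
    (np.array([[0.0, 1.0]]), np.array([[0.5]])),  # sensor observing state 1
]
best = select_measurement(P, candidates)
```

    With equal sensor noise, the sensor observing the more uncertain state wins, matching the intuition that redundant measurements of well-known states add cost without improving the estimate.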

  2. Measurement of radiation dose with a PC-based instrument

    International Nuclear Information System (INIS)

    Jangland, L.; Neubeck, R.

    1994-01-01

    The purpose of this study was to investigate in what way the introduction of Digital Subtraction Angiography has influenced absorbed doses to the patient and personnel. Calculation of the energy imparted to the patient, ε, was based on measurements of the dose-area product, tube potential and tube current, which were registered with a PC-based instrument. The absorbed doses to the personnel were measured with TLD. The measurements on the personnel were made only at the digital system. The results indicate large variations in ε between different angiographic examinations of the same type. The total ε was similar on both systems, although the relative contributions from image acquisition and fluoroscopy were different. At the conventional system, fluoroscopy and image acquisition contributed almost equally to the total ε. At the digital system, 25% of the total ε was due to fluoroscopy and 75% to image acquisition. The differences were due to longer fluoroscopic times on the conventional system, mainly caused by the lack of image memory and road mapping, and to a lower ε per image, due to lower dose settings for the film changer compared to the image intensifier on the digital system. 11 refs., 8 figs., 9 tabs

  3. IMU-Based Joint Angle Measurement for Gait Analysis

    Directory of Open Access Journals (Sweden)

    Thomas Seel

    2014-04-01

    Full Text Available This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.
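
    A common way to obtain a drift-free flexion/extension angle from gyroscope and accelerometer data is a complementary filter: integrate the angular rate for short-term accuracy and pull the estimate toward the accelerometer-derived angle for long-term stability. The one-dimensional sketch below, with synthetic data, is a generic illustration of that fusion, not the authors' method.

```python
import numpy as np

def complementary_angle(gyro_rate, acc_angle, dt=0.01, alpha=0.98):
    """1-D sketch: integrate the gyroscope rate and continuously correct the
    drift with the (noisy but unbiased) accelerometer-derived angle."""
    angle = np.empty_like(acc_angle)
    est = acc_angle[0]
    for k, (w, a) in enumerate(zip(gyro_rate, acc_angle)):
        est = alpha * (est + w * dt) + (1 - alpha) * a
        angle[k] = est
    return angle

# Synthetic trial: a 30-degree flexion ramp over 10 s, with a biased gyroscope
# and a noisy accelerometer-derived angle.
t = np.arange(0, 10, 0.01)
true_angle = 30 * t / 10
rng = np.random.default_rng(1)
gyro = np.gradient(true_angle, t) + 0.5        # deg/s, constant 0.5 deg/s bias
acc = true_angle + rng.normal(0, 2.0, t.size)  # deg, zero-mean noise
fused = complementary_angle(gyro, acc)
drift_only = np.cumsum(gyro) * 0.01            # pure integration drifts away
```

    The fused estimate keeps the smoothness of the gyroscope while bounding the bias-induced drift, which is the property that makes such sensor fusion usable over whole gait trials.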

  4. Noninvasive blood pressure measurement scheme based on optical fiber sensor

    Science.gov (United States)

    Liu, Xianxuan; Yuan, Xueguang; Zhang, Yangan

    2016-10-01

    Optical fiber sensing has many advantages, such as small size, light weight, low loss and strong immunity to interference. Since the invention of optical fiber sensing technology in 1977, it has been applied in the military, national defense, aerospace, industrial, medical and other fields, and has made a great contribution to parameter measurement in constrained environments. With the rapid development of computers and network systems, intelligent optical fiber sensing technology, combining sensor technology with computer and communication technology, allows detection, diagnosis and analysis to be completed automatically and efficiently. In this work, we propose a noninvasive blood pressure detection and analysis scheme that uses an optical fiber sensor. An optical fiber sensing system mainly includes the light source, optical fiber, optical detector, optical modulator, signal processing module and so on. Optical signals of selected wavelengths were launched into the optical fiber sensor, and the signals reflected by the body surface were detected. By comparing the measured data with data obtained by the traditional way of measuring blood pressure, we can establish models for predicting blood pressure and achieve noninvasive measurement using spectrum analysis. Blood pressure measurement based on an optical fiber sensing system is faster and more convenient than the traditional way, and can produce accurate analysis results in a shorter time, efficiently reducing time and manpower costs.

  5. Evaluation of a titanium dioxide-based DGT technique for measuring inorganic uranium species in fresh and marine waters

    DEFF Research Database (Denmark)

    Hutchins, Colin M.; Panther, Jared G.; Teasdale, Peter R.

    2012-01-01

    A new diffusive gradients in a thin film (DGT) technique for measuring dissolved uranium (U) in freshwater is reported. The new method utilises a previously described binding phase, Metsorb (a titanium dioxide based adsorbent). This binding phase was evaluated and compared to the well-established...

  6. Examining the Perceived Value of Integration of Earned Value Management with Risk Management-Based Performance Measurement Baseline

    Science.gov (United States)

    Shah, Akhtar H.

    2014-01-01

    Many projects fail despite the use of evidence-based project management practices such as Performance Measurement Baseline (PMB), Earned Value Management (EVM) and Risk Management (RM). Although previous researchers have found that integrated project management techniques could be more valuable than the same techniques used by themselves, these…

  7. Implementing nonprojective measurements via linear optics: An approach based on optimal quantum-state discrimination

    International Nuclear Information System (INIS)

    Loock, Peter van; Nemoto, Kae; Munro, William J.; Raynal, Philippe; Luetkenhaus, Norbert

    2006-01-01

    We discuss the problem of implementing generalized measurements [positive operator-valued measures (POVMs)] with linear optics, either based upon a static linear array or including conditional dynamics. In our approach, a given POVM shall be identified as a solution to an optimization problem for a chosen cost function. We formulate a general principle: the implementation is only possible if a linear-optics circuit exists for which the quantum mechanical optimum (minimum) is still attainable after dephasing the corresponding quantum states. The general principle enables us, for instance, to derive a set of necessary conditions for the linear-optics implementation of the POVM that realizes the quantum mechanically optimal unambiguous discrimination of two pure nonorthogonal states. This extends our previous results on projection measurements and the exact discrimination of orthogonal states
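
For reference, a POVM is any set of operators satisfying positivity and completeness, and for the optimal unambiguous discrimination of two equiprobable pure states the well-known Ivanovic-Dieks-Peres bound gives the minimum failure probability:

```latex
E_k \ge 0, \qquad \sum_k E_k = \mathbb{I},
\qquad\qquad
P_{\mathrm{fail}}^{\min} = |\langle \psi_1 | \psi_2 \rangle| .
```

The bound as written assumes equal prior probabilities for the two states.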

  8. WSN-Based Space Charge Density Measurement System.

    Science.gov (United States)

    Deng, Dawei; Yuan, Haiwen; Lv, Jianxun; Ju, Yong

    2017-01-01

    It is generally acknowledged that high voltage direct current (HVDC) transmission lines cover large areas, which makes cable-based space charge density monitoring systems inconvenient to deploy. Compared with traditional communication networks, a wireless sensor network (WSN) offers small volume, high flexibility and strong self-organization, and thus shows great potential for solving this problem. In addition, WSN is more suitable for building a distributed space charge density monitoring system because it supports longer distances and higher mobility. A distributed wireless system is designed for collecting and monitoring the space charge density under HVDC transmission lines, and it has been widely applied in both the Chinese state grid HVDC test base and power transmission projects. Experimental results demonstrate the system's adaptability to the complex electromagnetic environment under the transmission lines and its ability to meet the accuracy, flexibility and stability demands of space charge density measurement.

  9. Radiotomography Based on Monostatic Interference Measurements with Controlled Oscillator

    Directory of Open Access Journals (Sweden)

    Sukhanov Dmitry

    2016-01-01

    Full Text Available A method of three-dimensional tomography based on radioholographic measurements, with the reference signal transmitted in the near zone of both the transmitter and the receiver, is presented. We solve the problem of recovering the object signal phase from the near-field reference signal over a wide frequency band by considering analytic signals. Results are presented from experimental studies applying a tunable YIG (yttrium iron garnet) oscillator in the frequency range from 6.5 to 10.7 GHz to radio tomography of metal objects in air. The holographic principle is applied by measuring the interference field amplitude with a detector diode; the interference occurs between the direct wave and the waves scattered by the object. To reconstruct the radio images, aperture synthesis and extraction of quadrature components at all sensing frequencies are applied. An experimental study on a test object shows a resolution of about 15 mm.

  10. Systems-Based Aspects in the Training of IMG or Previously Trained Residents: Comparison of Psychiatry Residency Training in the United States, Canada, the United Kingdom, India, and Nigeria

    Science.gov (United States)

    Jain, Gaurav; Mazhar, Mir Nadeem; Uga, Aghaegbulam; Punwani, Manisha; Broquet, Karen E.

    2012-01-01

    Objectives: International medical graduates (IMGs) account for a significant proportion of residents in psychiatric training in the United States. Many IMGs may have previously completed psychiatry residency training in other countries. Their experiences may improve our system. Authors compared and contrasted psychiatry residency training in the…

  11. Measuring party nationalisation: A new Gini-based indicator that corrects for the number of units

    DEFF Research Database (Denmark)

    Bochsler, Daniel

    2010-01-01

    The study of the territorial distribution of votes in elections has become an important field of political party research in recent years. Quantitative studies on the homogeneity of votes and turnout employ different indicators of territorial variance, but despite important progress in measurement, many of them are sensitive to the size and number of political parties or electoral districts. This article proposes a new 'standardised party nationalisation score', which is based on the Gini coefficient of inequalities in distribution. Different from previous indicators, the standardised party...

  12. Optical character recognition based on nonredundant correlation measurements.

    Science.gov (United States)

    Braunecker, B; Hauck, R; Lohmann, A W

    1979-08-15

    The essence of character recognition is a comparison between the unknown character and a set of reference patterns. Usually, these reference patterns are all possible characters themselves, the whole alphabet in the case of letter characters. Obviously, N analog measurements are highly redundant, since only K = log₂N binary decisions are enough to identify one out of N characters. Therefore, we devised K reference patterns accordingly. These patterns, called principal components, are found by digital image processing, but used in an optical analog computer. We will explain the concept of principal components, and we will describe experiments with several optical character recognition systems, based on this concept.
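
The counting argument can be illustrated with a toy example in which the K binary decisions are signs of correlation measurements. The masks below are a hypothetical stand-in built from the characters' index bits, not the paper's optically derived principal components.

```python
import numpy as np
from math import ceil, log2

N = 8                       # alphabet size (toy: one-hot "characters")
K = ceil(log2(N))           # binary decisions needed: 3
chars = np.eye(N)           # hypothetical orthogonal reference patterns

# mask k responds positively to exactly those characters whose index
# has bit k set, so the sign of each correlation yields one bit
masks = np.array([[1.0 if (i >> k) & 1 else -1.0 for i in range(N)]
                  for k in range(K)])

def classify(image):
    bits = (masks @ image) > 0          # K binary correlation measurements
    return sum(int(b) << k for k, b in enumerate(bits))

assert all(classify(chars[i]) == i for i in range(N))
print(K)   # 3 measurements instead of N = 8
```

The point is only the counting: K sign decisions suffice where N analog correlations would be redundant.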

  13. MASS MEASUREMENTS OF ISOLATED OBJECTS FROM SPACE-BASED MICROLENSING

    DEFF Research Database (Denmark)

    Zhu, Wei; Novati, S. Calchi; Gould, A.

    2016-01-01

    lies behind the same amount of dust as the Bulge red clump, we find the lens is a 45 ± 7 M_J brown dwarf at 5.9 ± 1.0 kpc. The lens of the second event, OGLE-2015-BLG-0763, is a 0.50 ± 0.04 M_⊙ star at 6.9 ± 1.0 kpc. We show that the probability to definitively measure the mass of isolated microlenses...... is dramatically increased once simultaneous ground- and space-based observations are conducted....

  14. FLEXIBLE PH SENSOR WITH POLYANILINE LAYER BASED ON IMPEDANCE MEASUREMENT

    OpenAIRE

    Chuang, Cheng-Hsin; Wu, Hsun-Pei; Chen, Cheng-Ho; Wu, Peng-Rong

    2012-01-01

    A flexible sensor with conducting polyaniline layer for detecting pH value based on the impedance measurement is fabricated and demonstrated in this study. The pH sensor consists of an interdigital electrode array on a flexible printed circuit and a thin-film polyaniline as the sensing layer. As the conductivity of polyaniline depends on the redox state, the impedance change of the polyaniline after it has reacted with different pH value solutions works as the sensing mechanism. In order to o...

  15. Model-based cartilage thickness measurement in the submillimeter range

    International Nuclear Information System (INIS)

    Streekstra, G. J.; Strackee, S. D.; Maas, M.; Wee, R. ter; Venema, H. W.

    2007-01-01

    Current methods of image-based thickness measurement in thin sheet structures utilize second derivative zero crossings to locate the layer boundaries. It is generally acknowledged that the nonzero width of the point spread function (PSF) limits the accuracy of this measurement procedure. We propose a model-based method that strongly reduces PSF-induced bias by incorporating the PSF into the thickness estimation method. We estimated the bias in thickness measurements in simulated thin sheet images as obtained from second derivative zero crossings. To gain insight into the range of sheet thickness where our method is expected to yield improved results, sheet thickness was varied between 0.15 and 1.2 mm with an assumed PSF as present in the high-resolution modes of current computed tomography (CT) scanners [full width at half maximum (FWHM) 0.5-0.8 mm]. Our model-based method was evaluated in practice by measuring layer thickness from CT images of a phantom mimicking two parallel cartilage layers in an arthrography procedure. CT arthrography images of cadaver wrists were also evaluated, and thickness estimates were compared to those obtained from high-resolution anatomical sections that served as a reference. The thickness estimates from the simulated images reveal that the method based on second derivative zero crossings shows considerable bias for layers in the submillimeter range. This bias is negligible for sheet thickness larger than 1 mm, where the size of the sheet is more than twice the FWHM of the PSF but can be as large as 0.2 mm for a 0.5 mm sheet. The results of the phantom experiments show that the bias is effectively reduced by our method. The deviations from the true thickness, due to random fluctuations induced by quantum noise in the CT images, are of the order of 3% for a standard wrist imaging protocol. In the wrist the submillimeter thickness estimates from the CT arthrography images correspond within 10% to those estimated from the anatomical
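
The reported bias behavior can be reproduced numerically by blurring an ideal rectangular sheet profile with a Gaussian PSF and locating the second-derivative zero crossings. This is a sketch under the stated FWHM assumptions, not the authors' code; all lengths are in millimeters.

```python
import numpy as np

def measured_thickness(t, fwhm, x_max=3.0, dx=1e-4):
    """Thickness recovered from second-derivative zero crossings of a
    rectangular sheet profile blurred by a Gaussian PSF (all in mm)."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    a = t / 2.0
    x = np.arange(0.0, x_max, dx)
    g = lambda u: np.exp(-u**2 / (2.0 * sigma**2))
    # analytic second derivative of the blurred profile (up to a constant)
    hpp = -((x + a) * g(x + a) - (x - a) * g(x - a))
    idx = np.where(np.diff(np.sign(hpp)) != 0)[0]   # first crossing, x > 0
    return 2.0 * x[idx[0]]

thick = measured_thickness(1.2, 0.6)   # sheet >> PSF: nearly unbiased
thin = measured_thickness(0.5, 0.6)    # submillimeter sheet: overestimated
print(round(thick, 3), round(thin, 3))
```

With a 0.6 mm FWHM the 1.2 mm sheet is recovered almost exactly, while the 0.5 mm sheet is overestimated by roughly 0.1 mm, matching the order of bias reported in the abstract.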

  16. Laser-Based Diagnostic Measurements of Low Emissions Combustor Concepts

    Science.gov (United States)

    Hicks, Yolanda R.

    2011-01-01

    This presentation provides a summary of primarily laser-based measurement techniques we use at NASA Glenn Research Center to characterize fuel injection, fuel/air mixing, and combustion. The report highlights using Planar Laser-Induced Fluorescence, Particle Image Velocimetry, and Phase Doppler Interferometry to obtain fuel injector patternation, fuel and air velocities, and fuel drop sizes and turbulence intensities during combustion. We also present a brief comparison between combustors burning standard JP-8 jet fuel and an alternative fuel. For this comparison, we used flame chemiluminescence and high-speed imaging.

  17. A large-scale measurement of electromagnetic fields near GSM base stations in Guangxi, China for risk communication.

    Science.gov (United States)

    Wu, Tongning; Shao, Qing; Yang, Lei; Qi, Dianyuan; Lin, Jun; Lin, Xiaojun; Yu, Zongying

    2013-06-01

    Radiofrequency (RF) electromagnetic field (EMF) exposure from wireless telecommunication base station antennae can lead to debates, conflicts or litigations among the adjacent residents if inappropriately managed. This paper presents a measurement campaign for the GSM band EMF exposure in the vicinity of 827 base station sites (totally 6207 measurement points) in Guangxi, China. Measurement specifications are designed for risk communication with the residents who previously complained of over-exposure. The EMF power densities with the global positioning system coordinate at each measured point were recorded. Compliance with the International Commission on Non-Ionizing Radiation Protection guidelines and Chinese environmental EMF safety standards was studied. The results show that the GSM band EMF level near the base stations is very low. The measurement results and the EMF risk communication procedures positively influence public perception of the RF EMF exposure from the base stations and promote the exchange of EMF exposure-related knowledge.
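
As an illustration of the compliance comparison (not the authors' measurement procedure), the ICNIRP (1998) general-public reference level for power density between 400 and 2000 MHz is f/200 W/m² with f in MHz, so a measured value can be expressed as a fraction of the limit:

```python
def icnirp_public_limit(f_mhz):
    """ICNIRP (1998) general-public reference level for power density,
    valid for 400-2000 MHz: S = f/200 W/m^2 (f in MHz)."""
    if not 400 <= f_mhz <= 2000:
        raise ValueError("formula only valid for 400-2000 MHz")
    return f_mhz / 200.0

def exposure_quotient(s_measured, f_mhz):
    """Fraction of the limit used; values < 1 indicate compliance."""
    return s_measured / icnirp_public_limit(f_mhz)

# hypothetical GSM900 downlink reading of 0.01 W/m^2 vs. the 4.5 W/m^2 limit
print(icnirp_public_limit(900))                  # 4.5
print(round(exposure_quotient(0.01, 900), 4))    # 0.0022
```

The Chinese environmental standard referenced in the abstract is stricter than the ICNIRP level, so a compliant survey would typically check both.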

  18. A large-scale measurement of electromagnetic fields near GSM base stations in Guangxi, China for risk communication

    International Nuclear Information System (INIS)

    Wu, T.; Shao, Q.; Yang, L.; Qi, D.; Lin, J.; Lin, X.; Yu, Z.

    2013-01-01

    Radiofrequency (RF) electromagnetic field (EMF) exposure from wireless telecommunication base station antennae can lead to debates, conflicts or litigations among the adjacent residents if inappropriately managed. This paper presents a measurement campaign for the GSM band EMF exposure in the vicinity of 827 base station sites (totally 6207 measurement points) in Guangxi, China. Measurement specifications are designed for risk communication with the residents who previously complained of over-exposure. The EMF power densities with the global positioning system coordinate at each measured point were recorded. Compliance with the International Commission on Non-Ionizing Radiation Protection guidelines and Chinese environmental EMF safety standards was studied. The results show that the GSM band EMF level near the base stations is very low. The measurement results and the EMF risk communication procedures positively influence public perception of the RF EMF exposure from the base stations and promote the exchange of EMF exposure-related knowledge. (authors)

  19. Microrheometric upconversion-based techniques for intracellular viscosity measurements

    Science.gov (United States)

    Rodríguez-Sevilla, Paloma; Zhang, Yuhai; de Sousa, Nuno; Marqués, Manuel I.; Sanz-Rodríguez, Francisco; Jaque, Daniel; Liu, Xiaogang; Haro-González, Patricia

    2017-08-01

    Rheological parameters (viscosity, creep compliance and elasticity) play an important role in cell function and viability, and for this reason different strategies have been developed for their study. In this work, two new microrheometric techniques are presented. Both methods take advantage of the analysis of the polarized emission of an upconverting particle to determine its orientation inside the optical trap. Upconverting particles are optical materials that are able to convert infrared radiation into visible light. Their usefulness has been further boosted by the recent demonstration of their three-dimensional control and tracking by single-beam infrared optical traps. In this work it is demonstrated that optical torques are responsible for the stable orientation of the upconverting particle inside the trap. Moreover, numerical calculations and experimental data allowed us to use the rotation dynamics of the optically trapped upconverting particle for environmental sensing. In particular, the cytoplasm viscosity could be measured from the rotation time and thermal fluctuations of an intracellular optically trapped upconverting particle, by means of the two previously mentioned microrheometric techniques.

  20. Measuring participant rurality in Web-based interventions

    Directory of Open Access Journals (Sweden)

    McKay H Garth

    2007-08-01

    Full Text Available Abstract Background Web-based health behavior change programs can reach large groups of disparate participants and thus they provide promise of becoming important public health tools. Data on participant rurality can complement other demographic measures to deepen our understanding of the success of these programs. Specifically, analysis of participant rurality can inform recruitment and social marketing efforts, and facilitate the targeting and tailoring of program content. Rurality analysis can also help evaluate the effectiveness of interventions across population groupings. Methods We describe how the RUCAs (Rural-Urban Commuting Area Codes) methodology can be used to examine results from two Randomized Controlled Trials of Web-based tobacco cessation programs: the ChewFree.com project for smokeless tobacco cessation and the Smokers' Health Improvement Program (SHIP) project for smoking cessation. Results Using RUCAs methodology helped to highlight the extent to which both Web-based interventions reached a substantial percentage of rural participants. The ChewFree program was found to have more rural participation which is consistent with the greater prevalence of smokeless tobacco use in rural settings as well as ChewFree's multifaceted recruitment program that specifically targeted rural settings. Conclusion Researchers of Web-based health behavior change programs targeted to the US should routinely include RUCAs as a part of analyzing participant demographics. Researchers in other countries should examine rurality indices germane to their country.

  1. Air temperature measurements based on the speed of sound to compensate long distance interferometric measurements

    Directory of Open Access Journals (Sweden)

    Astrua Milena

    2014-01-01

    Full Text Available A method to measure the real-time temperature distribution along an interferometer path, based on the propagation of acoustic waves, is presented. It exploits the high sensitivity of the speed of sound in air to the air temperature. In particular, it takes advantage of a special set-up in which the generation of the acoustic waves is synchronous with the amplitude modulation of a laser source. A photodetector converts the laser light to an electronic signal taken as reference, while the incoming acoustic waves are focused on a microphone and generate a second signal. Under these conditions, the phase difference between the two signals depends essentially on the temperature of the air volume interposed between the sources and the receivers. The comparison with traditional temperature sensors highlighted the limits of the latter in the case of fast temperature variations, and the advantage of a measurement integrated along the optical path over a sampled point measurement. The capability of the acoustic method to compensate interferometric distance measurements for air temperature variations has been demonstrated for distances up to 27 m.
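
The underlying relation can be sketched with the common approximation c(T) ≈ 331.3·√(1 + T/273.15) m/s for dry air. This only illustrates the inversion from transit time to path-averaged temperature; the actual set-up infers the transit time from the phase difference between the reference and microphone signals.

```python
import math

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air [m/s] at temperature T [deg C]."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

def air_temperature(path_m, transit_s):
    """Invert the relation: mean air temperature along the path from the
    measured acoustic transit time."""
    c = path_m / transit_s
    return 273.15 * ((c / 331.3) ** 2 - 1.0)

t = 27.0 / speed_of_sound(20.0)        # simulated transit over a 27 m path
print(round(air_temperature(27.0, t), 6))   # recovers 20.0 deg C
```

Because the acoustic path coincides with the optical path, the recovered temperature is automatically the path average needed for refractive-index compensation.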

  2. Validation of Portable Muscle Tone Measurement Device Based on a Motor-Driven System

    National Research Council Canada - National Science Library

    Chen, Jia-Jin

    2001-01-01

    .... The aim of this study is to extend a sophisticated motor-driven measurement system, developed in our previous research, as a validation platform for developing a portable muscle tone measurement system...

  3. Effects of curriculum-based measurement on teachers' instructional planning.

    Science.gov (United States)

    Fuchs, L S; Fuchs, D; Stecker, P M

    1989-01-01

    This study assessed the effects of curriculum-based measurement (CBM) on teachers' instructional planning. Subjects were 30 teachers, assigned randomly to a computer-assisted CBM group, a noncomputer CBM group, and a contrast group. In the CBM groups, teachers specified 15-week reading goals, established CBM systems to measure student progress toward goals at least twice weekly, and systematically evaluated those databases to determine when instructional modifications were necessary. Contrast teachers monitored student progress toward Individualized Education Program (IEP) goals as they wished and were encouraged to develop instructional programs as necessary. At the end of a 12- to 15-week implementation period, teachers completed a questionnaire with reference to one randomly selected pupil. Analyses of variance indicated no difference between the CBM groups. However, compared to the contrast group, CBM teachers (a) used more specific, acceptable goals; (b) were less optimistic about goal attainment; (c) cited more objective and frequent data sources for determining the adequacy of student progress and for deciding whether program modifications were necessary; and (d) modified student programs more frequently. Questionnaire responses were correlated with verifiable data sources, and results generally supported the usefulness of the self-report information. Implications for special education research and practice are discussed.

  4. Fiber Bragg Grating Based System for Temperature Measurements

    Science.gov (United States)

    Tahir, Bashir Ahmed; Ali, Jalil; Abdul Rahman, Rosly

    In this study, a fiber Bragg grating (FBG) sensor for temperature measurement is proposed and experimentally demonstrated. In particular, we point out that the method is well-suited for monitoring temperature because it can withstand high-temperature environments where standard thermocouple methods fail. The interrogation technologies of the sensor system are simple, low cost and effective. In the sensor system, the fiber grating was dipped into a water beaker placed on a hotplate to control the water temperature, which was raised in equal increments. The sensing principle is based on tracking the Bragg wavelength shifts caused by temperature change, so the temperature is measured from the wavelength shifts of the FBG induced by the heated water. The fiber grating is a high-temperature-stable excimer-laser-induced grating with a linear wavelength-temperature response in the range of 0-285°C. A dynamic range of 0-285°C and a sensitivity of 0.0131 nm/°C, almost equal to that of a general FBG, have been obtained with this sensor system. Furthermore, the correlation of theoretical analysis and experimental results shows the capability and feasibility of the proposed technique.
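
With the reported linear response, converting a measured Bragg wavelength shift into temperature is a one-line calculation. This sketch assumes the reference wavelength is taken at 0 °C; the 0.0131 nm/°C sensitivity is the value quoted in the abstract.

```python
SENSITIVITY_NM_PER_C = 0.0131   # reported sensitivity of the FBG sensor

def temperature_from_shift(bragg_shift_nm, t_ref_c=0.0):
    """Convert a Bragg wavelength shift into temperature, assuming the
    linear response reported for the 0-285 deg C range."""
    return t_ref_c + bragg_shift_nm / SENSITIVITY_NM_PER_C

print(round(temperature_from_shift(1.31), 2))   # 1.31 nm shift -> 100.0 deg C
```

In practice the shift also responds to strain, so temperature-only operation assumes a strain-free mounting of the grating.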

  5. Quantum Jarzynski equality of measurement-based work extraction.

    Science.gov (United States)

    Morikuni, Yohei; Tajima, Hiroyasu; Hatano, Naomichi

    2017-03-01

    Many studies of quantum-size heat engines assume that the dynamics of an internal system is unitary and that the extracted work is equal to the energy loss of the internal system. Both assumptions, however, should be under scrutiny. In the present paper, we analyze quantum-scale heat engines, employing the measurement-based formulation of the work extraction recently introduced by Hayashi and Tajima [M. Hayashi and H. Tajima, arXiv:1504.06150]. We first demonstrate the inappropriateness of the unitary time evolution of the internal system (namely, the first assumption above) using a simple two-level system; we show that the variance of the energy transferred to an external system diverges when the dynamics of the internal system is approximated to a unitary time evolution. Second, we derive the quantum Jarzynski equality based on the formulation of Hayashi and Tajima as a relation for the work measured by an external macroscopic apparatus. The right-hand side of the equality reduces to unity for "natural" cyclic processes but fluctuates wildly for noncyclic ones, exceeding unity often. This fluctuation should be detectable in experiments and provide evidence for the present formulation.
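
For context, the conventional Jarzynski equality that the measurement-based formulation generalizes reads

```latex
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F},
```

so for a cyclic process (ΔF = 0) the right-hand side reduces to unity, consistent with the behavior the authors report for "natural" cyclic processes.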

  6. Security Measurement for Unknown Threats Based on Attack Preferences

    Directory of Open Access Journals (Sweden)

    Lihua Yin

    2018-01-01

    Full Text Available Security measurement matters to every stakeholder in network security: it provides security practitioners with exact security awareness. However, most existing works are not applicable to unknown threats. Moreover, existing efforts on security metrics mainly focus on the ease of a certain attack from a theoretical point of view, ignoring the "likelihood of exploitation." To help administrators gain a better understanding, we analyze the behavior of attackers who exploit zero-day vulnerabilities and predict their attack timing. Based on this prediction, we propose a method of security measurement. In detail, we compute the optimal attack timing from the perspective of the attacker, using a long-term game to estimate the risk of being found and then choosing the optimal timing based on the risk and profit. We design a learning strategy to model the information-sharing mechanism among multiple attackers and use a spatial structure to model the long-term process. After calculating the Nash equilibrium for each subgame, we take the likelihood of each node being attacked as the security metric. The experimental results show the efficiency of our approach.

  7. Environmental dose measurement with microprocessor based portable TLD reader

    International Nuclear Information System (INIS)

    Deme, S.; Apathy, I.; Feher, I.

    1996-01-01

    Application of the TL method to environmental gamma-radiation dosimetry involves uncertainty caused by the dose collected during transport from the point of annealing to the place of exposure and back to the place of evaluation. Should an accident occur, read-out is delayed by the need to transport the dosemeters to a laboratory equipped with a TLD reader. A portable reader capable of reading out the TL dosemeter at the place of exposure (an 'in situ' TLD reader) eliminates the above-mentioned disadvantages. We have developed a microprocessor-based portable TLD reader for monitoring environmental gamma-radiation doses and for on-board read-out of doses on space stations. The first version of our portable, battery-operated reader (named Pille, 'butterfly') was made at the beginning of the 1980s. These devices used CaSO4 bulb dosemeters, and the evaluation technique was based on analogue timing circuits and analogue-to-digital conversion of the photomultiplier current, with a read-out precision of 1 μGy and a measuring range up to 10 Gy. The measured values were displayed and manually recorded. The version with an external power supply was used for space dosimetry as an onboard TLD reader

  8. Determining root correspondence between previously and newly detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.

  9. Improving mass measurement accuracy in mass spectrometry based proteomics by combining open source tools for chromatographic alignment and internal calibration.

    Science.gov (United States)

    Palmblad, Magnus; van der Burgt, Yuri E M; Dalebout, Hans; Derks, Rico J E; Schoenmaker, Bart; Deelder, André M

    2009-05-02

    Accurate mass determination enhances peptide identification in mass spectrometry based proteomics. We here describe the combination of two previously published open source software tools to improve mass measurement accuracy in Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS). The first program, msalign, aligns one MS/MS dataset with one FTICRMS dataset. The second software, recal2, uses peptides identified from the MS/MS data for automated internal calibration of the FTICR spectra, resulting in sub-ppm mass measurement errors.
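
The internal-calibration step can be sketched as a simple linear refit of observed against theoretical m/z values for the peptides identified from the MS/MS data. recal2 itself fits an FTICR-specific calibration function, so this is only an illustration of how identified peptides pull the mass scale back to sub-ppm errors.

```python
import numpy as np

def internal_calibrate(observed_mz, calibrant_obs, calibrant_theo):
    """Linear internal recalibration sketch: fit observed vs. theoretical
    m/z of identified peptides, then correct all observed peaks."""
    slope, intercept = np.polyfit(calibrant_obs, calibrant_theo, 1)
    return slope * observed_mz + intercept

def ppm_error(obs, theo):
    return (obs - theo) / theo * 1e6

# hypothetical peptide masses with a simulated systematic error (~8 ppm)
theo = np.array([500.2627, 800.4123, 1200.6534, 1800.9122])
obs = theo * (1 + 8e-6) + 0.0004
corrected = internal_calibrate(obs, obs, theo)
print(np.abs(ppm_error(obs, theo)).max() > 1.0)        # before: > 1 ppm
print(np.abs(ppm_error(corrected, theo)).max() < 0.1)  # after: sub-ppm
```

The key idea carried over from the paper is that the calibrants are internal: they come from the same spectrum being corrected, aligned to the MS/MS identifications by msalign.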

  10. Heart rate measurement based on face video sequence

    Science.gov (United States)

    Xu, Fang; Zhou, Qin-Wu; Wu, Peng; Chen, Xing; Yang, Xiaofeng; Yan, Hong-jian

    2015-03-01

    This paper proposes a new non-contact heart rate measurement method based on photoplethysmography (PPG) theory. With this method we can measure heart rate remotely with a camera and ambient light. We collected video sequences of subjects, and detected remote PPG signals through video sequences. Remote PPG signals were analyzed with two methods, Blind Source Separation Technology (BSST) and Cross Spectral Power Technology (CSPT). BSST is a commonly used method, and CSPT is used for the first time in the study of remote PPG signals in this paper. Both of the methods can acquire heart rate, but compared with BSST, CSPT has clearer physical meaning, and the computational complexity of CSPT is lower than that of BSST. Our work shows that heart rates detected by CSPT method have good consistency with the heart rates measured by a finger clip oximeter. With good accuracy and low computational complexity, the CSPT method has a good prospect for the application in the field of home medical devices and mobile health devices.
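
A minimal frequency-domain estimate in the spirit of the spectral approach (not the paper's exact CSPT algorithm) picks the dominant peak of a facial intensity trace within the physiological band:

```python
import numpy as np

def heart_rate_bpm(signal, fs, band=(0.7, 4.0)):
    """Estimate heart rate as the dominant spectral peak of a facial
    intensity trace within the physiological band (42-240 bpm)."""
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return 60.0 * freqs[mask][np.argmax(spectrum[mask])]

# synthetic 30 s "green channel" trace at 30 fps: 1.2 Hz pulse plus noise
fs, secs = 30.0, 30
t = np.arange(int(fs * secs)) / fs
rng = np.random.default_rng(0)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)
print(heart_rate_bpm(trace, fs))   # ≈ 72 bpm
```

Real video traces need the preprocessing the paper describes (face tracking, channel averaging, detrending) before such a spectral peak becomes reliable.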

  11. Coordinate measuring system based on microchip lasers for reverse prototyping

    Science.gov (United States)

    Iakovlev, Alexey; Grishkanich, Alexsandr S.; Redka, Dmitriy; Tsvetkov, Konstantin

    2017-02-01

    According to the current great interest concerning Large-Scale Metrology applications in many different fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown substantial improvement. Ease-of-use, logistic and economic issues, as well as metrological performance, are assuming a more and more important role among system requirements. The project is planned to conduct experimental studies aimed at identifying the impact of applying the basic laws of chip and microlasers as radiators on the linear-angular characteristics of existing measurement systems. The system consists of a distributed network-based layout, whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Differently from existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load.

  12. Analogy between gambling and measurement-based work extraction

    Science.gov (United States)

    Vinkler, Dror A.; Permuter, Haim H.; Merhav, Neri

    2016-04-01

    In information theory, one area of interest is gambling, where mutual information characterizes the maximal gain in wealth growth rate due to knowledge of side information; the betting strategy that achieves this maximum is named the Kelly strategy. In the field of physics, it was recently shown that mutual information can characterize the maximal amount of work that can be extracted from a single heat bath using measurement-based control protocols, i.e. using ‘information engines’. However, to the best of our knowledge, no relation between gambling and information engines has been presented before. In this paper, we briefly review the two concepts and then demonstrate an analogy between gambling, where bits are converted into wealth, and information engines, where bits representing measurements are converted into energy. From this analogy follows an extension of gambling to the continuous-valued case, which is shown to be useful for investments in currency exchange rates or in the stock market using options. Moreover, the analogy enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximum work extraction when the probability distributions governing the system and measurements are unknown, work extraction when some energy is lost in each cycle, e.g. due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results in order to obtain new ones.
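
The analogy can be checked numerically for a toy horse race: with fair 2-for-1 odds and perfectly revealing side information, the gain in the Kelly doubling rate equals the mutual information (here 1 bit). The example and variable names below are illustrative, not taken from the paper.

```python
import numpy as np

def mutual_information(p_joint):
    """I(X;Y) in bits from a joint probability table p(x, y)."""
    px = p_joint.sum(axis=1, keepdims=True)
    py = p_joint.sum(axis=0, keepdims=True)
    nz = p_joint > 0
    return float((p_joint[nz] * np.log2(p_joint[nz] / (px * py)[nz])).sum())

def kelly_growth(p_joint, odds):
    """Optimal doubling rate for a horse race with side information Y:
    Kelly betting b(x|y) = p(x|y) gives E[log2(odds_x * p(x|y))]."""
    py = p_joint.sum(axis=0)
    rate = 0.0
    for y in range(p_joint.shape[1]):
        p_x_given_y = p_joint[:, y] / py[y]
        nz = p_x_given_y > 0
        rate += py[y] * (p_x_given_y[nz]
                         * np.log2(odds[nz] * p_x_given_y[nz])).sum()
    return float(rate)

# two equiprobable outcomes, fair 2-for-1 odds, perfectly revealing side info
joint = np.array([[0.5, 0.0], [0.0, 0.5]])
odds = np.array([2.0, 2.0])
no_info = kelly_growth(np.array([[0.25, 0.25], [0.25, 0.25]]), odds)
with_info = kelly_growth(joint, odds)
print(no_info, with_info, mutual_information(joint))   # 0.0 1.0 1.0
```

The identity gain = I(X;Y) is the classical result the paper maps onto information engines, where bits of measurement become extractable work.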

  13. A Method to Measure the Bracelet Based on Feature Energy

    Science.gov (United States)

    Liu, Hongmin; Li, Lu; Wang, Zhiheng; Huo, Zhanqiang

    2017-12-01

    To measure the bracelet automatically, a novel method based on feature energy is proposed. Firstly, the morphological method is utilized to preprocess the image, and the contour consisting of a concentric circle is extracted. Then a feature energy function, related to the distances from a pixel to the edge points, is defined, taking into account the geometric properties of the concentric circle. The input image is subsequently transformed into a feature energy distribution map (FEDM) by computing the feature energy of each pixel. The center of the concentric circle is thus located by detecting the maximum on the FEDM; meanwhile, the radii of the concentric circle are determined from the feature energy function of the center pixel. Finally, with the use of a calibration template, the internal diameter and thickness of the bracelet are measured. The experimental results show that the proposed method measures the true sizes of the bracelet accurately, and is simpler, more direct, and more robust than existing methods.
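    The map-then-peak idea can be sketched in a few lines. The variance-based energy below is a stand-in assumption (the paper's actual energy function is defined differently): distances from the true centre to circle edge points are all equal, so low distance variance means high energy, and the FEDM peaks at the centre:

```python
import numpy as np

def feature_energy_map(edge_points: np.ndarray, shape: tuple) -> np.ndarray:
    """Hypothetical feature-energy map: for each pixel, score how well the
    edge points lie on a circle centred there (low distance variance =>
    high energy). Illustrates the map-then-peak idea only."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]                      # candidate centres
    # distances from every pixel to every edge point: shape (h, w, n_edges)
    d = np.sqrt((ys[..., None] - edge_points[:, 0]) ** 2 +
                (xs[..., None] - edge_points[:, 1]) ** 2)
    return 1.0 / (1.0 + d.var(axis=-1))              # peaks at the true centre

# Synthetic circle of radius 10 centred at (32, 32) on a 64x64 grid.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
edges = np.stack([32 + 10 * np.sin(t), 32 + 10 * np.cos(t)], axis=1)
fedm = feature_energy_map(edges, (64, 64))
cy, cx = np.unravel_index(fedm.argmax(), fedm.shape)
print(cy, cx)  # 32 32
```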

  14. EIGENVECTOR-BASED CENTRALITY MEASURES FOR TEMPORAL NETWORKS*

    Science.gov (United States)

    TAYLOR, DANE; MYERS, SEAN A.; CLAUSET, AARON; PORTER, MASON A.; MUCHA, PETER J.

    2017-01-01

    Numerous centrality measures have been developed to quantify the importances of nodes in time-independent networks, and many of them can be expressed as the leading eigenvector of some matrix. With the increasing availability of network data that changes in time, it is important to extend such eigenvector-based centrality measures to time-dependent networks. In this paper, we introduce a principled generalization of network centrality measures that is valid for any eigenvector-based centrality. We consider a temporal network with N nodes as a sequence of T layers that describe the network during different time windows, and we couple centrality matrices for the layers into a supra-centrality matrix of size NT × NT whose dominant eigenvector gives the centrality of each node i at each time t. We refer to this eigenvector and its components as a joint centrality, as it reflects the importances of both the node i and the time layer t. We also introduce the concepts of marginal and conditional centralities, which facilitate the study of centrality trajectories over time. We find that the strength of coupling between layers is important for determining multiscale properties of centrality, such as localization phenomena and the time scale of centrality changes. In the strong-coupling regime, we derive expressions for time-averaged centralities, which are given by the zeroth-order terms of a singular perturbation expansion. We also study first-order terms to obtain first-order-mover scores, which concisely describe the magnitude of nodes’ centrality changes over time. As examples, we apply our method to three empirical temporal networks: the United States Ph.D. exchange in mathematics, costarring relationships among top-billed actors during the Golden Age of Hollywood, and citations of decisions from the United States Supreme Court. PMID:29046619
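    The supra-centrality construction can be sketched directly. In this simplified instance of the framework, the per-layer centrality matrices are plain adjacency matrices and consecutive time layers are coupled with strength omega; the dominant eigenvector of the NT × NT matrix gives the joint centrality, and summing over layers gives a marginal node centrality:

```python
import numpy as np

def supra_centrality(layers, omega=1.0):
    """Couple per-layer centrality matrices (here: adjacency matrices) into
    an NT x NT supra-centrality matrix with coupling omega between
    nearest-neighbour time layers; return the dominant eigenvector
    reshaped to (T, N) as the joint centrality of (node, time) pairs."""
    T, N = len(layers), layers[0].shape[0]
    S = np.zeros((N * T, N * T))
    for t, C in enumerate(layers):
        S[t*N:(t+1)*N, t*N:(t+1)*N] = C              # layer centrality blocks
    eye = omega * np.eye(N)
    for t in range(T - 1):                           # chain coupling of layers
        S[t*N:(t+1)*N, (t+1)*N:(t+2)*N] = eye
        S[(t+1)*N:(t+2)*N, t*N:(t+1)*N] = eye
    vals, vecs = np.linalg.eigh(S)                   # S symmetric for undirected layers
    v = np.abs(vecs[:, -1])                          # dominant (Perron) eigenvector
    return v.reshape(T, N)

# Two-layer toy network on 3 nodes: an edge 0-1, then an edge 1-2.
A1 = np.array([[0., 1., 0.], [1., 0., 0.], [0., 0., 0.]])
A2 = np.array([[0., 0., 0.], [0., 0., 1.], [0., 1., 0.]])
joint = supra_centrality([A1, A2], omega=0.5)
marginal_node = joint.sum(axis=0)                    # marginal node centrality
print(marginal_node.argmax())  # 1 (node 1 is central in both layers)
```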

  15. Measuring energy efficiency: Is energy intensity a good evidence base?

    International Nuclear Information System (INIS)

    Proskuryakova, L.; Kovalev, A.

    2015-01-01

    Highlights: • The energy intensity measure reflects consumption, not energy efficiency. • Thermodynamic indicators should describe energy efficiency at all levels. • These indicators should have no reference to economic or financial parameters. • A set of energy efficiency indicators should satisfy several basic principles. • There are trade-offs between energy efficiency, power and costs. - Abstract: There is a widespread assumption in energy statistics and econometrics that energy intensity and energy efficiency are equivalent measures of the energy performance of economies. The paper points to the discrepancy between the engineering concept of energy efficiency and energy intensity as it is understood in macroeconomic statistics. This discrepancy concerns both definitions (while the engineering concept of energy efficiency is based on the thermodynamic definition, energy intensity includes economic measures) and use. With regard to the latter, the authors conclude that energy intensity can only provide indirect and delayed evidence of the technological and engineering energy efficiency of energy conversion processes, which entails shortcomings for management and policymaking. Therefore, we suggest that subsectoral, sectoral and other levels of energy intensity no longer be treated as aggregates of lower-level energy efficiency. It is suggested that the insufficiency of energy intensity indicators can be compensated for by the introduction of thermodynamic indicators describing energy efficiency at the physical, technological, enterprise, sub-sector, sectoral and national levels without reference to any economic or financial parameters. Structured statistical data on thermodynamic efficiency is offered as a better option for identifying break-through technologies and technological bottlenecks that constrain efficiency advancements. It is also suggested that macro-level thermodynamic indicators should be based on the thermodynamic first law efficiency and the energy

  16. Accurate fluid force measurement based on control surface integration

    Science.gov (United States)

    Lentink, David

    2018-01-01

    Nonintrusive 3D fluid force measurements are still challenging to conduct accurately for freely moving animals, vehicles, and deforming objects. Two techniques address this: 3D particle image velocimetry (PIV) and a new technique, the aerodynamic force platform (AFP). Both rely on the control volume integral for momentum; whereas PIV requires numerical integration of flow fields, the AFP performs the integration mechanically based on rigid walls that form the control surface. The accuracy of both PIV and AFP measurements based on control surface integration is thought to hinge on determining the unsteady body force associated with the acceleration of the volume of displaced fluid. Here, I introduce a set of non-dimensional error ratios to show which fluid and body parameters make the error negligible. The unsteady body force is insignificant in all conditions where the average density of the body is much greater than the density of the fluid, e.g., in gas. Whenever a strongly deforming body experiences significant buoyancy and acceleration, the error is significant. Remarkably, this error can be entirely corrected for with an exact factor, provided that the body has a sufficiently homogeneous density or acceleration distribution, which is common in liquids. The correction factor for omitting the unsteady body force depends only on the fluid density, ρ_f, and the body density, ρ_b. Whereas these straightforward solutions work even at the liquid-gas interface in a significant number of cases, they do not work for generalized bodies undergoing buoyancy in combination with appreciable body density inhomogeneity, volume change (PIV), or volume rate-of-change (PIV and AFP). In these less common cases, the 3D body shape needs to be measured and resolved in time and space to estimate the unsteady body force. The analysis shows that accounting for the unsteady body force is straightforward to non

  17. Flexure mechanism-based parallelism measurements for chip-on-glass bonding

    International Nuclear Information System (INIS)

    Jung, Seung Won; Yun, Won Soo; Jin, Songwan; Jeong, Young Hun; Kim, Bo Sun

    2011-01-01

    Recently, liquid crystal displays (LCDs) have played vital roles in a variety of electronic devices such as televisions, cellular phones, and desktop/laptop monitors because of their enhanced volume, performance, and functionality. However, there is still a need for thinner LCD panels due to the trend of miniaturization in electronic applications. Thus, chip-on-glass (COG) bonding has become one of the most important aspects in the LCD panel manufacturing process. In this study, a novel sensor was developed to measure the parallelism between the tooltip planes of the bonding head and the backup of the COG main bonder, which has previously been estimated by prescale pressure films in industry. The sensor developed in this study is based on a flexure mechanism, and it can measure the total pressing force and the inclination angles in two directions that satisfy the quantitative definition of parallelism. To improve the measurement accuracy, the sensor was calibrated based on the estimation of the total pressing force and the inclination angles using the least-squares method. To verify the accuracy of the sensor, the estimation results for parallelism were compared with those from prescale pressure film measurements. In addition, the influence of parallelism on the bonding quality was experimentally demonstrated. The sensor was successfully applied to the measurement of parallelism in the COG-bonding process with an accuracy of more than three times that of the conventional method using prescale pressure films
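    The least-squares calibration step can be illustrated with a linear sensor model. The four-gauge layout and noise-free calibration loads below are assumptions for the sketch, not the paper's actual sensor design: known loads (force plus two inclination angles) are applied, the readings are recorded, and the mapping is fitted by least squares:

```python
import numpy as np

# Hypothetical calibration: map four gauge readings of the flexure sensor to
# total pressing force F and two inclination angles (tx, ty) with a linear
# model Y = R @ W, fitted by least squares from known calibration loads.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(4, 3))            # unknown sensor matrix (simulated)
R = rng.normal(size=(20, 4))                # 20 calibration readings
Y = R @ W_true                              # known (F, tx, ty) for each load

W_fit, *_ = np.linalg.lstsq(R, Y, rcond=None)   # least-squares calibration
estimate = rng.normal(size=(1, 4)) @ W_fit      # new reading -> (F, tx, ty)
print(np.allclose(W_fit, W_true))  # True (readings here are noise-free)
```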

  18. Coherence and entanglement measures based on Rényi relative entropies

    International Nuclear Information System (INIS)

    Zhu, Huangjun; Hayashi, Masahito; Chen, Lin

    2017-01-01

    We study systematically resource measures of coherence and entanglement based on Rényi relative entropies, which include the logarithmic robustness of coherence, geometric coherence, and conventional relative entropy of coherence together with their entanglement analogues. First, we show that each Rényi relative entropy of coherence is equal to the corresponding Rényi relative entropy of entanglement for any maximally correlated state. By virtue of this observation, we establish a simple operational connection between entanglement measures and coherence measures based on Rényi relative entropies. We then prove that all these coherence measures, including the logarithmic robustness of coherence, are additive. Accordingly, all these entanglement measures are additive for maximally correlated states. In addition, we derive analytical formulas for Rényi relative entropies of entanglement of maximally correlated states and bipartite pure states, which reproduce a number of classic results on the relative entropy of entanglement and logarithmic robustness of entanglement in a unified framework. Several nontrivial bounds for Rényi relative entropies of coherence (entanglement) are further derived, which improve over results known previously. Moreover, we determine all states whose relative entropy of coherence is equal to the logarithmic robustness of coherence. As an application, we provide an upper bound for the exact coherence distillation rate, which is saturated for pure states. (paper)
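    For reference, the (Petz) family of Rényi relative entropies underlying such measures is standard, and writing it out clarifies how each coherence measure arises; the paper also treats other (e.g., sandwiched) versions, not shown here:

```latex
D_\alpha(\rho\,\|\,\sigma) \;=\; \frac{1}{\alpha-1}\,
  \log \operatorname{Tr}\!\left(\rho^{\alpha}\sigma^{1-\alpha}\right),
  \qquad \alpha\in(0,1)\cup(1,\infty),
\qquad
C_\alpha(\rho) \;=\; \min_{\delta\in\mathcal{I}} D_\alpha(\rho\,\|\,\delta),
```

    where \(\mathcal{I}\) is the set of incoherent (diagonal) states in the reference basis; the limit \(\alpha \to 1\) recovers the conventional relative entropy of coherence, and replacing \(\mathcal{I}\) by the separable states gives the entanglement analogue.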

  19. A Time-Measurement System Based on Isotopic Ratios

    International Nuclear Information System (INIS)

    Vo, Duc T.; Karpius, P.J.; MacArthur, D.W.; Thron, J.L.

    2007-01-01

    A time-measurement system can be built based on the ratio of gamma-ray peak intensities from two radioactive isotopes. The ideal system would use a parent isotope with a short half-life decaying to a long half-life daughter. The activities of the parent-daughter isotopes would be measured using a gamma-ray detector system. The time can then be determined from the ratio of the activities. The best-known candidate for such a system is the 241Pu-241Am parent-daughter pair. However, this 241Pu-241Am system would require a high-purity germanium detector system and sophisticated software to separate and distinguish between the many gamma-ray peaks produced by the decays of the two isotopes. An alternate system would use two different isotopes, again one with a short half-life and one with a half-life that is long relative to the other. The pair of isotopes 210Pb and 241Am (with half-lives of 22 and 432 years, respectively) appears suitable for such a system. This time-measurement system operates by measuring the change in the ratio of the 47-keV peak of 210Pb to the 60-keV peak of 241Am. For the system to work reasonably well, the resolution of the detector would need to be such that the two gamma-ray peaks are well separated so that their peak areas can be accurately determined using a simple region-of-interest (ROI) method. A variety of detectors were tested to find a suitable system for this application. The results of these tests are presented here.
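    The clock itself is a one-line consequence of exponential decay: the peak ratio evolves as r(t) = r0·exp(−(λ_Pb − λ_Am)t), so with the half-lives quoted above (22 and 432 years) the ratio halves roughly every 23 years:

```python
import math

HALF_LIFE_PB210 = 22.0     # years, 210Pb (value quoted in the abstract)
HALF_LIFE_AM241 = 432.0    # years, 241Am

def elapsed_time(r0: float, r: float) -> float:
    """Elapsed time from the decay of the 47-keV (210Pb) / 60-keV (241Am)
    peak ratio: r(t) = r0 * exp(-(lam_pb - lam_am) * t)."""
    lam_pb = math.log(2) / HALF_LIFE_PB210
    lam_am = math.log(2) / HALF_LIFE_AM241
    return math.log(r0 / r) / (lam_pb - lam_am)

# The ratio halves roughly every 23 years, setting the clock's time scale.
t = elapsed_time(1.0, 0.5)
print(round(t, 1))  # 23.2
```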

  20. Ozone Measurements Monitoring Using Data-Based Approach

    KAUST Repository

    Harrou, Fouzi; Kadri, Farid; Khadraoui, Sofiane; Sun, Ying

    2016-01-01

    The complexity of ozone (O3) formation mechanisms in the troposphere makes fast and accurate modeling of ozone very challenging. In the absence of a process model, principal component analysis (PCA) has been used extensively as a data-based monitoring technique for highly correlated process variables; however, conventional PCA-based detection indices often fail to detect small or moderate anomalies. In this work, we propose an innovative method for detecting small anomalies in highly correlated multivariate data. The developed method combines the multivariate exponentially weighted moving average (MEWMA) monitoring scheme with PCA modelling in order to enhance anomaly-detection performance. Such a choice is mainly motivated by the greater ability of the MEWMA monitoring scheme to detect small changes in the process mean. The proposed PCA-based MEWMA monitoring scheme is successfully applied to ozone measurement data collected from the Upper Normandy region, France, via the network of air quality monitoring stations. The detection results of the proposed method are compared with those declared by the Air Normand air monitoring association.

  1. Ozone Measurements Monitoring Using Data-Based Approach

    KAUST Repository

    Harrou, Fouzi

    2016-02-01

    The complexity of ozone (O3) formation mechanisms in the troposphere makes fast and accurate modeling of ozone very challenging. In the absence of a process model, principal component analysis (PCA) has been used extensively as a data-based monitoring technique for highly correlated process variables; however, conventional PCA-based detection indices often fail to detect small or moderate anomalies. In this work, we propose an innovative method for detecting small anomalies in highly correlated multivariate data. The developed method combines the multivariate exponentially weighted moving average (MEWMA) monitoring scheme with PCA modelling in order to enhance anomaly-detection performance. Such a choice is mainly motivated by the greater ability of the MEWMA monitoring scheme to detect small changes in the process mean. The proposed PCA-based MEWMA monitoring scheme is successfully applied to ozone measurement data collected from the Upper Normandy region, France, via the network of air quality monitoring stations. The detection results of the proposed method are compared with those declared by the Air Normand air monitoring association.
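    A compact sketch of the combined scheme, on a toy process where two variables are linear combinations of the others (the control limits and the exact MEWMA covariance weighting used in the paper are omitted): fit PCA on normal data, then run a MEWMA recursion on the PCA residuals and monitor a Hotelling-type statistic:

```python
import numpy as np

def pca_mewma(train, test, n_pc=2, lam=0.2):
    """PCA model of normal operation + MEWMA on the residuals.
    Control limits (detection thresholds) are omitted in this sketch."""
    mu, sd = train.mean(axis=0), train.std(axis=0)
    _, _, Vt = np.linalg.svd((train - mu) / sd, full_matrices=False)
    P = Vt[:n_pc].T                                  # retained principal directions

    def resid(X):
        Xs = (np.atleast_2d(X) - mu) / sd
        return Xs - Xs @ P @ P.T                     # part unexplained by the PCA model

    S_inv = np.linalg.pinv(np.cov(resid(train), rowvar=False))
    z, stats = np.zeros(train.shape[1]), []
    for x in test:                                   # MEWMA recursion on residuals
        z = lam * resid(x)[0] + (1 - lam) * z
        stats.append(float(z @ S_inv @ z))           # Hotelling-type T^2 on z
    return np.array(stats)

# Toy process: x3 ~ x1 + x2 and x4 ~ x1 - x2 (highly correlated variables).
rng = np.random.default_rng(1)
u = rng.normal(size=(600, 2))
X = np.column_stack([u[:, 0], u[:, 1],
                     u[:, 0] + u[:, 1] + 0.05 * rng.normal(size=600),
                     u[:, 0] - u[:, 1] + 0.05 * rng.normal(size=600)])
train, test = X[:500], X[500:].copy()
test[50:, 2] += 0.3          # small shift that breaks the x3 = x1 + x2 relation
t2 = pca_mewma(train, test)
print(t2[70:].mean() > t2[:40].mean())  # True: the small anomaly is flagged
```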

  2. Output power distributions of mobile radio base stations based on network measurements

    International Nuclear Information System (INIS)

    Colombi, D; Thors, B; Persson, T; Törnevik, C; Wirén, N; Larsson, L-E

    2013-01-01

    In this work output power distributions of mobile radio base stations have been analyzed for 2G and 3G telecommunication systems. The approach is based on measurements in selected networks using performance surveillance tools that are part of the network Operational Support System (OSS). For the 3G network considered, direct measurements of output power levels were possible, while for the 2G networks, output power levels were estimated from measurements of traffic volumes. Both voice and data services were included in the investigation. Measurements were conducted for large geographical areas, to ensure good overall statistics, as well as for smaller areas to investigate the impact of different environments. For high traffic hours, the 90th percentile of the averaged output power was found to be below 65% and 45% of the available output power for the 2G and 3G systems, respectively.

  3. Output power distributions of mobile radio base stations based on network measurements

    Science.gov (United States)

    Colombi, D.; Thors, B.; Persson, T.; Wirén, N.; Larsson, L.-E.; Törnevik, C.

    2013-04-01

    In this work output power distributions of mobile radio base stations have been analyzed for 2G and 3G telecommunication systems. The approach is based on measurements in selected networks using performance surveillance tools that are part of the network Operational Support System (OSS). For the 3G network considered, direct measurements of output power levels were possible, while for the 2G networks, output power levels were estimated from measurements of traffic volumes. Both voice and data services were included in the investigation. Measurements were conducted for large geographical areas, to ensure good overall statistics, as well as for smaller areas to investigate the impact of different environments. For high traffic hours, the 90th percentile of the averaged output power was found to be below 65% and 45% of the available output power for the 2G and 3G systems, respectively.

  4. Validation of OMI UV measurements against ground-based measurements at a station in Kampala, Uganda

    Science.gov (United States)

    Muyimbwa, Dennis; Dahlback, Arne; Stamnes, Jakob; Hamre, Børge; Frette, Øyvind; Ssenyonga, Taddeo; Chen, Yi-Chun

    2015-04-01

    We present solar ultraviolet (UV) irradiance data measured with a NILU-UV instrument at a ground site in Kampala (0.31°N, 32.58°E), Uganda for the period 2005-2014. The data were analyzed and compared with UV irradiances inferred from the Ozone Monitoring Instrument (OMI) for the same period. Kampala is located on the shores of Lake Victoria, Africa's largest freshwater lake, which may influence the climate and weather conditions of the region. There is also heavy use of aging vehicles, which may contribute to a high anthropogenic loading of absorbing aerosols. The OMI surface UV algorithm does not account for absorbing aerosols, which may lead to systematic overestimation of surface UV irradiances inferred from OMI satellite data. We retrieved UV index values from OMI UV irradiances and validated them against the ground-based UV index values obtained from NILU-UV measurements. The UV index values were found to follow a seasonal pattern similar to that of the clouds and the rainfall. OMI-inferred UV index values were overestimated with a mean bias of about 28% under all-sky conditions, but the mean bias was reduced to about 8% under clear-sky conditions, when only days with a radiation modification factor (RMF) greater than 65% were considered. However, when days with RMF greater than 70%, 75%, and 80% were considered, OMI-inferred UV index values were found to agree with the ground-based UV index values to within 5%, 3%, and 1%, respectively. In the validation we identified clouds and aerosols, present in 88% of the measurements, as the main cause of the OMI overestimation of the UV index.
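    The headline numbers are mean relative biases of the satellite retrieval against the ground reference; as a minimal sketch (the UV index values below are illustrative, not taken from the study):

```python
import numpy as np

def mean_bias_percent(satellite, ground):
    """Mean relative bias of satellite-inferred values against the
    ground-based reference, in percent (positive = overestimation)."""
    satellite = np.asarray(satellite, dtype=float)
    ground = np.asarray(ground, dtype=float)
    return 100.0 * np.mean((satellite - ground) / ground)

# e.g. OMI readings averaging 28% above the NILU-UV reference:
bias = mean_bias_percent([12.8, 6.4], [10.0, 5.0])
print(round(float(bias), 1))  # 28.0
```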

  5. Developing barbed microtip-based electrode arrays for biopotential measurement.

    Science.gov (United States)

    Hsu, Li-Sheng; Tung, Shu-Wei; Kuo, Che-Hsi; Yang, Yao-Joe

    2014-07-10

    This study involved fabricating barbed microtip-based electrode arrays by using silicon wet etching. KOH anisotropic wet etching was employed to form a standard pyramidal microtip array and HF/HNO3 isotropic etching was used to fabricate barbs on these microtips. To improve the electrical conductance between the tip array on the front side of the wafer and the electrical contact on the back side, a through-silicon via was created during the wet etching process. The experimental results show that the forces required to detach the barbed microtip arrays from human skin, a polydimethylsiloxane (PDMS) polymer, and a polyvinylchloride (PVC) film were larger compared with those required to detach microtip arrays that lacked barbs. The impedances of the skin-electrode interface were measured and the performance levels of the proposed dry electrode were characterized. Electrode prototypes that employed the proposed tip arrays were implemented. Electroencephalogram (EEG) and electrocardiography (ECG) recordings using these electrode prototypes were also demonstrated.

  6. EPR-based distance measurements at ambient temperature.

    Science.gov (United States)

    Krumkacheva, Olesya; Bagryanskaya, Elena

    2017-07-01

    Pulsed dipolar (PD) EPR spectroscopy is a powerful technique allowing for distance measurements between spin labels in the range of 2.5-10.0 nm. It was proposed more than 30 years ago, and nowadays is widely used in biophysics and materials science. Until recently, PD EPR experiments were limited to cryogenic temperatures; ambient-temperature distance measurements have now been demonstrated using PD EPR as well as other approaches based on EPR (e.g., relaxation enhancement; RE). In this paper, we review the features of PD EPR and RE at ambient temperatures, in particular the requirements on electron spin phase memory time, ways of immobilizing biomolecules, the influence of the linker between the spin probe and the biomolecule, and future opportunities. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Smart phone-based Chemistry Instrumentation: Digitization of Colorimetric Measurements

    International Nuclear Information System (INIS)

    Chang, Byoung Yong

    2012-01-01

    This report presents a mobile instrumentation platform based on a smart phone, using its built-in functions for colorimetric diagnosis. The color change resulting from detection is captured as a picture by the smart phone's built-in CCD camera and is evaluated in the form of its hue value, giving a well-defined relationship between color and concentration. As a proof of concept, proton concentration measurements were conducted on pH paper coupled with a smart phone. This report is believed to show the possibility of adapting a smart phone as a mobile analytical transducer, and more applications for bioanalysis are expected to be developed using other built-in functions of the smart phone
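    The digitization step the report describes (picture to hue value) can be sketched with the Python standard library; the RGB pixel values below are illustrative, and the hue-to-pH calibration curve is not reproduced:

```python
import colorsys

def hue_degrees(r: int, g: int, b: int) -> float:
    """Hue of an RGB pixel (0-255 channels) in degrees: the single value that
    is then mapped to analyte concentration via a calibration curve."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return 360.0 * h

# pH-paper patches photographed with the phone camera (hypothetical pixels):
print(hue_degrees(255, 0, 0))  # 0.0 (red, acidic end of the scale)
print(hue_degrees(0, 0, 255))  # ~240 (blue, basic end of the scale)
```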

  8. Radiation-damage measurements on PVT-based plastic scintillators

    International Nuclear Information System (INIS)

    Ilie, S.; Schoenbacher, H.; Tavlet, M.

    1993-01-01

    Samples of PVT-based plastic scintillators produced by Nuclear Enterprise Technology Ltd. (NET) were irradiated up to 9 kGy, both with a gamma source and within a typical accelerator radiation field (CERN PS ACOL Irradiation Facility). The consequent reductions in scintillating efficiency and light transmission were measured, as well as the subsequent recovery over a period of several months. The main results show that irradiation affects light transmission more than light emission. The radiation type affects neither the amount of transmission reduction nor the recovery. Observations were also made by means of polarized light. Non-uniformities and internal stresses were observed in scintillator bulk material that was polymerized too quickly. These defects influence the light transmission. (orig.)

  9. Measurement-Based Entanglement of Noninteracting Bosonic Atoms.

    Science.gov (United States)

    Lester, Brian J; Lin, Yiheng; Brown, Mark O; Kaufman, Adam M; Ball, Randall J; Knill, Emanuel; Rey, Ana M; Regal, Cindy A

    2018-05-11

    We demonstrate the ability to extract a spin-entangled state of two neutral atoms via postselection based on a measurement of their spatial configuration. Typically, entangled states of neutral atoms are engineered via atom-atom interactions. In contrast, in our Letter, we use Hong-Ou-Mandel interference to postselect a spin-singlet state after overlapping two atoms in distinct spin states on an effective beam splitter. We verify the presence of entanglement and determine a bound on the postselected fidelity of a spin-singlet state of (0.62±0.03). The experiment has direct analogy to creating polarization entanglement with single photons and hence demonstrates the potential to use protocols developed for photons to create complex quantum states with noninteracting atoms.

  10. Monte Carlo evaluation of derivative-based global sensitivity measures

    Energy Technology Data Exchange (ETDEWEB)

    Kucherenko, S. [Centre for Process Systems Engineering, Imperial College London, London SW7 2AZ (United Kingdom)], E-mail: s.kucherenko@ic.ac.uk; Rodriguez-Fernandez, M. [Process Engineering Group, Instituto de Investigaciones Marinas, Spanish Council for Scientific Research (C.S.I.C.), C/ Eduardo Cabello, 6, 36208 Vigo (Spain); Pantelides, C.; Shah, N. [Centre for Process Systems Engineering, Imperial College London, London SW7 2AZ (United Kingdom)

    2009-07-15

    A novel approach for evaluation of derivative-based global sensitivity measures (DGSM) is presented. It is compared with the Morris and the Sobol' sensitivity indices methods. It is shown that there is a link between DGSM and Sobol' sensitivity indices. DGSM are very easy to implement and evaluate numerically. The computational time required for numerical evaluation of DGSM is many orders of magnitude lower than that for estimation of the Sobol' sensitivity indices. It is also lower than that for the Morris method. Efficiencies of Monte Carlo (MC) and quasi-Monte Carlo (QMC) sampling methods for calculation of DGSM are compared. It is shown that the superiority of QMC over MC depends on the problem's effective dimension, which can also be estimated using DGSM.

  11. Monte Carlo evaluation of derivative-based global sensitivity measures

    International Nuclear Information System (INIS)

    Kucherenko, S.; Rodriguez-Fernandez, M.; Pantelides, C.; Shah, N.

    2009-01-01

    A novel approach for evaluation of derivative-based global sensitivity measures (DGSM) is presented. It is compared with the Morris and the Sobol' sensitivity indices methods. It is shown that there is a link between DGSM and Sobol' sensitivity indices. DGSM are very easy to implement and evaluate numerically. The computational time required for numerical evaluation of DGSM is many orders of magnitude lower than that for estimation of the Sobol' sensitivity indices. It is also lower than that for the Morris method. Efficiencies of Monte Carlo (MC) and quasi-Monte Carlo (QMC) sampling methods for calculation of DGSM are compared. It is shown that the superiority of QMC over MC depends on the problem's effective dimension, which can also be estimated using DGSM.
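    The DGSM themselves are easy to sketch: ν_i = E[(∂f/∂x_i)²] over the unit hypercube, estimated here by Monte Carlo sampling with central finite differences (the paper's QMC variant simply swaps the sampler; the test function below is illustrative):

```python
import numpy as np

def dgsm(f, d, n=4096, h=1e-6, rng=None):
    """Monte Carlo estimate of derivative-based global sensitivity measures
    nu_i = E[(df/dx_i)^2] over [0,1]^d, via central finite differences at
    uniformly sampled points."""
    rng = rng or np.random.default_rng(0)
    x = rng.uniform(h, 1 - h, size=(n, d))
    nu = np.empty(d)
    for i in range(d):
        xp, xm = x.copy(), x.copy()
        xp[:, i] += h
        xm[:, i] -= h
        g = (f(xp) - f(xm)) / (2 * h)     # df/dx_i at each sample point
        nu[i] = np.mean(g ** 2)
    return nu

# Linear test function: the derivatives are the coefficients, so nu = a**2.
a = np.array([1.0, 2.0, 0.5])
nu = dgsm(lambda x: x @ a, 3)
print(nu)  # ~ [1.0, 4.0, 0.25]
```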

  12. Measuring intracellular redox conditions using GFP-based sensors

    DEFF Research Database (Denmark)

    Björnberg, Olof; Ostergaard, Henrik; Winther, Jakob R

    2006-01-01

    Recent years have seen the development of methods for analyzing the redox conditions in specific compartments in living cells. These methods are based on genetically encoded sensors comprising variants of Green Fluorescent Protein in which vicinal cysteine residues have been introduced at solvent-exposed positions. Several mutant forms have been identified in which formation of a disulfide bond between these cysteine residues results in changes of their fluorescence properties. The redox sensors have been characterized biochemically and found to behave differently, both spectroscopically and in terms of redox properties. As genetically encoded sensors they can be expressed in living cells and used for analysis of intracellular redox conditions; however, which parameters are measured depends on how the sensors interact with various cellular redox components. Results of both biochemical and cell

  13. Accurate position estimation methods based on electrical impedance tomography measurements

    Science.gov (United States)

    Vergara, Samuel; Sbarbaro, Daniel; Johansen, T. A.

    2017-08-01

    than 0.05% of the tomograph radius value. These results demonstrate that the proposed approaches can estimate an object’s position accurately based on EIT measurements if enough process information is available for training or modelling. Since they do not require complex calculations it is possible to use them in real-time applications without requiring high-performance computers.

  14. Surface characterization of hemodialysis membranes based on streaming potential measurements.

    Science.gov (United States)

    Werner, C; Jacobasch, H J; Reichelt, G

    1995-01-01

    Hemodialysis membranes made from cellulose (CUPROPHAN, HEMOPHAN) and sulfonated polyethersulfone (SPES) were characterized using the streaming potential technique to determine the zeta potential at their interfaces against well-defined aqueous solutions of varied pH and potassium chloride concentrations. Streaming potential measurements enable distinction between different membrane materials. In addition to parameters of the electrochemical double layer at membrane interfaces, thermodynamic characteristics of adsorption of different dissolved species were evaluated. To that end, a description of double-layer formation suggested by Börner and Jacobasch (in: Electrokinetic Phenomena, p. 231. Institut für Technologie der Polymere, Dresden (1989)) was applied, which is based on the generally accepted model of the electrochemical double layer according to Stern (Z. Elektrochemie 30, 508 (1924)) and Grahame (Chem. Rev. 41, 441 (1947)). The membranes investigated show different surface acidic/basic and polar/nonpolar behavior. Furthermore, alterations of membrane interfaces through adsorption of components of biologically relevant solutions were shown to be detectable by streaming potential measurements.
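    Zeta potentials are conventionally obtained from the measured streaming-potential slope via the Helmholtz-Smoluchowski relation; the solution properties below (dilute KCl at room temperature) and the slope value are illustrative assumptions, not data from the study:

```python
EPS0 = 8.854e-12        # vacuum permittivity, F/m

def zeta_helmholtz_smoluchowski(slope, viscosity, conductivity, eps_r):
    """Zeta potential (V) from the streaming-potential slope dU/dp (V/Pa)
    via Helmholtz-Smoluchowski: zeta = (dU/dp) * eta * K / (eps_r * eps0)."""
    return slope * viscosity * conductivity / (eps_r * EPS0)

# Illustrative numbers: ~1 mM KCl solution against a polymer membrane.
zeta = zeta_helmholtz_smoluchowski(slope=-1.6e-6,       # V/Pa (hypothetical)
                                   viscosity=0.89e-3,   # Pa*s, water at 25 C
                                   conductivity=0.0146, # S/m
                                   eps_r=78.5)
print(round(1000 * zeta, 1), "mV")  # -29.9 mV
```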

  15. Uav Positioning and Collision Avoidance Based on RSS Measurements

    Science.gov (United States)

    Masiero, A.; Fissore, F.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    In recent years, Unmanned Aerial Vehicles (UAVs) have attracted more and more attention in both the research and industrial communities: the possibility to use them in a wide range of remote sensing applications makes them a very flexible and attractive solution in both civil and commercial cases (e.g. precision agriculture, security and control, monitoring of sites, exploration of areas difficult to reach). Most existing UAV positioning systems rely on the GPS signal. Although this is a satisfactory solution in open environments where the GPS signal is available, there are several operating conditions of interest where it is unavailable or unreliable (e.g. close to high buildings or mountains, or in indoor environments). Consequently, a different approach has to be adopted in these cases. This paper considers the use of WiFi measurements to obtain position estimates of the device of interest. More specifically, to limit the costs for the devices involved in the positioning operations, an approach based on radio signal strength (RSS) measurements is considered. Thanks to the use of a Kalman filter, the proposed approach takes advantage of the temporal dynamics of the device of interest to improve the positioning results initially provided by maximum likelihood estimation. The UAVs are assumed to be equipped with communication devices that allow them to communicate with each other in order to improve their ability to cooperate. In particular, the collision avoidance problem is examined in this work.
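    A hedged sketch of the maximum-likelihood step under an assumed log-distance path-loss model (the Kalman-filter smoothing stage and all constants below are illustrative, not taken from the paper). With i.i.d. Gaussian RSS noise, the ML position minimizes the squared misfit between measured and predicted RSS; a coarse grid search keeps the sketch dependency-free:

```python
import numpy as np

P0, PLE = -40.0, 2.0           # dBm at 1 m and path-loss exponent (assumed)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

def predicted_rss(pos):
    """Log-distance path-loss model: rss_i = P0 - 10*PLE*log10(|pos - anchor_i|)."""
    d = np.linalg.norm(anchors - pos, axis=1)
    return P0 - 10.0 * PLE * np.log10(d)

def ml_estimate(rss, step=0.1):
    """ML position under i.i.d. Gaussian RSS noise = least-squares fit,
    found here by exhaustive grid search over the deployment area."""
    grid = np.arange(0.5, 9.51, step)         # stays clear of the anchors
    best, best_err = None, np.inf
    for x in grid:
        for y in grid:
            err = np.sum((predicted_rss(np.array([x, y])) - rss) ** 2)
            if err < best_err:
                best, best_err = np.array([x, y]), err
    return best

true_pos = np.array([3.0, 7.0])
est = ml_estimate(predicted_rss(true_pos))    # noise-free: recovers the position
```

    In the full scheme, these per-epoch ML estimates would feed a Kalman filter as noisy position observations, letting the UAV's motion model smooth the trajectory.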

  16. Extrapolated HPGe efficiency estimates based on a single calibration measurement

    International Nuclear Information System (INIS)

    Winn, W.G.

    1994-01-01

    Gamma spectroscopists often must analyze samples with geometries for which their detectors are not calibrated. The effort to experimentally recalibrate a detector for a new geometry can be quite time consuming, causing delay in reporting useful results. Such concerns have motivated development of a method for extrapolating HPGe efficiency estimates from an existing single measured efficiency. Overall, the method provides useful preliminary results for analyses that do not require exceptional accuracy, while reliably bracketing the credible range. The estimated efficiency ε for a uniform sample in a geometry with volume V is extrapolated from the measured ε0 of the base sample of volume V0. Assuming all samples are centered atop the detector for maximum efficiency, ε decreases monotonically as V increases about V0, and vice versa. Extrapolation of high and low efficiency estimates εh and εL provides an average estimate ε = (εh + εL)/2 ± (εh − εL)/2 (general), where the uncertainty Δε = (εh − εL)/2 brackets the maximum possible error. Both εh and εL diverge from ε0 as V deviates from V0, causing Δε to increase accordingly. The above concepts guided development of both conservative and refined estimates for ε
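    The bracketing rule above is simple enough to express directly; the following sketch (with made-up efficiency values) shows the arithmetic:

```python
def bracketed_efficiency(eff_high, eff_low):
    """Average of the high and low extrapolated efficiency estimates,
    with an uncertainty that brackets the maximum possible error:
    eff = (eff_h + eff_l)/2 +/- (eff_h - eff_l)/2."""
    eff = 0.5 * (eff_high + eff_low)
    d_eff = 0.5 * (eff_high - eff_low)
    return eff, d_eff

# Made-up high/low estimates extrapolated from a measured base geometry.
eff, d_eff = bracketed_efficiency(0.052, 0.044)   # 0.048 +/- 0.004
```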

  17. A computer-based measure of resultant achievement motivation.

    Science.gov (United States)

    Blankenship, V

    1987-08-01

    Three experiments were conducted to develop a computer-based measure of individual differences in resultant achievement motivation (RAM) on the basis of level-of-aspiration, achievement motivation, and dynamics-of-action theories. In Experiment 1, the number of atypical shifts and greater responsiveness to incentives on 21 trials with choices among easy, intermediate, and difficult levels of an achievement-oriented game were positively correlated and were found to differentiate the 62 subjects (31 men, 31 women) on the amount of time they spent at a nonachievement task (watching a color design) 1 week later. In Experiment 2, test-retest reliability was established with the use of 67 subjects (15 men, 52 women). Point and no-point trials were offered in blocks, with point trials first for half the subjects and no-point trials first for the other half. Reliability was higher for the atypical-shift measure than for the incentive-responsiveness measure and was higher when points were offered first. In Experiment 3, computer anxiety was manipulated by creating a simulated computer breakdown in the experimental condition. Fifty-nine subjects (13 men, 46 women) were randomly assigned to the experimental condition or to one of two control conditions (an interruption condition and a no-interruption condition). Subjects with low RAM, as demonstrated by a low number of typical shifts, took longer to choose the achievement-oriented task, as predicted by the dynamics-of-action theory. The difference was evident in all conditions and most striking in the computer-breakdown condition. A change of focus from atypical to typical shifts is discussed.

  18. A Dynamic Attitude Measurement System Based on LINS

    Directory of Open Access Journals (Sweden)

    Hanzhou Li

    2014-08-01

    Full Text Available A dynamic attitude measurement system (DAMS) is developed based on a laser inertial navigation system (LINS). Three factors of the dynamic attitude measurement error using LINS are analyzed: dynamic error, time synchronization and phase lag. An optimal coning errors compensation algorithm is used to reduce coning errors, and two-axis wobbling verification experiments are presented in the paper. The tests indicate that the attitude accuracy is improved 2-fold by the algorithm. In order to decrease coning errors further, the attitude updating frequency is increased from 200 Hz to 2000 Hz. At the same time, a novel finite impulse response (FIR) filter with three notches is designed to filter the dither frequency of the ring laser gyro (RLG). The comparison tests suggest that the new filter is five times more effective than the old one. The paper indicates that the phase-frequency characteristics of the FIR filter and the first-order holder of the navigation computer constitute the main sources of phase lag in LINS. A formula to calculate the LINS attitude phase lag is introduced in the paper. The expressions of dynamic attitude errors induced by phase lag are derived. The paper proposes a novel synchronization mechanism that is able to simultaneously solve the problems of dynamic test synchronization and phase compensation. A single-axis turntable and a laser interferometer are applied to verify the synchronization mechanism. The experimental results show that the theoretically calculated values of phase lag and the attitude error induced by phase lag both match the test data well. The block diagram of DAMS and physical photos are presented in the paper. The final experiments demonstrate that the real-time attitude measurement accuracy of DAMS can reach up to 20″ (1σ) and the synchronization error is less than 0.2 ms under three-axis wobbling for 10 min.
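    The paper's actual filter coefficients are not given; purely as an illustration of the idea, the sketch below is a generic frequency-sampling design of a linear-phase FIR filter with a notch at a hypothetical RLG dither frequency of 400 Hz, for the 2000 Hz update rate mentioned above. A three-notch version is obtained by passing three notch frequencies.

```python
import numpy as np

def fir_notch(numtaps, fs, notch_freqs, width=50.0):
    """Frequency-sampling design of a linear-phase FIR filter whose desired
    magnitude is 1 everywhere except 0 within +/-width Hz of each notch."""
    n_fft = 1024
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)
    desired = np.ones_like(freqs)
    for f0 in notch_freqs:
        desired[np.abs(freqs - f0) <= width] = 0.0
    h = np.fft.irfft(desired, n_fft)          # zero-phase impulse response
    h = np.roll(h, numtaps // 2)[:numtaps]    # center the main lobe
    return h * np.hamming(numtaps)            # taper to reduce leakage

def gain_at(h, f, fs):
    """Magnitude of the filter's frequency response at f hertz (direct DTFT)."""
    n = np.arange(len(h))
    return abs(np.sum(h * np.exp(-2j * np.pi * f / fs * n)))

# Hypothetical dither frequency of 400 Hz at the 2000 Hz update rate.
h = fir_notch(101, 2000.0, [400.0])
```

    The passband gain stays near unity while the notch band is strongly attenuated; a production design would of course be tuned to the gyro's actual dither spectrum.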

  19. A Dynamic Attitude Measurement System Based on LINS

    Science.gov (United States)

    Li, Hanzhou; Pan, Quan; Wang, Xiaoxu; Zhang, Juanni; Li, Jiang; Jiang, Xiangjun

    2014-01-01

    A dynamic attitude measurement system (DAMS) is developed based on a laser inertial navigation system (LINS). Three factors of the dynamic attitude measurement error using LINS are analyzed: dynamic error, time synchronization and phase lag. An optimal coning errors compensation algorithm is used to reduce coning errors, and two-axis wobbling verification experiments are presented in the paper. The tests indicate that the attitude accuracy is improved 2-fold by the algorithm. In order to decrease coning errors further, the attitude updating frequency is increased from 200 Hz to 2000 Hz. At the same time, a novel finite impulse response (FIR) filter with three notches is designed to filter the dither frequency of the ring laser gyro (RLG). The comparison tests suggest that the new filter is five times more effective than the old one. The paper indicates that the phase-frequency characteristics of the FIR filter and the first-order holder of the navigation computer constitute the main sources of phase lag in LINS. A formula to calculate the LINS attitude phase lag is introduced in the paper. The expressions of dynamic attitude errors induced by phase lag are derived. The paper proposes a novel synchronization mechanism that is able to simultaneously solve the problems of dynamic test synchronization and phase compensation. A single-axis turntable and a laser interferometer are applied to verify the synchronization mechanism. The experimental results show that the theoretically calculated values of phase lag and the attitude error induced by phase lag both match the test data well. The block diagram of DAMS and physical photos are presented in the paper. The final experiments demonstrate that the real-time attitude measurement accuracy of DAMS can reach up to 20″ (1σ) and the synchronization error is less than 0.2 ms under three-axis wobbling for 10 min. PMID:25177802

  20. Measuring Costs to Community-Based Agencies for Implementation of an Evidence-Based Practice.

    Science.gov (United States)

    Lang, Jason M; Connell, Christian M

    2017-01-01

    Healthcare reform has led to an increase in dissemination of evidence-based practices. Cost is frequently cited as a significant yet rarely studied barrier to dissemination of evidence-based practices and the associated improvements in quality of care. This study describes an approach to measuring the incremental, unreimbursed costs in staff time and direct costs to community-based clinics implementing an evidence-based practice through participating in a learning collaborative. Initial implementation costs exceeding those for providing "treatment as usual" were collected for ten clinics implementing trauma-focused cognitive behavioral therapy through participation in 10-month learning collaboratives. Incremental implementation costs of these ten community-based clinic teams averaged the equivalent of US$89,575 (US$ 2012). The most costly activities were training, supervision, preparation time, and implementation team meetings. Recommendations are made for further research on implementation costs, dissemination of evidence-based practices, and implications for researchers and policy makers.

  1. Defining Primary Care Shortage Areas: Do GIS-based Measures Yield Different Results?

    Science.gov (United States)

    Daly, Michael R; Mellor, Jennifer M; Millones, Marco

    2018-02-12

    To examine whether geographic information systems (GIS)-based physician-to-population ratios (PPRs) yield determinations of geographic primary care shortage areas that differ from those based on bounded-area PPRs like those used in the Health Professional Shortage Area (HPSA) designation process. We used geocoded data on primary care physician (PCP) locations and census block population counts from 1 US state to construct 2 shortage area indicators. The first is a bounded-area shortage indicator defined without GIS methods; the second is a GIS-based measure that measures the populations' spatial proximity to PCP locations. We examined agreement and disagreement between bounded shortage areas and GIS-based shortage areas. Bounded shortage area indicators and GIS-based shortage area indicators agree for the census blocks where the vast majority of our study populations reside. Specifically, 95% and 98% of the populations in our full and urban samples, respectively, reside in census blocks where the 2 indicators agree. Although agreement is generally high in rural areas (ie, 87% of the rural population reside in census blocks where the 2 indicators agree), agreement is significantly lower compared to urban areas. One source of disagreement suggests that bounded-area measures may "overlook" some shortages in rural areas; however, other aspects of the HPSA designation process likely mitigate this concern. Another source of disagreement arises from the border-crossing problem, and it is more prevalent. The GIS-based PPRs we employed would yield shortage area determinations that are similar to those based on bounded-area PPRs defined for Primary Care Service Areas. Disagreement rates were lower than previous studies have found. © 2018 National Rural Health Association.
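    As an illustration of the GIS-based indicator (not the authors' code), the sketch below flags a census block as a shortage area when the population per primary care physician within a travel radius exceeds the 3,500:1 ratio used in geographic HPSA designation; the coordinates, radius, and populations are invented.

```python
import numpy as np

# Invented inputs: census-block centroids (km) with populations, PCP locations.
blocks = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
pops = np.array([4000, 1500, 6000])
pcps = np.array([[1.0, 1.0], [9.0, 1.0]])

def gis_shortage(blocks, pops, pcps, radius=8.0, threshold=3500):
    """Flag a block as a shortage area when its population per physician
    within `radius` exceeds `threshold` (no physician in range => shortage)."""
    flags = []
    for b, p in zip(blocks, pops):
        n_pcp = int(np.sum(np.hypot(*(pcps - b).T) <= radius))
        flags.append(bool(n_pcp == 0 or p / n_pcp > threshold))
    return flags

flags = gis_shortage(blocks, pops, pcps)   # [True, False, True]
```

    The first block is in range of one physician but its PPR exceeds the threshold; the third block has no physician in range at all, illustrating the spatial-proximity case a bounded-area ratio can miss.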

  2. A technique for measuring oxygen saturation in biological tissues based on diffuse optical spectroscopy

    Science.gov (United States)

    Kleshnin, Mikhail; Orlova, Anna; Kirillin, Mikhail; Golubiatnikov, German; Turchin, Ilya

    2017-07-01

    A new approach to the optical measurement of blood oxygen saturation was developed and implemented. The technique is based on an original three-stage algorithm for reconstructing the relative concentrations of biological chromophores (hemoglobin, water, lipids) from the measured spectra of diffusely scattered light at different distances from the probing radiation source. Numerical experiments and validation of the proposed technique on a biological phantom showed high reconstruction accuracy and the possibility of correctly calculating hemoglobin oxygenation in the presence of additive noise and calibration errors. The results obtained in animal studies agree with previously published results of other research groups and demonstrate that the developed technique can be applied to monitor oxygen saturation in tumor tissue.
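    The linear core of such a reconstruction can be illustrated with a Beer-Lambert least-squares unmixing step; the extinction coefficients and concentrations below are invented, and the paper's actual three-stage algorithm is not reproduced.

```python
import numpy as np

# Hypothetical extinction coefficients at three wavelengths (rows) for
# [HbO2, Hb] (columns), in arbitrary units -- illustrative values only.
E = np.array([[0.8, 2.0],
              [1.0, 1.0],
              [2.2, 0.7]])

true_c = np.array([60.0, 40.0])      # "true" concentrations (arbitrary units)
mu_a = E @ true_c                    # absorption observed at 3 wavelengths

# Least-squares unmixing, then saturation = HbO2 / total hemoglobin.
c_hat, *_ = np.linalg.lstsq(E, mu_a, rcond=None)
sto2 = c_hat[0] / c_hat.sum()
```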

  3. Generalized sample entropy analysis for traffic signals based on similarity measure

    Science.gov (United States)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, matches signal patterns in a different way and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study that shows the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems by the similarity measure.
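    For reference, the baseline (unmodified) sample entropy that the paper generalizes can be computed as follows; this is a common formulation, not the authors' modified similarity-measure version.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Standard sample entropy: -ln(A/B), where B counts pairs of length-m
    templates and A pairs of length-(m+1) templates that match within a
    Chebyshev tolerance of r times the series' standard deviation."""
    x = np.asarray(x, float)
    tol = r * np.std(x)

    def match_count(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            c += int(np.sum(d <= tol))
        return c

    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

# A periodic signal is more predictable than noise, so its entropy is lower.
se_regular = sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 500)))
se_noise = sample_entropy(np.random.RandomState(0).randn(500))
```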

  4. At Home Photography-Based Method for Measuring Wrist Range of Motion.

    Science.gov (United States)

    Trehan, Samir K; Rancy, Schneider K; Johnsen, Parker H; Hillstrom, Howard J; Lee, Steve K; Wolfe, Scott W

    2017-11-01

    Purpose  To determine the reliability of wrist range of motion (WROM) measurements based on digital photographs taken by patients at home compared with traditional measurements done in the office with a goniometer. Methods  Sixty-nine patients were enrolled in this study at least 3 months postoperatively. Active and passive wrist flexion/extension and radial/ulnar deviation were recorded by one of the two attending surgeons with a 1-degree resolution goniometer at the last postoperative office visit. Patients were provided an illustrated instruction sheet detailing how to take digital photographic images at home in six wrist positions (active and passive flexion/extension, and radial/ulnar deviation). Wrist position was measured from the digital images by both attending surgeons in a randomized, blinded fashion on two separate occasions more than 2 weeks apart using the same goniometer. Reliability analysis was performed using the intraclass correlation coefficient to assess agreement between clinical and photography-based goniometry, as well as intra- and interobserver agreement. Results  Of the 69 enrolled patients, 30 (43%) sent digital images. Of the 180 digital photographs, only 9 (5%) were missing or deemed inadequate for WROM measurements. Agreement between clinical and photography-based measurements was "almost perfect" for passive wrist flexion/extension and "substantial" for active wrist flexion/extension and radial/ulnar deviation. Inter- and intraobserver agreement for the attending surgeons was "almost perfect" for all measurements. Discussion  This study validates a photography-based goniometry protocol allowing accurate and reliable WROM measurements without direct physician contact. Passive WROM was more accurately measured from photographs than active WROM. This study builds on previous photography-based goniometry literature by validating a protocol in which patients or their families take and submit their own
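    The study measured wrist position on the photographs with a physical goniometer; a digital alternative, shown here purely as an illustration, computes the joint angle from three landmark pixel coordinates clicked on the image.

```python
import numpy as np

def joint_angle(p_forearm, p_wrist, p_hand):
    """Angle (degrees) between the forearm and hand segments, measured at
    the wrist landmark, from pixel coordinates on a photograph."""
    u = np.asarray(p_forearm, float) - np.asarray(p_wrist, float)
    v = np.asarray(p_hand, float) - np.asarray(p_wrist, float)
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# A neutral wrist would read about 180 degrees; flexion reduces the angle.
angle = joint_angle((0, 100), (0, 0), (60, -80))
```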

  5. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and performed closest to, but before, admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) repeat values were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) repeat values at admission were outside a range considered acceptable for surgery (P less than 0.001 for the comparison of the frequency of clinically important abnormalities between patients with normal and abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.

  6. SAFEGUARDS ENVELOPE: PREVIOUS WORK AND EXAMPLES

    International Nuclear Information System (INIS)

    Metcalf, Richard; Bevill, Aaron; Charlton, William; Bean, Robert

    2008-01-01

    The future expansion of nuclear power will require not just electricity production but fuel cycle facilities such as fuel fabrication and reprocessing plants. As large reprocessing facilities are built in various states, they must be constructed and operated in a manner that minimizes the risk of nuclear proliferation. Process monitoring has returned to the spotlight as an added measure that can increase confidence in the safeguards of special nuclear material (SNM). Process monitoring can be demonstrated to lengthen the allowable inventory period by reducing accountancy requirements, and to reduce false positive indications. The next logical step is the creation of a Safeguards Envelope, a set of operational parameters and models that maximize anomaly detection and the inventory period through process monitoring while minimizing operator impact and false positive rates. A brief example of a rudimentary Safeguards Envelope is presented and shown to detect synthetic diversions overlaid on a measured processing plant data set. This demonstration Safeguards Envelope is shown to increase confidence that no SNM has been diverted, with minimal operator impact, even though it is based on an information-sparse environment. While the foundation on which a full Safeguards Envelope can be built has been laid in historical demonstrations of process monitoring, several requirements remain unfulfilled. Future work will require reprocessing plant transient models, inclusion of 'non-traditional' operating data, and exploration of new methods of identifying subtle events in transient processes.
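    Anomaly detection over process-monitoring data ultimately reduces to statistical tests on measurement sequences. As a toy illustration (not the paper's method), a one-sided CUSUM over a synthetic material-balance sequence detects a protracted diversion that single-period tests would miss; all numbers are invented.

```python
import numpy as np

def cusum_alarm(balances, sigma, k=0.5, h=5.0):
    """One-sided Page CUSUM over per-period material balances (e.g. MUF),
    normalized by the measurement uncertainty sigma. Returns the first
    alarm index, or -1 if none. k is the allowance and h the decision
    threshold, both in sigma units."""
    s = 0.0
    for i, x in enumerate(balances):
        s = max(0.0, s + x / sigma - k)
        if s > h:
            return i
    return -1

rng = np.random.RandomState(1)
clean = rng.normal(0.0, 1.0, 50)       # in-control measurement noise
diverted = clean.copy()
diverted[30:] += 1.5                   # synthetic protracted diversion
```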

  7. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    Full Text Available In order to meet commercial banks' liquidity, safety and profitability requirements, loan portfolio optimization decisions based on risk analysis support the rational allocation of assets. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. Constraining the loan portfolio rate of return with Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) in the optimization decision model reflects the bank's risk tolerance and gives the bank direct control of its potential loss. The paper analyzes a general risk management model applied to portfolio problems with VaR and CVaR risk measures using the Lagrangian algorithm, and solves this highly difficult problem by matrix operations. The resulting formulation makes it easy to see that the efficient frontier of the portfolio problem with VaR and CVaR risk measures is a hyperbola in mean-standard deviation space, and the proposed method is easy to calculate.
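    The two risk measures themselves are easy to compute empirically. The following minimal sketch (with an invented loss sample) shows VaR as a loss quantile and CVaR as the mean loss beyond it; the paper's Lagrangian portfolio optimization is not reproduced.

```python
import numpy as np

def var_cvar(losses, alpha=0.95):
    """Empirical VaR and CVaR at confidence level alpha: VaR is the
    alpha-quantile of the loss sample, CVaR the mean loss beyond VaR."""
    losses = np.sort(np.asarray(losses, float))
    var = losses[int(np.ceil(alpha * len(losses))) - 1]
    cvar = losses[losses >= var].mean()
    return var, cvar

# Invented loss sample 1..10: CVaR is always at least as large as VaR.
var, cvar = var_cvar(np.arange(1.0, 11.0), alpha=0.9)   # 9.0 and 9.5
```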

  8. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push, and mobile computing allows for the creation and delivery of a new type of cloud service. Following the cloud computing paradigm, this paper presents a cloud-based remote measurement and analysis system. The system consists of three main parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. The system is a special website developed using asp.net and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform supplies condition monitoring and data analysis services to customers over the Internet and is deployed on a cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and pushes the monitoring data to the cloud storage database regularly. Data acquisition equipment in this system only needs data collection and networking functions, as found in smartphones and smart sensors. The system's scale can adjust dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  9. Video-based measurements for wireless capsule endoscope tracking

    International Nuclear Information System (INIS)

    Spyrou, Evaggelos; Iakovidis, Dimitris K

    2014-01-01

    The wireless capsule endoscope is a swallowable medical device equipped with a miniature camera enabling the visual examination of the gastrointestinal (GI) tract. It wirelessly transmits thousands of images to an external video recording system, while its location and orientation are being tracked approximately by external sensor arrays. In this paper we investigate a video-based approach to tracking the capsule endoscope without requiring any external equipment. The proposed method involves extraction of speeded up robust features from video frames, registration of consecutive frames based on the random sample consensus algorithm, and estimation of the displacement and rotation of interest points within these frames. The results obtained by the application of this method on wireless capsule endoscopy videos indicate its effectiveness and improved performance over the state of the art. The findings of this research pave the way for a cost-effective localization and travel distance measurement of capsule endoscopes in the GI tract, which could contribute in the planning of more accurate surgical interventions. (paper)
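    The registration step can be illustrated without OpenCV: given interest-point correspondences between consecutive frames (feature extraction omitted), a RANSAC loop over minimal samples estimates displacement and rotation robustly against mismatches. This is a generic sketch with synthetic data, not the authors' implementation.

```python
import numpy as np

def estimate_rigid(src, dst):
    """Least-squares rotation + translation mapping src -> dst (2D Kabsch)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    return R, cd - R @ cs

def ransac_rigid(src, dst, iters=200, tol=1.0, seed=0):
    """RANSAC over 2-point minimal samples; refit on the best inlier set."""
    rng = np.random.RandomState(seed)
    best = None
    for _ in range(iters):
        idx = rng.choice(len(src), 2, replace=False)
        R, t = estimate_rigid(src[idx], dst[idx])
        inliers = np.linalg.norm((src @ R.T + t) - dst, axis=1) < tol
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return estimate_rigid(src[best], dst[best])

# Synthetic frame pair: rotate 10 degrees, translate (5, -3), 5 mismatches.
rng = np.random.RandomState(2)
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = rng.uniform(0.0, 100.0, (30, 2))
dst = src @ R_true.T + np.array([5.0, -3.0])
dst[:5] += rng.uniform(30.0, 60.0, (5, 2))   # corrupted correspondences
R, t = ransac_rigid(src, dst)
angle = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
```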

  10. Video-based measurements for wireless capsule endoscope tracking

    Science.gov (United States)

    Spyrou, Evaggelos; Iakovidis, Dimitris K.

    2014-01-01

    The wireless capsule endoscope is a swallowable medical device equipped with a miniature camera enabling the visual examination of the gastrointestinal (GI) tract. It wirelessly transmits thousands of images to an external video recording system, while its location and orientation are being tracked approximately by external sensor arrays. In this paper we investigate a video-based approach to tracking the capsule endoscope without requiring any external equipment. The proposed method involves extraction of speeded up robust features from video frames, registration of consecutive frames based on the random sample consensus algorithm, and estimation of the displacement and rotation of interest points within these frames. The results obtained by the application of this method on wireless capsule endoscopy videos indicate its effectiveness and improved performance over the state of the art. The findings of this research pave the way for a cost-effective localization and travel distance measurement of capsule endoscopes in the GI tract, which could contribute in the planning of more accurate surgical interventions.

  11. Potentiometric measurement of polymer-membrane electrodes based on lanthanum

    Energy Technology Data Exchange (ETDEWEB)

    Saefurohman, Asep, E-mail: saefurohman.asep78@Gmail.com; Buchari; Noviandri, Indra [Department of Chemistry, Bandung Institute of Technology (Indonesia)]; Syoni [Department of Metallurgy Engineering, Bandung Institute of Technology (Indonesia)]

    2014-03-24

    Inductively coupled plasma atomic emission spectroscopy (ICP-AES) is regarded as the standard method for quantitative analysis of rare earth elements, with high accuracy and detection limits on the order of ppm, but the instrumentation is expensive and the cost of analysis is high. In this study, a selective electrode for the potentiometric determination of rare earth ions was made and characterized. The membrane fabrication technique studied is based on immersion (liquid impregnated membrane) of PTFE with 0.5 pore size. Tributyl phosphate (TBP) and bis(2-ethylhexyl) hydrogen phosphate were used as ionophores. There is no previous report of TBP being used as an ionophore in a lanthanum-based polymeric membrane. Parameters that affect the performance of the membrane electrode, such as membrane composition, membrane thickness, and type of membrane material, were studied in this research. A lanthanum (La) ion selective electrode (ISE) was fabricated by impregnating the membrane with TBP in kerosene solution and showed good performance as an ISE-La. FTIR spectra of TBP-impregnated PTFE (0.5 pore size) and blank PTFE differed at the peaks at 1257 cm⁻¹, 1031 cm⁻¹ and 794.7 cm⁻¹, corresponding to P=O stretching and P−O−C stretching of the −O−P=O group. A shift of the P=O stretching wave number of the −O−P=O group in the PTFE-TBP mixture to a peak at 1230 cm⁻¹ indicated that there is no bonding interaction between hydroxyl groups of the molecules and the phosphoryl group of TBP (R₃P=O). The membrane had stable responses in the pH range between 1 and 9. Good responses were obtained using a 10⁻³ M La(III) internal solution, which produced relatively high potentials. The ISE-La showed relatively good performance: the electrode had a response time of 29±4.5 seconds and could be used for 50 days, and its linear range was between 10⁻⁵ and 10⁻¹ M.
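    For context, an ideal ISE for a trivalent ion such as La(III) follows the Nernst equation with a slope of about 19.7 mV per decade of activity at 25 °C; real electrodes are often sub-Nernstian, and the standard potential below is hypothetical.

```python
import numpy as np

# Ideal Nernstian slope for a trivalent cation (z = 3) at 25 C.
R, T, F, z = 8.314, 298.15, 96485.0, 3
slope = 2.303 * R * T / (z * F)        # about 0.0197 V per decade

def electrode_potential(activity, e0=0.30):
    """Ideal ISE response E = E0 + slope*log10(a); e0 is hypothetical."""
    return e0 + slope * np.log10(activity)
```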

  12. Node-based measures of connectivity in genetic networks.

    Science.gov (United States)

    Koen, Erin L; Bowman, Jeff; Wilson, Paul J

    2016-01-01

    At-site environmental conditions can have strong influences on genetic connectivity, and in particular on the immigration and settlement phases of dispersal. However, at-site processes are rarely explored in landscape genetic analyses. Networks can facilitate the study of at-site processes, where network nodes are used to model site-level effects. We used simulated genetic networks to compare and contrast the performance of 7 node-based (as opposed to edge-based) genetic connectivity metrics. We simulated increasing node connectivity by varying migration in two ways: we increased the number of migrants moving between a focal node and a set number of recipient nodes, and we increased the number of recipient nodes receiving a set number of migrants. We found that two metrics in particular, the average edge weight and the average inverse edge weight, varied linearly with simulated connectivity. Conversely, node degree was not a good measure of connectivity. We demonstrated the use of average inverse edge weight to describe the influence of at-site habitat characteristics on genetic connectivity of 653 American martens (Martes americana) in Ontario, Canada. We found that highly connected nodes had high habitat quality for marten (deep snow and high proportions of coniferous and mature forest) and were farther from the range edge. We recommend the use of node-based genetic connectivity metrics, in particular, average edge weight or average inverse edge weight, to model the influences of at-site habitat conditions on the immigration and settlement phases of dispersal. © 2015 John Wiley & Sons Ltd.
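    The two best-performing node metrics are straightforward to compute from a matrix of pairwise genetic distances; the toy network below is invented, not the marten data.

```python
import numpy as np

def node_connectivity_metrics(W):
    """Per-node average edge weight and average inverse edge weight from a
    symmetric matrix of pairwise genetic distances (0 = no edge / self)."""
    W = np.asarray(W, float)
    avg_w, avg_inv = [], []
    for i in range(len(W)):
        w = W[i][W[i] > 0]              # weights of node i's edges
        avg_w.append(w.mean())
        avg_inv.append((1.0 / w).mean())
    return np.array(avg_w), np.array(avg_inv)

# Toy network: node 0 is genetically close to both neighbours, so it has
# the lowest average edge weight and the highest average inverse weight.
W = np.array([[0.0, 0.1, 0.2],
              [0.1, 0.0, 0.8],
              [0.2, 0.8, 0.0]])
avg_w, avg_inv = node_connectivity_metrics(W)
```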

  13. Automated criterion-based analysis for Cole parameters assessment from cerebral neonatal electrical bioimpedance spectroscopy measurements

    International Nuclear Information System (INIS)

    Seoane, F; Lindecrantz, Kaj; Ward, L C; Lingwood, B E

    2012-01-01

    Hypothermia has been proven as an effective rescue therapy for infants with moderate or severe neonatal hypoxic ischemic encephalopathy. Hypoxia-ischemia alters the electrical impedance characteristics of the brain in neonates; therefore, spectroscopic analysis of the cerebral bioimpedance of the neonate may be useful for the detection of candidate neonates eligible for hypothermia treatment. Currently, in addition to the lack of reference bioimpedance data obtained from healthy neonates, there is no standardized approach established for bioimpedance spectroscopy data analysis. In this work, cerebral bioimpedance measurements (12 h postpartum) in a cross-section of 84 term and near-term healthy neonates were performed at the bedside in the post-natal ward. To characterize the impedance spectra, the Cole parameters (R0, R∞, fC and α) were extracted from the obtained measurements using an analysis process based on a best measurement and highest likelihood selection process. The results obtained in this study complement previously reported work and provide a standardized criterion-based method for data analysis. The availability of electrical bioimpedance spectroscopy reference data and the automatic criterion-based analysis method might support the development of a non-invasive method for prompt selection of neonates eligible for cerebral hypothermic rescue therapy. (paper)
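    The four Cole parameters describe the impedance spectrum through the Cole model Z(f) = R∞ + (R0 − R∞)/(1 + (jf/fC)^α); the small sketch below uses hypothetical tissue-like parameter values, not the neonatal reference data.

```python
import numpy as np

def cole_impedance(f, r0, rinf, fc, alpha):
    """Cole model: Z(f) = Rinf + (R0 - Rinf) / (1 + (j*f/fc)**alpha)."""
    return rinf + (r0 - rinf) / (1.0 + (1j * f / fc) ** alpha)

# Hypothetical parameters: R0 = 80 ohm, Rinf = 30 ohm, fc = 50 kHz, alpha = 0.7.
z_low = cole_impedance(1e-3, 80.0, 30.0, 50e3, 0.7)    # -> approx R0
z_high = cole_impedance(1e12, 80.0, 30.0, 50e3, 0.7)   # -> approx Rinf
```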

  14. Application of model-based and knowledge-based measuring methods as analytical redundancy

    International Nuclear Information System (INIS)

    Hampel, R.; Kaestner, W.; Chaker, N.; Vandreier, B.

    1997-01-01

    The safe operation of nuclear power plants requires the application of modern and intelligent methods of signal processing for normal operation as well as for the management of accident conditions. Such modern and intelligent methods are model-based and knowledge-based ones, founded on analytical knowledge (mathematical models) as well as experience (fuzzy information). In addition to the existing hardware redundancies, analytical redundancies will be established with the help of these modern methods. These analytical redundancies support the operating staff during decision-making. The design of a hybrid model-based and knowledge-based measuring method is demonstrated by the example of a fuzzy-supported observer, in which a classical linear observer is combined with a fuzzy-supported adaptation of the model matrices of the observer model. This application is realized for the estimation of non-measurable variables such as steam content and mixture level within pressure vessels containing a water-steam mixture during accidental depressurizations. For this example the existing non-linearities are classified and the verification of the model is explained. The advantages of the hybrid method in comparison to classical model-based measuring methods are demonstrated by the estimation results. The consideration of the parameters which have an important influence on the non-linearities requires the inclusion of high-dimensional structures of fuzzy logic within the model-based measuring methods. Therefore, methods are presented which allow the conversion of these high-dimensional structures to two-dimensional structures of fuzzy logic. As an efficient solution to this problem, a method based on cascaded fuzzy controllers is presented. (author). 2 refs, 12 figs, 5 tabs
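    The classical linear-observer core (without the fuzzy adaptation described above) can be sketched as a Luenberger observer that reconstructs a non-measured state from the measured one; the plant matrices, input, and observer gain below are invented for illustration only.

```python
import numpy as np

# Illustrative discrete-time plant: x = [level (measured), steam content
# (not measured)]; all values are made up for this sketch.
A = np.array([[0.95, 0.10],
              [0.00, 0.90]])
B = np.array([0.0, 0.1])
C = np.array([[1.0, 0.0]])            # only the first state is measured
L = np.array([0.5, 0.8])              # observer gain: A - L*C must be stable

x = np.array([0.0, 0.0])              # true plant state
xh = np.array([1.0, -1.0])            # observer starts with a wrong guess
u = 1.0                               # constant input

for _ in range(200):
    y = (C @ x).item()                # measurement from the true plant
    xh = A @ xh + B * u + L * (y - (C @ xh).item())
    x = A @ x + B * u
```

    The estimation error obeys e(k+1) = (A − LC) e(k), so with a stabilizing gain the observer's estimate of the non-measured state converges to the true value.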

  15. Automatic electromagnetic valve for previous vacuum

    International Nuclear Information System (INIS)

    Granados, C. E.; Martin, F.

    1959-01-01

A valve is described which maintains the vacuum of an installation when the electric current fails. It also admits air into the fore-vacuum (backing) pump to prevent oil from rising into the vacuum tubes. (Author)

  16. Grid Based Integration Technologies of Virtual Measurement System

    International Nuclear Information System (INIS)

    Zhang, D P; He, L S; Yang, H

    2006-01-01

This paper presents a novel integrated architecture for measurement systems that addresses the new requirements of measurement collaboration, measurement-resource interconnection and transparent access in wide-area, cross-organization settings in the context of a grid. The complexity of integration on a grid arises from the scale, dynamism, autonomy, and distribution of the measurement resources. The main argument of this paper is that these complexities should be made transparent to collaborative measurement via flexible reconfigurable mechanisms and dynamic virtualization services. The paper begins by discussing the integration-oriented measurement architecture, which provides collaborative measurement services over distributed measurement resources, and then discusses the measurement mechanisms that implement transparent access and collaboration of measurement resources by providing protocols, measurement scheduling and a global data-driven model

  17. Equations based on anthropometry to predict body fat measured by absorptiometry in schoolchildren and adolescents.

    Science.gov (United States)

    Ortiz-Hernández, Luis; Vega López, A Valeria; Ramos-Ibáñez, Norma; Cázares Lara, L Joana; Medina Gómez, R Joab; Pérez-Salgado, Diana

To develop and validate equations to estimate the percentage of body fat of children and adolescents from Mexico using anthropometric measurements. A cross-sectional study was carried out with 601 children and adolescents from Mexico aged 5-19 years. The participants were randomly divided into the following two groups: the development sample (n=398) and the validation sample (n=203). The validity of previously published equations (e.g., Slaughter) was also assessed. The percentage of body fat was estimated by dual-energy X-ray absorptiometry. The anthropometric measurements included height, sitting height, weight, waist and arm circumferences, skinfolds (triceps, biceps, subscapular, supra-iliac, and calf), and elbow and bitrochanteric breadth. Linear regression models were estimated with the percentage of body fat as the dependent variable and the anthropometric measurements as the independent variables. Equations were created based on combinations of six to nine anthropometric variables and had coefficients of determination (r²) equal to or higher than 92.4% for boys and 85.8% for girls. In the validation sample, the developed equations had high r² values (≥85.6% in boys and ≥78.1% in girls) in all age groups, low standard errors (SE≤3.05% in boys and ≤3.52% in girls), and the intercepts were not different from the origin (p>0.050). Using the previously published equations, the coefficients of determination were lower, and/or the intercepts were different from the origin. The equations developed in this study can be used to assess the percentage of body fat of Mexican schoolchildren and adolescents, as they demonstrate greater validity and lower error compared with previously published equations. Copyright © 2017 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
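
The development-sample procedure (linear regression of the DXA body-fat percentage on anthropometric predictors, then r² computed on the fitted model) can be sketched as follows; the data and coefficients below are synthetic, not the published equations:

```python
import numpy as np

# Synthetic anthropometric predictors (hypothetical plausible ranges)
rng = np.random.default_rng(0)
n = 200
triceps = rng.uniform(5, 30, n)      # skinfold, mm
subscap = rng.uniform(5, 25, n)      # skinfold, mm
waist = rng.uniform(50, 100, n)      # circumference, cm
# Synthetic %body fat with invented coefficients plus noise
pbf = 2.0 + 0.8 * triceps + 0.5 * subscap + 0.1 * waist + rng.normal(0, 2, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), triceps, subscap, waist])
beta, *_ = np.linalg.lstsq(X, pbf, rcond=None)

# Coefficient of determination r^2 on the development sample
pred = X @ beta
ss_res = np.sum((pbf - pred) ** 2)
ss_tot = np.sum((pbf - pbf.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))
```

Validation then repeats the r² and standard-error computation on the held-out sample using the frozen coefficients.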

  18. Neutrosophic Refined Similarity Measure Based on Cosine Function

    Directory of Open Access Journals (Sweden)

    Said Broumi

    2014-12-01

In this paper, a cosine similarity measure for neutrosophic refined (multi-) sets is proposed and its properties are studied. This measure extends the improved cosine similarity measure for single-valued neutrosophic sets. Finally, using this cosine similarity measure for neutrosophic refined sets, an application to medical diagnosis is presented.
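
A minimal sketch of the underlying measure, assuming the standard form for single-valued neutrosophic sets: each element is a (truth, indeterminacy, falsity) triple, and the per-element cosine similarities are averaged (the refined version additionally averages over sub-components):

```python
import math

def cosine_similarity(A, B):
    """Average element-wise cosine similarity of two neutrosophic sets."""
    total = 0.0
    for (t1, i1, f1), (t2, i2, f2) in zip(A, B):
        num = t1 * t2 + i1 * i2 + f1 * f2
        den = (math.sqrt(t1**2 + i1**2 + f1**2)
               * math.sqrt(t2**2 + i2**2 + f2**2))
        total += num / den
    return total / len(A)

A = [(0.8, 0.1, 0.1), (0.6, 0.3, 0.2)]   # hypothetical symptom profile
B = [(0.7, 0.2, 0.1), (0.5, 0.4, 0.3)]   # hypothetical disease profile
print(round(cosine_similarity(A, B), 4))
```

In the medical-diagnosis application, the diagnosis assigned is the disease profile with the highest similarity to the patient's symptom profile.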

  19. Improved Ordinary Measure and Image Entropy Theory based intelligent Copy Detection Method

    Directory of Open Access Journals (Sweden)

    Dengpan Ye

    2011-10-01

Nowadays, more and more multimedia websites appear in social networks, bringing security problems such as privacy, piracy and disclosure of sensitive content. Aiming at copyright protection, copy-detection technology for multimedia content has become a hot topic. In our previous work, a new computer-based copyright control system for detecting media was proposed. Building on that system, this paper proposes an improved media-feature matching measure and an entropy-based copy-detection method. The Levenshtein distance is used to enhance the matching degree when comparing features in copy detection. For entropy-based copy detection, we fuse two features of the entropy matrix: first, we extract the entropy matrix of the image and normalize it; then we fuse the eigenvalue feature and the transfer-matrix feature of the entropy matrix. The fused features are used for image copy detection. Experiments show that, compared with using either feature alone, the fused-feature matching method is clearly more robust and effective. The fused feature gives a high detection rate for copy images that have undergone attacks such as noise, compression, zooming and rotation. Compared with the referenced methods, the proposed method is more intelligent and achieves good performance.
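
The Levenshtein distance used for the feature-matching step is the classical dynamic-programming edit distance over insertions, deletions and substitutions; a minimal implementation:

```python
def levenshtein(a, b):
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

# Matching degree between two (hypothetical) media feature strings
print(levenshtein("kitten", "sitting"))  # → 3
```

A smaller distance between two feature strings indicates a higher matching degree between the corresponding media items.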

  20. Geometry of X-ray based measurement of residual strain at desired penetration depth

    Energy Technology Data Exchange (ETDEWEB)

    Morawiec, A. [Polish Academy of Sciences, Institute of Metallurgy and Materials Science, Krakow (Poland)

    2017-10-15

X-ray based measurement of residual lattice strains at a chosen penetration depth is one of the methods for investigating strain inhomogeneities in near-surface layers of polycrystalline materials. The measurement relies on determining shifts of Bragg peaks for various directions of the scattering vector with respect to the specimen. At each of these directions, a proper specimen orientation is required to reach a given penetration depth. The task of determining such orientations, albeit elementary, is quite intricate. The existing literature describes only partial solutions with unspecified domains of application, which fail when applied beyond those domains. Therefore, the geometric aspects of the measurement are analyzed in detail. Explicit bounds on the measurement parameters are given. The equation fundamental to the procedure is solved with respect to specimen orientations. For a given direction of the scattering vector, there are generally four different specimen orientations leading to the same penetration depth. This simple fact (overlooked in previous analyses) can be used to improve the reliability of measurement results. Analytical formulas for the goniometer angles representing these orientations are provided. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
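
Although the abstract does not reproduce the underlying equations, the standard textbook relation for reflection geometry (assumed here rather than taken from the paper) links the effective penetration depth $\tau$ to the beam entrance angle $\alpha$, exit angle $\beta$ and linear attenuation coefficient $\mu$:

\[
\tau = \frac{\sin\alpha \,\sin\beta}{\mu\,(\sin\alpha + \sin\beta)}
\]

Prescribing $\tau$ for each direction of the scattering vector therefore constrains the admissible specimen orientations, which is the kind of equation the paper solves for the goniometer angles.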

  1. Non-invasive tissue temperature measurements based on quantitative diffuse optical spectroscopy (DOS) of water

    Energy Technology Data Exchange (ETDEWEB)

    Chung, S H [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Cerussi, A E; Tromberg, B J [Beckman Laser Institute and Medical Clinic, University of California, Irvine, 1002 Health Sciences Road, Irvine 92612, CA (United States); Merritt, S I [Masimo Corporation, 40 Parker, Irvine, CA 92618 (United States); Ruth, J, E-mail: bjtrombe@uci.ed [Department of Bioengineering, University of Pennsylvania, 210 S. 33rd Street, Room 240, Skirkanich Hall, Philadelphia, PA 19104 (United States)

    2010-07-07

We describe the development of a non-invasive method for quantitative tissue temperature measurements using broadband diffuse optical spectroscopy (DOS). Our approach is based on well-characterized opposing shifts in near-infrared (NIR) water absorption spectra that appear with temperature and macromolecular binding state. Unlike conventional reflectance methods, DOS is used to generate scattering-corrected tissue water absorption spectra. This allows us to separate the macromolecular bound water contribution from the thermally induced spectral shift using the temperature isosbestic point at 996 nm. The method was validated in intralipid tissue phantoms by correlating DOS with thermistor measurements (R = 0.96) with a difference of 1.1 ± 0.91 °C over a range of 28-48 °C. Once validated, thermal and hemodynamic (i.e. oxy- and deoxy-hemoglobin concentration) changes were measured simultaneously and continuously in human subjects (forearm) during mild cold stress. DOS-measured arm temperatures were consistent with previously reported invasive deep tissue temperature studies. These results suggest that DOS can be used for non-invasive, co-registered measurements of absolute temperature and hemoglobin parameters in thick tissues, a potentially important approach for optimizing thermal diagnostics and therapeutics.

  2. Sorption isotherms: A review on physical bases, modeling and measurement

    Energy Technology Data Exchange (ETDEWEB)

    Limousin, G. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France) and Laboratoire d' etude des Transferts en Hydrologie et Environnement (CNRS-INPG-IRD-UJF), BP 53, 38041 Grenoble Cedex (France)]. E-mail: guillaumelimousin@yahoo.fr; Gaudet, J.-P. [Laboratoire d' etude des Transferts en Hydrologie et Environnement (CNRS-INPG-IRD-UJF), BP 53, 38041 Grenoble Cedex (France); Charlet, L. [Laboratoire de Geophysique Interne et Techtonophysique - CNRS-IRD-LCPC-UJF-Universite de Savoie, BP 53, 38041 Grenoble Cedex (France); Szenknect, S. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France); Barthes, V. [Atomic Energy Commission, Tracers Technology Laboratory, 38054 Grenoble Cedex (France); Krimissa, M. [Electricite de France, Division Recherche et Developpement, Laboratoire National d' Hydraulique et d' Environnement - P78, 6 quai Watier, 78401 Chatou (France)

    2007-02-15

The retention (or release) of a liquid compound on a solid controls the mobility of many substances in the environment and has been quantified in terms of the 'sorption isotherm'. This paper does not review the different sorption mechanisms. It presents the physical bases underlying the definition of a sorption isotherm, different empirical or mechanistic models, and details several experimental methods to acquire a sorption isotherm. For appropriate measurements and interpretations of isotherm data, this review emphasizes four main points: (i) the adsorption (or desorption) isotherm does not automatically provide any information about the reactions involved in the sorption phenomenon, so mechanistic interpretations must be carefully verified. (ii) Among studies, the range of reaction times is extremely wide, which can lead to misinterpretations regarding the irreversibility of the reaction: a pseudo-hysteresis of release compared with retention is often observed. Comparing the mean characteristic time of the reaction with the mean residence time of the mobile phase in the natural system makes it possible to determine whether the studied retention/release phenomenon should be considered instantaneous and reversible, almost irreversible, or whether reaction kinetics must be taken into account. (iii) When the concentration of the retained substance is low enough, the composition of the bulk solution remains constant and a single-species isotherm is often sufficient, although it remains strongly dependent on the background medium. At higher concentrations, sorption may be driven by competition between several species that affect the composition of the bulk solution. (iv) The measurement method has a great influence. In particular, the background ionic medium, the solid/solution ratio and the use of a flow-through or closed reactor are of major importance. The chosen method should balance ease of use against representativity of the studied
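
As a concrete instance of the empirical models such a review covers, the following sketch fits the Langmuir isotherm q = q_max·K·C/(1 + K·C) by its common linearization C/q = C/q_max + 1/(K·q_max), on synthetic noise-free data (parameter values are illustrative):

```python
import numpy as np

# Generate synthetic equilibrium data from a known Langmuir isotherm
q_max_true, K_true = 10.0, 0.5
C = np.linspace(0.1, 20.0, 15)                  # equilibrium concentration
q = q_max_true * K_true * C / (1 + K_true * C)  # sorbed amount per unit solid

# Linearized fit: C/q is linear in C with slope 1/q_max
# and intercept 1/(K * q_max)
slope, intercept = np.polyfit(C, C / q, 1)
q_max_fit = 1.0 / slope
K_fit = slope / intercept
print(round(q_max_fit, 2), round(K_fit, 2))
```

With real (noisy) data, a direct nonlinear least-squares fit of the isotherm is usually preferred over the linearization, which distorts the error structure.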

  3. PC-based hardware and software for tracer measurements

    International Nuclear Information System (INIS)

    Kaemaeraeinen, V.J.; Kall, Leif; Kaeki, Arvo

    1990-01-01

Cheap, efficient personal computers can be used for both measurement and analysis. The results can be calculated immediately after the measurements are made, fully exploiting the real-time measuring capabilities of tracer techniques. In the analysis phase the measurement information is visualized using graphical methods. The programs are menu driven to make them easy to use and adaptable to field conditions. The measuring equipment is modular for easy installation and maintenance. (author)

  4. GIS Based Measurement and Regulatory Zoning of Urban Ecological Vulnerability

    Directory of Open Access Journals (Sweden)

    Xiaorui Zhang

    2015-07-01

Urban ecological vulnerability is measured on the basis of ecological sensitivity and resilience, following a concept analysis of vulnerability. GIS-based multicriteria decision analysis (GIS-MCDA) methods, supported by the spatial analysis tools of GIS, are used to define different levels of vulnerability for areas of the urban ecology. These areas are further classified into different types of regulatory zones. Taking the city of Hefei in China as the empirical research site, this study uses GIS-MCDA, including the index system, index weights and overlay rules, to measure the degree of its ecological vulnerability on the GIS platform. There are eight indices in the system. Ranking and analytical hierarchy process (AHP) methods are used to calculate index weights according to the characteristics of the index system. The integrated overlay rule, including selection of the maximum value, and weighted linear combination (WLC) are applied as the overlay rules. In this way, five types of vulnerability areas have been classified: very low, low, medium, high and very high vulnerability. They can be further grouped into three types of regulatory zone: ecological green line, ecological grey line and ecological red line. The study demonstrates that ecological green line areas are the largest (53.61% of the total study area) and can be intensively developed; ecological grey line areas (19.59% of the total area) can serve as an ecological buffer zone; and ecological red line areas (26.80%) cannot be developed and must be protected. The results indicate that ecological green line areas may provide sufficient room for future urban development in Hefei city. Finally, the respective regulatory countermeasures are put forward. This research provides a scientific basis for decision-making around urban ecological protection, construction and sustainable development. It also provides a theoretical method
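
The weighted linear combination (WLC) overlay at the core of the GIS-MCDA step can be sketched as follows; the weights, criterion layers and class breaks below are illustrative, not the study's values:

```python
import numpy as np

# Criterion weights, e.g. as produced by AHP pairwise comparison (invented)
weights = np.array([0.4, 0.35, 0.25])

# Three criterion layers over a tiny 2x3 raster, each normalized to [0, 1]
layers = np.array([
    [[0.2, 0.9, 0.5], [0.1, 0.7, 0.3]],
    [[0.4, 0.8, 0.6], [0.2, 0.5, 0.9]],
    [[0.1, 0.6, 0.7], [0.3, 0.4, 0.8]],
])

# WLC overlay: per-cell vulnerability score = sum_k w_k * layer_k
score = np.tensordot(weights, layers, axes=1)

# Classify into five vulnerability levels by equal-interval breaks
levels = np.digitize(score, [0.2, 0.4, 0.6, 0.8])  # 0=very low .. 4=very high
print(score.round(3))
print(levels)
```

In a real GIS workflow the same arithmetic runs over full raster layers, and the level classes are then grouped into the green/grey/red regulatory zones.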

  5. EVALUATION ON THE SEISMIC RESPONSE CHARACTERISTICS OF A ROAD EMBANKMENT BASED ON THE MODERATE EARTHQUAKE OBSERVATION AND THE MICROTREMOR MEASUREMENT

    Science.gov (United States)

    Hata, Yoshiya; Ichii, Koji; Yamada, Masayuki; Tokida, Ken-Ichi; Takezawa, Koichiro; Shibao, Susumu; Mitsushita, Junji; Murata, Akira; Furukawa, Aiko; Koizumi, Keigo

Accurate evaluation of the seismic response characteristics of a road embankment is very important for rational seismic assessment. In many previous studies, however, the seismic response characteristics of an embankment were evaluated based on the results of shaking table tests, centrifuge model tests and dynamic FEM analysis. In this study, the transfer function and the shear wave velocity of a road embankment were evaluated based on in-situ records from moderate earthquake observation and microtremor measurement. The results suggest that the shear wave velocity of an embankment can be estimated by combining earthquake observation or microtremor measurement with dynamic linear FEM analysis.
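
One common way to obtain such an empirical transfer function (a standard technique assumed here, not detailed in the abstract) is the spectral ratio of crest motion to base motion; a sketch on synthetic records with an invented 5 Hz resonance:

```python
import numpy as np

fs = 100.0                                  # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)                # 20 s of synthetic records
rng = np.random.default_rng(2)
base = rng.normal(size=t.size)              # input motion at the embankment base

# Crest motion: base motion plus an amplified 5 Hz resonance (hypothetical)
resonance = 3.0 * np.sin(2 * np.pi * 5.0 * t)
crest = base + resonance

# Empirical transfer function as the ratio of amplitude spectra
freqs = np.fft.rfftfreq(t.size, 1 / fs)
H = np.abs(np.fft.rfft(crest)) / np.abs(np.fft.rfft(base))
peak = freqs[np.argmax(H)]
print(peak)  # frequency of maximum amplification
```

In practice the spectra are smoothed and averaged over several events or microtremor windows before the ratio is taken.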

  6. Universal resources for approximate and stochastic measurement-based quantum computation

    International Nuclear Information System (INIS)

    Mora, Caterina E.; Piani, Marco; Miyake, Akimasa; Van den Nest, Maarten; Duer, Wolfgang; Briegel, Hans J.

    2010-01-01

    We investigate which quantum states can serve as universal resources for approximate and stochastic measurement-based quantum computation in the sense that any quantum state can be generated from a given resource by means of single-qubit (local) operations assisted by classical communication. More precisely, we consider the approximate and stochastic generation of states, resulting, for example, from a restriction to finite measurement settings or from possible imperfections in the resources or local operations. We show that entanglement-based criteria for universality obtained in M. Van den Nest et al. [New J. Phys. 9, 204 (2007)] for the exact, deterministic case can be lifted to the much more general approximate, stochastic case. This allows us to move from the idealized situation (exact, deterministic universality) considered in previous works to the practically relevant context of nonperfect state preparation. We find that any entanglement measure fulfilling some basic requirements needs to reach its maximum value on some element of an approximate, stochastic universal family of resource states, as the resource size grows. This allows us to rule out various families of states as being approximate, stochastic universal. We prove that approximate, stochastic universality is in general a weaker requirement than deterministic, exact universality and provide resources that are efficient approximate universal, but not exact deterministic universal. We also study the robustness of universal resources for measurement-based quantum computation under realistic assumptions about the (imperfect) generation and manipulation of entangled states, giving an explicit expression for the impact that errors made in the preparation of the resource have on the possibility to use it for universal approximate and stochastic state preparation. Finally, we discuss the relation between our entanglement-based criteria and recent results regarding the uselessness of states with a high

  7. Measurement properties of performance-based measures to assess physical function in hip and knee osteoarthritis: a systematic review.

    Science.gov (United States)

    Dobson, F; Hinman, R S; Hall, M; Terwee, C B; Roos, E M; Bennell, K L

    2012-12-01

To systematically review the measurement properties of performance-based measures to assess physical function in people with hip and/or knee osteoarthritis (OA). Electronic searches were performed in MEDLINE, CINAHL, Embase, and PsycINFO up to the end of June 2012. Two reviewers independently rated measurement properties using the consensus-based standards for the selection of health status measurement instruments (COSMIN). A "best evidence synthesis" was made using COSMIN outcomes and the quality of findings. Twenty-four out of 1792 publications were eligible for inclusion. Twenty-one performance-based measures were evaluated, including 15 single-activity measures and six multi-activity measures. Measurement properties evaluated included internal consistency (three measures), reliability (16 measures), measurement error (14 measures), validity (nine measures), responsiveness (12 measures) and interpretability (three measures). A positive rating was given to only 16% of possible measurement ratings. Evidence for the majority of measurement properties of tests reported in the review has yet to be determined. On balance of the limited evidence, the 40 m self-paced test was the best rated walk test, the 30 s-chair stand test and timed up and go test were the best rated sit to stand tests, and the Stratford battery, Physical Activity Restrictions and Functional Assessment System were the best rated multi-activity measures. Further good quality research investigating measurement properties of performance measures, including responsiveness and interpretability in people with hip and/or knee OA, is needed. Consensus on which combination of measures will best assess physical function in people with hip and/or knee OA is urgently required. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  8. Developing Barbed Microtip-Based Electrode Arrays for Biopotential Measurement

    Directory of Open Access Journals (Sweden)

    Li-Sheng Hsu

    2014-07-01

This study involved fabricating barbed microtip-based electrode arrays using silicon wet etching. KOH anisotropic wet etching was employed to form a standard pyramidal microtip array, and HF/HNO3 isotropic etching was used to fabricate barbs on these microtips. To improve the electrical conductance between the tip array on the front side of the wafer and the electrical contact on the back side, a through-silicon via was created during the wet etching process. The experimental results show that the forces required to detach the barbed microtip arrays from human skin, a polydimethylsiloxane (PDMS) polymer, and a polyvinylchloride (PVC) film were larger than those required to detach microtip arrays that lacked barbs. The impedances of the skin-electrode interface were measured and the performance of the proposed dry electrode was characterized. Electrode prototypes employing the proposed tip arrays were implemented, and electroencephalogram (EEG) and electrocardiography (ECG) recordings using these prototypes were also demonstrated.

  9. Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements

    Science.gov (United States)

    Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.

    2011-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
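
The band-pass filtering step can be sketched as an FFT mask that keeps only periods typical of tsunami-driven internal gravity waves; the 10-30 min band, sampling interval and synthetic TEC series below are illustrative:

```python
import numpy as np

dt = 30.0                                   # sample interval, seconds
t = np.arange(0, 6 * 3600, dt)              # 6 hours of synthetic TEC data

# Synthetic TEC: slow background variation plus a 15-min-period wave
background = 20 + 0.5 * np.sin(2 * np.pi * t / (8 * 3600))
wave = 0.2 * np.sin(2 * np.pi * t / (15 * 60))
tec = background + wave

# Frequency-domain band-pass: keep periods between 10 and 30 minutes
freqs = np.fft.rfftfreq(t.size, dt)
spec = np.fft.rfft(tec)
band = (freqs >= 1 / (30 * 60)) & (freqs <= 1 / (10 * 60))
filtered = np.fft.irfft(spec * band, n=t.size)

# The filtered series retains the 15-min wave and drops the background
print(round(filtered.std(), 3))
```

Real processing would use a proper filter with controlled edge behavior (e.g. a windowed FIR band-pass) rather than a hard spectral mask, but the principle is the same.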

  10. Injection quality measurements with diamond based particle detectors

    CERN Document Server

    Stein, Oliver; CERN. Geneva. ATS Department

    2016-01-01

During the re-commissioning phase of the LHC after Long Shutdown 1, very high beam losses were observed at the TDI during beam injection, reaching up to 90% of the dump threshold. To decrease the stress induced on accelerator components by these beam losses, the loss levels need to be reduced. Measurements with diamond-based particle detectors (dBLMs), which have nanosecond time resolution, revealed that the majority of these losses come from recaptured SPS beam surrounding the nominal bunch train. In this MD the injection loss patterns and loss intensities were investigated in greater detail. Calibration shots on the TDI (the internal beam absorber for injection) gave a conversion factor from the intensity of impacting particles to the signal in the dBLMs (0.1 Vs per 10^9 protons). Using the SPS tune kicker to clean the recaptured beam in the SPS and changing the LHC injection kicker settings resulted in a reduction of the injection losses. For 144-bunch injections the loss levels were decreased...

  11. Online monitoring of Mezcal fermentation based on redox potential measurements.

    Science.gov (United States)

    Escalante-Minakata, P; Ibarra-Junquera, V; Rosu, H C; De León-Rodríguez, A; González-García, R

    2009-01-01

We describe an algorithm for the continuous monitoring of the biomass and ethanol concentrations, as well as the growth rate, in the Mezcal fermentation process. The algorithm performs its task having available only online measurements of the redox potential. The procedure combines an artificial neural network (ANN) that relates the redox potential to the ethanol and biomass concentrations with a nonlinear observer-based algorithm that uses the ANN biomass estimates to infer the growth rate of the fermentation process. The results show that the redox potential is a valuable indicator of the metabolic activity of the microorganisms during Mezcal fermentation. In addition, the estimated growth rate can be considered direct evidence of mixed-culture growth in the process. A mixture of microorganisms might be intuitively expected in this kind of process; however, the total biomass data do not provide definite evidence by themselves. In this paper, the detailed design of the software sensor as well as its experimental application at the laboratory level is presented.
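
Downstream of the ANN, the growth-rate inference amounts to estimating the specific growth rate μ = (dX/dt)/X from the biomass estimate X(t); a minimal sketch on synthetic exponential growth (the paper's approach uses a nonlinear observer rather than the direct differentiation shown here, which amplifies noise on real data):

```python
import numpy as np

t = np.linspace(0, 10, 101)          # time, h
X = 0.5 * np.exp(0.3 * t)            # synthetic biomass from the ANN stage, g/L

# Specific growth rate mu = (dX/dt) / X via central differences
mu = np.gradient(X, t) / X
print(round(float(mu[50]), 3))       # should recover the 0.3 1/h used above
```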

  12. Measurement-Based Performance Evaluation of Advanced MIMO Transceiver Designs

    Directory of Open Access Journals (Sweden)

    Schneider Christian

    2005-01-01

This paper describes the methodology and results of performance investigations of a multiple-input multiple-output (MIMO) transceiver scheme for frequency-selective radio channels. The method relies on offline simulations and employs real-time MIMO channel-sounder measurement data to ensure realistic channel modeling. It can thus be classified between performance evaluation using predefined channel models and evaluation of prototype hardware in field experiments. New aspects of the simulation setup are discussed that are frequently ignored in simpler model-based evaluations. Example simulations are provided for an iterative ("turbo") MIMO equalizer concept. The dependency of the achievable bit-error-rate performance on the propagation characteristics and on variations in some system design parameters is shown; the antenna constellation is of particular concern for MIMO systems. Although turbo MIMO equalization appears feasible in real field scenarios for many of the considered constellations, there are also cases with poor performance, indicating that in practical applications link adaptation of the transmitter and receiver processing to the environment is necessary.

  13. EMG Processing Based Measures of Fatigue Assessment during Manual Lifting

    Directory of Open Access Journals (Sweden)

    E. F. Shair

    2017-01-01

Manual lifting is one of the common practices used in industry to transport or move objects to a desired place. Nowadays, even though mechanized equipment is widely available, manual lifting is still considered an essential way to perform material-handling tasks. Improper lifting strategies may contribute to musculoskeletal disorders (MSDs), to which overexertion is the largest contributor. To address this problem, the electromyography (EMG) signal is used to monitor workers' muscle condition and to find the maximum lifting load, lifting height and number of repetitions that workers can handle before experiencing fatigue, so as to avoid overexertion. Past researchers have introduced several EMG processing techniques and different EMG features that represent fatigue indices in the time, frequency, and time-frequency domains. The impact of EMG-processing-based measures on fatigue assessment during manual lifting is reviewed in this paper. It is believed that this paper will greatly benefit researchers who need a bird's-eye view of the biosignal processing techniques currently available, helping them determine the best possible techniques for lifting applications.
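
A classical frequency-domain fatigue index of the kind such reviews cover is the EMG median frequency, which shifts downward as muscle fatigue develops; a sketch on synthetic single-tone signals (real EMG is broadband, so the shift is in the whole spectrum, not a single line):

```python
import numpy as np

def median_frequency(emg, fs):
    """Frequency splitting the EMG power spectrum into two equal halves."""
    freqs = np.fft.rfftfreq(emg.size, 1 / fs)
    power = np.abs(np.fft.rfft(emg)) ** 2
    cumulative = np.cumsum(power)
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2)]

fs = 1000                                    # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)
fresh = np.sin(2 * np.pi * 120 * t)          # synthetic "fresh" muscle signal
fatigued = np.sin(2 * np.pi * 60 * t)        # spectrum shifted toward low freq
print(median_frequency(fresh, fs), median_frequency(fatigued, fs))
```

Tracking this index over successive lifting repetitions gives a simple objective indicator of developing fatigue.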

  15. Ground-based observations coordinated with Viking satellite measurements

    International Nuclear Information System (INIS)

    Opgenoorth, H.J.; Kirkwood, S.

    1989-01-01

The instrumentation and orbit of the Viking satellite made this first Swedish satellite mission ideally suited for coordinated observations with the dense network of ground-based stations in northern Scandinavia. Several arrays of complementary instruments, such as magnetometers, all-sky cameras, riometers and Doppler radars, routinely monitored the ionosphere beneath the magnetospheric region traversed by Viking. For a large number of orbits, the Viking passages close to Scandinavia were covered by specially designed observing programmes at the European incoherent-scatter facility (EISCAT). First results of coordinated observations on the ground and aboard Viking have shed new light on the most spectacular feature of substorm expansion, the westward-travelling surge. The end of a substorm and the associated decay of a westward-travelling surge have been analysed. EISCAT measurements of high spatial and temporal resolution indicate that the conductivities and electric fields associated with westward-travelling surges are not represented correctly by the existing models. (author)

  16. FIB-based measurement of local residual stresses on microsystems

    Science.gov (United States)

    Vogel, Dietmar; Sabate, Neus; Gollhardt, Astrid; Keller, Juergen; Auersperg, Juergen; Michel, Bernd

    2006-03-01

The paper comprises research results obtained for stress determination on micro- and nanotechnology components. It addresses the concern of controlling stresses introduced to sensors, MEMS and electronic devices during different micromachining processes. The method is based on deformation measurement capabilities available inside focused ion beam (FIB) equipment. When material is removed locally by ion-beam milling, existing residual stresses lead to deformation fields around the milled feature. Digital image correlation techniques are used to extract deformation values from micrographs captured before and after milling. In the paper, two main milling features have been analyzed: through-hole and through-slit milling. Analytical solutions for the stress-release fields of in-plane stresses have been derived and compared with the respective experimental findings. Their good agreement allows a method to be established for determining residual stress values, which is demonstrated for thin membranes manufactured by silicon microtechnology. Some emphasis is placed on the elimination of the main error sources for stress determination, such as rigid-body displacements and rotations of the object due to drifts of the experimental conditions under FIB imaging. To illustrate potential application areas of the method, the suppression of residual stress by ion implantation is also evaluated and reported here.
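
The core of the digital image correlation step can be sketched as a search for the displacement that maximizes the normalized cross-correlation between a subset of the "before" micrograph and the "after" micrograph; the data below are synthetic and the search is a brute-force integer-pixel scan:

```python
import numpy as np

rng = np.random.default_rng(1)
before = rng.random((40, 40))        # synthetic "before milling" micrograph
shift = (3, 5)                       # true rigid displacement in pixels
after = np.roll(before, shift, axis=(0, 1))   # synthetic "after" micrograph

patch = before[10:20, 10:20]         # subset tracked between the two images
best, best_score = None, -np.inf
for dy in range(-8, 9):
    for dx in range(-8, 9):
        window = after[10 + dy:20 + dy, 10 + dx:20 + dx]
        # Normalized cross-correlation coefficient of patch vs. window
        score = np.sum((patch - patch.mean()) * (window - window.mean()))
        score /= patch.std() * window.std() * patch.size
        if score > best_score:
            best, best_score = (dy, dx), score

print(best)  # recovered displacement of the subset
```

Production DIC codes refine this with sub-pixel interpolation and subset-shape functions; evaluating many subsets around the milled feature yields the deformation field from which the residual stress is back-calculated.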

  17. MLVA Typing of Streptococcus pneumoniae Isolates with Emphasis on Serotypes 14, 9N and 9V: Comparison of Previously Described Panels and Proposal of a Novel 7 VNTR Loci-Based Simplified Scheme.

    Science.gov (United States)

    Costa, Natália S; Pinto, Tatiana C A; Merquior, Vânia L C; Castro, Luciana F S; da Rocha, Filomena S P; Morais, Jaqueline M; Peralta, José M; Teixeira, Lúcia M

    2016-01-01

    Streptococcus pneumoniae remains an important cause of community-acquired bacterial infections, and the nasopharynx of asymptomatic carriers is the major reservoir of this microorganism. Pneumococcal strains of serotype 14 and serogroup 9 are among the most frequently isolated from both asymptomatic carriers and patients with invasive disease living in Brazil. Internationally disseminated clones belonging to such serotypes have been associated with the emergence and spread of antimicrobial resistance in our setting, highlighting the need for epidemiological tracking of these isolates. In this scenario, Multiple Loci VNTR Analysis (MLVA) has emerged as an alternative tool for the molecular characterization of pneumococci, in addition to more traditional techniques such as Multi-Locus Sequence Typing (MLST) and Pulsed-Field Gel Electrophoresis (PFGE). In the present study, 18 VNTR loci, as well as other previously described reduced MLVA panels (7 VNTR loci), were evaluated as tools to characterize pneumococcal strains of serotypes 14, 9N and 9V belonging to international and regional clones isolated in Brazil. The 18 VNTR loci panel was highly congruent with MLST and PFGE, being also useful for indicating the genetic relationship with international clones and for discriminating among strains with indistinguishable STs and PFGE profiles. Analysis of the results also allowed the deduction of a novel, shorter 7 VNTR loci panel that keeps a high discriminatory power for isolates of the serotypes investigated and a high level of congruence with MLST and PFGE. The newly proposed simplified panel was then evaluated for typing pneumococcal strains of other commonly isolated serotypes. The results indicate that MLVA is a faster, easier to perform and reliable approach for the molecular characterization of S. pneumoniae isolates, with potential for cost-effective application, especially in resource-limited countries.
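
    The discriminatory power of typing panels like these is conventionally quantified with the Hunter-Gaston discriminatory index. A small illustrative sketch (the MLVA profiles below are hypothetical, not the study's data):

```python
from collections import Counter

def hunter_gaston_di(types):
    """Hunter-Gaston discriminatory index of a typing scheme: the
    probability that two randomly drawn isolates receive different
    types under the scheme."""
    n = len(types)
    counts = Counter(types).values()
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical 2-locus MLVA profiles (tuples of VNTR copy numbers)
profiles = [(3, 5), (3, 5), (3, 6), (4, 5), (4, 5),
            (4, 6), (2, 5), (2, 5), (2, 5), (3, 7)]
print(round(hunter_gaston_di(profiles), 3))  # 0.889
```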

  18. Transportation performance measures for outcome based system management and monitoring.

    Science.gov (United States)

    2014-09-01

    The Oregon Department of Transportation (ODOT) is mature in its development and use of performance measures; however, there was not a standard approach for selecting measures, nor for evaluating whether existing ones were used to inform decision-making. Thi...

  19. Dynamic portfolio management based on complex quantile risk measures

    Directory of Open Access Journals (Sweden)

    Ekaterina V. Tulupova

    2011-05-01

    Full Text Available The article focuses on the effectiveness evaluation of combined measures of financial risk, which are convex combinations of the measures VaR, CVaR and their analogues for the right tail of the distribution function of portfolio returns.
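
    A combined measure of this kind can be sketched on historical return data; a minimal illustration (the sample, confidence level and weight are arbitrary assumptions, not the article's choices):

```python
import numpy as np

def var_cvar(returns, alpha=0.95):
    """Historical VaR and CVaR (expected shortfall) at confidence
    level alpha, expressed as positive loss figures."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()   # mean loss beyond VaR
    return float(var), float(cvar)

def combined_measure(returns, weight=0.5, alpha=0.95):
    """Convex combination weight*VaR + (1-weight)*CVaR -- the kind
    of composite risk measure the article studies."""
    var, cvar = var_cvar(returns, alpha)
    return weight * var + (1 - weight) * cvar

rng = np.random.default_rng(1)
r = rng.normal(0.0005, 0.01, 10_000)      # synthetic daily returns
v, c = var_cvar(r)
print(f"VaR={v:.4f}  CVaR={c:.4f}  combined={combined_measure(r):.4f}")
```

    By construction CVaR is at least VaR, so any convex combination lies between the two.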

  20. A Raspberry Pi Based Portable Endoscopic 3D Measurement System

    Directory of Open Access Journals (Sweden)

    Jochen Schlobohm

    2016-07-01

    Full Text Available Geometry measurements are very important for monitoring a machine part's health and performance. Optical measurement systems have several advantages for the acquisition of a part's geometry: measurement speed, precision, point density and contactless operation. Measuring parts inside assembled machines is also desirable to keep maintenance costs low. The Raspberry Pi is a small and cost-efficient computer that creates new opportunities for compact measurement systems. We have developed a fringe projection system which is capable of measuring in very limited space. A Raspberry Pi 2 is used to generate the projection patterns, acquire the images and reconstruct the geometry. Together with a small LED projector, the measurement system is small and easy to handle. It consists of off-the-shelf products which are nonetheless capable of measuring with an uncertainty of less than 100 μm.
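
    Fringe projection systems typically recover a wrapped phase map from phase-shifted patterns. A minimal sketch of the standard four-step algorithm on synthetic fringes (the abstract does not state which algorithm the authors use, so this is illustrative):

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Wrapped phase from four fringe images with pi/2 phase shifts,
    I_k = A + B*cos(phi + k*pi/2) -- the core of fringe projection."""
    return np.arctan2(i3 - i1, i0 - i2)

# Synthetic fringes along one line of the surface
x = np.linspace(0, 4 * np.pi, 200)
A, B = 0.5, 0.4
imgs = [A + B * np.cos(x + k * np.pi / 2) for k in range(4)]
phi = four_step_phase(*imgs)

# phi equals x wrapped into (-pi, pi]; check via the phase difference
err = np.angle(np.exp(1j * (phi - x)))
print(float(np.abs(err).max()))  # numerically ~0
```

    A full pipeline would follow this with phase unwrapping and a calibrated phase-to-height conversion.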

  1. Estimation of incidences of infectious diseases based on antibody measurements

    DEFF Research Database (Denmark)

    Simonsen, J; Mølbak, K; Falkenhorst, G

    2009-01-01

    bacterial infections. This study presents a Bayesian approach for obtaining incidence estimates by use of measurements of serum antibodies against Salmonella from a cross-sectional study. By comparing these measurements with antibody measurements from a follow-up study of infected individuals...

  2. Radiation risk estimation based on measurement error models

    CERN Document Server

    Masiuk, Sergii; Shklyar, Sergiy; Chepurny, Mykola; Likhtarov, Illya

    2017-01-01

    This monograph discusses statistics and risk estimates applied to radiation damage under the presence of measurement errors. The first part covers nonlinear measurement error models, with a particular emphasis on efficiency of regression parameter estimators. In the second part, risk estimation in models with measurement errors is considered. Efficiency of the methods presented is verified using data from radio-epidemiological studies.

  3. The Impact of Corporate Governance on Financial Performance: (Measured using Accounting and Value-Added based Measures): Evidence from Malaysia

    OpenAIRE

    Abdul Aziz, Khairul Annuar

    2005-01-01

    This paper aims to test empirically which measure, an accounting-based financial performance measure such as Return on Equity, Price to Earnings Ratio, Earnings Per Share and Return on Capital Employed, or a value-added-based financial performance measure such as Economic Value Added and Market Value Added, is more closely related to Corporate Governance Compliance. This paper also aims to study the level of Corporate Governance Compliance of the Smaller Companies listed on the KLSE, the mea...

  4. 77 FR 70176 - Previous Participation Certification

    Science.gov (United States)

    2012-11-23

    ... participants' previous participation in government programs and ensure that the past record is acceptable prior... information is designed to be 100 percent automated and digital submission of all data and certifications is...

  5. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristics of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  6. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  7. AATR an ionospheric activity indicator specifically based on GNSS measurements

    Science.gov (United States)

    Juan, José Miguel; Sanz, Jaume; Rovira-Garcia, Adrià; González-Casado, Guillermo; Ibáñez, D.; Perez, R. Orus

    2018-03-01

    This work reviews an ionospheric activity indicator useful for identifying disturbed periods affecting the performance of Global Navigation Satellite Systems (GNSS). The index is based on the Along-Arc TEC Rate (AATR) and can be easily computed from dual-frequency GNSS measurements. The AATR indicator has been assessed over more than one solar cycle (2002-2017) involving about 140 receivers distributed world-wide. Results show that it is well correlated with the ionospheric activity and, unlike other global indicators linked to the geomagnetic activity (i.e. Dst or Ap), it is sensitive to the regional behaviour of the ionosphere and identifies specific effects on GNSS users. Moreover, from a devoted analysis of different Satellite-Based Augmentation System (SBAS) performances in different ionospheric conditions, it follows that the AATR indicator is a very suitable means of revealing whether SBAS service availability anomalies are linked to the ionosphere. On this account, the AATR indicator has been selected as the metric to characterise ionospheric operational conditions in the frame of the European Space Agency activities on the European Geostationary Navigation Overlay Service (EGNOS). The AATR index has also been adopted as a standard tool by the International Civil Aviation Organization (ICAO) for joint ionospheric studies in SBAS. In this work we explain how the AATR is computed, paying special attention to cycle-slip detection, which is one of the key issues in the AATR computation and is not fully addressed in other indicators such as the Rate Of change of the TEC Index (ROTI). After this explanation we present some of the main conclusions about the ionospheric activity that can be extracted from the AATR values during the above-mentioned long-term study. These conclusions are: (a) the different spatial correlation related to the MOdified DIP (MODIP), which allows high-, mid- and low-latitude regions to be clearly separated, (b) the large spatial correlation in mid
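
    As a rough illustration of how an along-arc rate index of this kind can be formed (the thin-shell mapping, shell height and gap threshold below are generic assumptions, not the exact published recipe, and real processing would also detect cycle slips):

```python
import numpy as np

def aatr(stec, t, elev, dt_max=30.0):
    """Sketch of an Along-Arc TEC Rate index for one satellite arc:
    RMS of the vertical-equivalent TEC rate.  stec in TECU, t in
    seconds, elev in radians; M() is a simple thin-shell obliquity
    factor (an assumption for illustration)."""
    Re, h = 6371.0, 450.0                       # km, thin-shell height
    m = 1.0 / np.sqrt(1.0 - (Re / (Re + h) * np.cos(elev)) ** 2)
    rate = np.diff(stec) / np.diff(t)           # TECU/s along the arc
    good = np.diff(t) <= dt_max                 # drop data gaps
    mid_m = 0.5 * (m[1:] + m[:-1])              # mapping at midpoints
    return float(np.sqrt(np.mean((rate[good] / mid_m[good]) ** 2)))

# Synthetic arc: steady 0.01 TECU/s drift seen at zenith (mapping ~ 1)
t = np.arange(0.0, 300.0, 30.0)
stec = 0.01 * t
elev = np.full_like(t, np.pi / 2)
print(round(aatr(stec, t, elev), 4))  # 0.01
```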

  8. Measurement error in epidemiologic studies of air pollution based on land-use regression models.

    Science.gov (United States)

    Basagaña, Xavier; Aguilera, Inmaculada; Rivera, Marcela; Agis, David; Foraster, Maria; Marrugat, Jaume; Elosua, Roberto; Künzli, Nino

    2013-10-15

    Land-use regression (LUR) models are increasingly used to estimate air pollution exposure in epidemiologic studies. These models use air pollution measurements taken at a small set of locations and modeling based on geographical covariates for which data are available at all study participant locations. The process of LUR model development commonly includes a variable selection procedure. When LUR model predictions are used as explanatory variables in a model for a health outcome, measurement error can lead to bias of the regression coefficients and to inflation of their variance. In previous studies dealing with spatial predictions of air pollution, bias was shown to be small while most of the effect of measurement error was on the variance. In this study, we show that in realistic cases where LUR models are applied to health data, bias in health-effect estimates can be substantial. This bias depends on the number of air pollution measurement sites, the number of available predictors for model selection, and the amount of explainable variability in the true exposure. These results should be taken into account when interpreting health effects from studies that used LUR models.
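
    The textbook mechanism behind such distortion is easy to reproduce in simulation: regressing a health outcome on an error-prone exposure proxy attenuates the slope by the reliability ratio var(x)/(var(x)+var(e)). A minimal sketch on synthetic data (not the study's data or its variable-selection setting, which produces additional bias):

```python
import numpy as np

rng = np.random.default_rng(42)
n, beta = 100_000, 1.0

x = rng.normal(size=n)                 # true exposure, variance 1
e = rng.normal(scale=0.5, size=n)      # classical measurement error
x_obs = x + e                          # error-prone exposure proxy
y = beta * x + rng.normal(scale=0.5, size=n)

# OLS slope of the health outcome on the mismeasured exposure
slope = np.cov(x_obs, y)[0, 1] / np.var(x_obs)
reliability = 1.0 / (1.0 + 0.5 ** 2)   # var(x) / var(x_obs) = 0.8
print(round(slope, 2), round(reliability, 2))  # slope ~ reliability
```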

  9. Synchrotron-based measurements of the electronic structure of the organic semiconductor copper phthalocyanine

    International Nuclear Information System (INIS)

    Downes, J.E.

    2004-01-01

    Full text: Copper phthalocyanine (CuPc) is a prototypical molecular organic semiconductor that is currently used in the construction of many organic electronic devices such as organic light-emitting diodes (OLEDs). Although the material is already in use, and despite many experimental and theoretical studies, its detailed electronic structure is still not completely understood. This is likely due to two key factors. Firstly, the interaction of the Cu 3d and phthalocyanine ligand 2p electrons leads to the formation of a complex arrangement of localized and delocalized states near the Fermi level. Secondly, thin films of the material are subject to damage by the photon beam used to make measurements of their electronic structure. Using the synchrotron-based techniques of soft x-ray emission spectroscopy (XES) and x-ray photoemission spectroscopy (XPS), we have measured the detailed electronic structure of in-situ grown thin-film samples of CuPc. Beam damage was minimized by continuous translation of the sample during data acquisition. The results obtained differ significantly from previous XES and ultraviolet photoemission measurements, but are in excellent agreement with recent density functional calculations. The reasons for these discrepancies will be explained, and their implications for future measurements on similar materials will be explored

  10. The high-resolution extraterrestrial solar spectrum (QASUMEFTS) determined from ground-based solar irradiance measurements

    Directory of Open Access Journals (Sweden)

    J. Gröbner

    2017-09-01

    Full Text Available A high-resolution extraterrestrial solar spectrum has been determined from ground-based measurements of direct solar spectral irradiance (SSI over the wavelength range from 300 to 500 nm using the Langley-plot technique. The measurements were obtained at the Izaña Atmospheric Research Centre from the Agencia Estatal de Meteorología, Tenerife, Spain, during the period 12 to 24 September 2016. This solar spectrum (QASUMEFTS) was combined from medium-resolution (bandpass of 0.86 nm measurements of the QASUME (Quality Assurance of Spectral Ultraviolet Measurements in Europe spectroradiometer in the wavelength range from 300 to 500 nm and high-resolution measurements (0.025 nm from a Fourier transform spectroradiometer (FTS over the wavelength range from 305 to 380 nm. The Kitt Peak solar flux atlas was used to extend this high-resolution solar spectrum to 500 nm. The expanded uncertainties of this solar spectrum are 2 % between 310 and 500 nm and 4 % at 300 nm. The comparison of this solar spectrum with solar spectra measured in space (top of the atmosphere gave very good agreement in some cases, while in others discrepancies of up to 5 % were observed. The QASUMEFTS solar spectrum represents a benchmark dataset with uncertainties lower than anything previously published. The metrological traceability of the measurements to the International System of Units (SI is assured by an unbroken chain of calibrations leading to the primary spectral irradiance standard of the Physikalisch-Technische Bundesanstalt in Germany.
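
    The Langley-plot technique named above fits the log of the measured irradiance against airmass and extrapolates to zero airmass to obtain the extraterrestrial value. A minimal synthetic sketch (numbers are illustrative):

```python
import numpy as np

def langley_extrapolation(airmass, irradiance):
    """Langley-plot estimate of the extraterrestrial irradiance I0:
    fit ln(I) = ln(I0) - tau*m and extrapolate to airmass m = 0.
    Also returns the fitted optical depth tau."""
    slope, intercept = np.polyfit(airmass, np.log(irradiance), 1)
    return float(np.exp(intercept)), float(-slope)

# Synthetic clear-sky morning: I0 = 1.2 (arbitrary units), tau = 0.3
m = np.linspace(1.5, 5.0, 40)
i = 1.2 * np.exp(-0.3 * m)
i0_est, tau_est = langley_extrapolation(m, i)
print(round(i0_est, 3), round(tau_est, 3))  # 1.2 0.3
```

    In practice the fit is restricted to stable, cloud-free periods, which is why a high-altitude site such as Izaña is used.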

  11. Working with previously anonymous gamete donors and donor-conceived adults: recent practice experiences of running the DNA-based voluntary information exchange and contact register, UK DonorLink.

    Science.gov (United States)

    Crawshaw, Marilyn; Gunter, Christine; Tidy, Christine; Atherton, Freda

    2013-03-01

    This article describes recent practice experiences with donor-conceived adults, donors, and non-donor-conceived adult children of donors using the voluntary DNA-based register, UK DonorLink. It highlights additional complexities faced when using DNA rather than paper records for searching, in particular the risk of false positives, low chances of success and the potential inclusion of biological parents' DNA. Professionals' experiences in supporting those being "linked" suggest challenges as well as rewards. Registration carries the potential to be therapeutic for donor-conceived adults and donors and to enhance their political awareness regardless of links being made. Registrants value both peer and professional support, provided the latter can respond flexibly and is delivered by staff experienced in intermediary work. Given that the majority of those affected by donor conception internationally come from anonymous donation systems, these findings are highly pertinent and argue the need for political and moral debate about such service provision.

  12. IntelliGO: a new vector-based semantic similarity measure including annotation origin

    Directory of Open Access Journals (Sweden)

    Devignes Marie-Dominique

    2010-12-01

    previously published measures. Conclusions: The IntelliGO similarity measure provides a customizable and comprehensive method for quantifying gene similarity based on GO annotations. It also displays a robust set-discriminating power, which suggests it will be useful for functional clustering. Availability: An on-line version of the IntelliGO similarity measure is available at: http://bioinfo.loria.fr/Members/benabdsi/intelligo_project/
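
    At its core, a vector-based similarity of this family reduces to a cosine between weighted annotation vectors; a generic sketch (the vectors and weights here are illustrative, not IntelliGO's published scheme, which additionally weights terms by annotation origin and depth in the GO graph):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity of two gene annotation vectors defined over
    a shared basis of GO terms."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two genes as weighted GO-term vectors (hypothetical weights)
gene1 = [1.0, 0.8, 0.0, 0.5]
gene2 = [0.9, 0.0, 0.7, 0.5]
print(round(cosine_similarity(gene1, gene2), 3))  # 0.672
```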

  13. Balanced Scorecard Based Performance Measurement & Strategic Management System

    OpenAIRE

    Permatasari, Paulina

    2006-01-01

    Developing strategy and measuring performance are integral parts of a management control system. Making strategic decisions about planning and controlling requires information about how different subunits in the organization work. To be effective, performance measurement, both financial and non-financial, must motivate managers and employees at different levels toward goal accomplishment and organizational strategy. An organization's measurement system strongly affects the behavior of people b...

  14. [Prevalence of previously diagnosed diabetes mellitus in Mexico].

    Science.gov (United States)

    Rojas-Martínez, Rosalba; Basto-Abreu, Ana; Aguilar-Salinas, Carlos A; Zárate-Rojas, Emiliano; Villalpando, Salvador; Barrientos-Gutiérrez, Tonatiuh

    2018-01-01

    To compare the prevalence of previously diagnosed diabetes in 2016 with previous national surveys and to describe treatment and its complications. Mexico's national surveys Ensa 2000 and Ensanut 2006, 2012 and 2016 were used. For 2016, logistic regression models and measures of central tendency and dispersion were obtained. The prevalence of previously diagnosed diabetes in 2016 was 9.4%. The increase of 2.2% relative to 2012 was not significant and was only observed in patients older than 60 years. While preventive measures have increased, access to medical treatment and lifestyle have not changed. The treatment has been modified, with an increase in insulin use and a decrease in hypoglycaemic agents. Population aging, lack of screening actions and the increase in diabetes complications will lead to an increase in the burden of disease. Policy measures targeting primary and secondary prevention of diabetes are crucial.

  15. The Importance of Replication in Measurement Research: Using Curriculum-Based Measures with Postsecondary Students with Developmental Disabilities

    Science.gov (United States)

    Hosp, John L.; Ford, Jeremy W.; Huddle, Sally M.; Hensley, Kiersten K.

    2018-01-01

    Replication is a foundation of the development of a knowledge base in an evidence-based field such as education. This study includes two direct replications of Hosp, Hensley, Huddle, and Ford which found evidence of criterion-related validity of curriculum-based measurement (CBM) for reading and mathematics with postsecondary students with…

  16. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  17. A Software Behavior Trustworthiness Measurement Method based on Data Mining

    Directory of Open Access Journals (Sweden)

    Yuyu Yuan

    2011-10-01

    carried out in three stages: firstly, defining the concept of trust, software trustworthiness, static and dynamic feature datasets with fundamental calculating criteria; secondly, providing a group of formulas to illustrate congruence measurement approach for comparing the two types of feature datasets; lastly, giving an architecture supported by software trustworthiness measurement algorithm to evaluate conceptualized hierarchical software trustworthiness.

  18. BEAM-BASED MEASUREMENTS OF PERSISTENT CURRENT DECAY IN RHIC

    International Nuclear Information System (INIS)

    FISCHER, W.; JAIN, A.; TEPIKIAN, S.

    2001-01-01

    The two RHIC rings are equipped with superconducting dipole magnets. At injection, induced persistent currents in these magnets lead to a sextupole component. As the persistent currents decay with time, the horizontal and vertical chromaticities change. From magnet measurements of persistent current decays, chromaticity changes in the machine are estimated and compared with chromaticity measurements

  19. Femtosecond frequency comb based distance measurement in air

    NARCIS (Netherlands)

    Balling, P.; Kren, P.; Masika, P.; van den Berg, S.A.

    2009-01-01

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The

  20. Ergonomic measures in construction work: enhancing evidence-based implementation

    NARCIS (Netherlands)

    Visser, S.

    2015-01-01

    Despite the development and availability of ergonomic measures in the construction industry, the number of construction workers reporting high physical work demands remains high. A reduction of the high physical work demands can be achieved by using ergonomic measures. However, these ergonomic

  1. Attitude angular measurement system based on MEMS accelerometer

    Science.gov (United States)

    Luo, Lei

    2014-09-01

    For the purpose of monitoring the attitude of aircraft, an angular measurement system using a MEMS heat-convection accelerometer is presented in this study. A double-layer conditioning circuit centered around a single-chip processor is designed and built. Professional display software using the RS232 standard is used to communicate between the sensor and the computer. Calibration experiments were carried out to characterize the measuring system over the range of -90° to +90°. The curves keep good linearity with the actual angle. The maximum deviation occurs at 90°, where the value is 2.8°. The maximum error is 1.6% and the repeatability is measured to be 2.1%. Experiments proved that the developed measurement system is capable of measuring attitude angles.
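
    For a static single-axis tilt measurement of this type, the angle follows from the gravity component a = g*sin(theta) seen by the sensing axis; a minimal sketch (sensor scaling and calibration are omitted, and the function name is illustrative):

```python
import math

def tilt_angle_deg(a_axis, g=9.81):
    """Tilt angle from a single accelerometer axis reading (m/s^2),
    valid for static conditions where a = g*sin(theta).  The ratio is
    clamped to [-1, 1] to guard against noise at the range limits."""
    return math.degrees(math.asin(max(-1.0, min(1.0, a_axis / g))))

# At 30 degrees of tilt the axis sees g*sin(30 deg) = 4.905 m/s^2
print(round(tilt_angle_deg(4.905), 1))  # 30.0
```

    The loss of sensitivity of sin(theta) near ±90° is consistent with the maximum deviation being observed at the 90° end of the range.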

  2. Posterior variability of inclusion shape based on tomographic measurement data

    International Nuclear Information System (INIS)

    Watzenig, Daniel; Fox, Colin

    2008-01-01

    We treat the problem of recovering the unknown shape of a single inclusion with unknown constant permittivity in an otherwise uniform background material, from uncertain measurements of trans-capacitance at electrodes outside the material. The ubiquitous presence of measurement noise implies that the practical measurement process is probabilistic, and the inverse problem is naturally stated as statistical inference. Formulating the inverse problem in a Bayesian inferential framework requires accurately modelling the forward map, measurement noise, and specifying a prior distribution for the cross-sectional material distribution. Numerical implementation of the forward map is via the boundary element method (BEM) taking advantage of a piecewise constant representation. Summary statistics are calculated using MCMC sampling to characterize posterior variability for synthetic and measured data sets.
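
    Posterior summary statistics of this kind are typically produced with a random-walk Metropolis sampler; a minimal sketch on a toy one-dimensional posterior (the real forward map would be the BEM solver, not the Gaussian log-density used here):

```python
import math
import random

def metropolis(log_post, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis sampler: propose a Gaussian step and
    accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    out = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0, step)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:   # accept/reject
            x, lp = cand, lp_cand
        out.append(x)
    return out

# Toy posterior: Gaussian with mean 2 and standard deviation 0.5
samples = metropolis(lambda x: -0.5 * ((x - 2) / 0.5) ** 2, 0.0, 20_000)
burned = samples[2000:]                 # discard burn-in
mean = sum(burned) / len(burned)
print(round(mean, 1))  # posterior mean ~ 2.0
```

    In the inclusion-shape problem the state would be a parameterised boundary plus a permittivity value, with the same accept/reject logic.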

  3. [A novel biologic electricity signal measurement based on neuron chip].

    Science.gov (United States)

    Lei, Yinsheng; Wang, Mingshi; Sun, Tongjing; Zhu, Qiang; Qin, Ran

    2006-06-01

    The neuron chip is a multiprocessor with three pipelined CPUs; its communication protocol and control processor are integrated so as to carry out the functions of communication, control, scheduling, I/O, etc. A novel biologic electronic signal measurement network system is composed of intelligent measurement nodes with a neuron chip at the core. In this study, electronic signals such as ECG, EEG, EMG and BOS can be synthetically measured by those intelligent nodes, and some valuable diagnostic messages are found. Wavelet transform is employed in this system to analyze various biologic electronic signals owing to its strong time-frequency ability to decompose local signal characteristics. Better results are obtained. This paper introduces the hardware structure of the network and the intelligent measurement node, the measurement theory and the signal flow of data acquisition and processing.

  4. Femtosecond frequency comb based distance measurement in air.

    Science.gov (United States)

    Balling, Petr; Kren, Petr; Masika, Pavel; van den Berg, S A

    2009-05-25

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The relative agreement for distance measurement in known laboratory conditions is better than 10^-7. According to the model, similar precision seems feasible even for long-distance measurement in air if conditions are sufficiently known. It is demonstrated that the relative width of the interferogram envelope even decreases with the measured length, and a fringe contrast higher than 90% could be obtained for kilometer distances in air, if optimal spectral width for that length and wavelength is used. The possibility of comb radiation delivery to the interferometer by an optical fiber is shown by model and experiment, which is important from a practical point of view.

  5. Statistical characteristics of surrogate data based on geophysical measurements

    Directory of Open Access Journals (Sweden)

    V. Venema

    2006-01-01

    Full Text Available In this study, the statistical properties of a range of measurements are compared with those of their surrogate time series. Seven different records are studied, amongst others, historical time series of mean daily temperature, daily rain sums and runoff from two rivers, and cloud measurements. Seven different algorithms are used to generate the surrogate time series. The best-known method is the iterative amplitude adjusted Fourier transform (IAAFT algorithm, which is able to reproduce the measured distribution as well as the power spectrum. Using this setup, the measurements and their surrogates are compared with respect to their power spectrum, increment distribution, structure functions, annual percentiles and return values. It is found that the surrogates that reproduce the power spectrum and the distribution of the measurements are able to closely match the increment distributions and the structure functions of the measurements, but this often does not hold for surrogates that only mimic the power spectrum of the measurement. However, even the best performing surrogates do not have asymmetric increment distributions, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found deviations of the structure functions on small scales.
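
    The IAAFT algorithm mentioned above can be sketched compactly: alternately enforce the measured power spectrum and the measured amplitude distribution until the surrogate stabilises (iteration count and test data below are illustrative):

```python
import numpy as np

def iaaft(x, n_iter=100, seed=0):
    """Iterative Amplitude Adjusted Fourier Transform surrogate:
    preserves the amplitude distribution exactly and the power
    spectrum approximately, destroying any nonlinear structure."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    target_amp = np.abs(np.fft.rfft(x))   # spectrum to preserve
    sorted_x = np.sort(x)                 # distribution to preserve
    s = rng.permutation(x)                # random starting point
    for _ in range(n_iter):
        # enforce the target power spectrum, keep current phases
        phases = np.angle(np.fft.rfft(s))
        s = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(x))
        # enforce the target amplitude distribution by rank ordering
        ranks = np.argsort(np.argsort(s))
        s = sorted_x[ranks]
    return s

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.3 * rng.random(512)
surr = iaaft(x)
# surrogate values are a permutation of the original values
print(bool(np.allclose(np.sort(surr), np.sort(x))))  # True
```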

  6. Measurements of DSD Second Moment Based on Laser Extinction

    Science.gov (United States)

    Lane, John E.; Jones, Linwood; Kasparis, Takis C.; Metzger, Philip

    2013-01-01

    Using a technique recently developed for estimating the density of surface dust dispersed during a rocket landing, measuring the extinction of a laser passing through rain (or dust in the rocket case) yields an estimate of the 2nd moment of the particle cloud and, in the terrestrial meteorological case, of the rainfall drop size distribution (DSD). With the exception of disdrometers, instruments that measure rainfall make indirect measurements of the DSD. Most common of these instruments is the rainfall rate gauge, measuring the 11/3 moment (when using a D^(2/3) dependency of terminal velocity). Instruments that scatter microwaves off hydrometeors, such as the WSR-88D, vertical wind profilers, and microwave disdrometers, measure the 6th moment of the DSD. By projecting a laser onto a target, changes in brightness of the laser spot against the target background during rain yield a measurement of the DSD 2nd moment, using the Beer-Lambert law. In order to detect the laser attenuation within the 8-bit resolution of most camera image arrays, a minimum path length is required, depending on the intensity of the rainfall rate. For moderate to heavy rainfall, a laser path length of 100 m is sufficient to measure variations in optical extinction using a digital camera. A photo-detector could replace the camera for automated installations. In order to spatially correlate the 2nd moment measurements to a collocated disdrometer or tipping bucket, the laser's beam path can be reflected multiple times using mirrors to restrict the spatial extent of the measurement. In cases where a disdrometer is not available, complete DSD estimates can be produced by parametric fitting of a DSD model to the 2nd moment data in conjunction with tipping bucket data. In cases where a disdrometer is collocated, the laser extinction technique may yield a significant improvement to in-situ disdrometer validation and calibration strategies.
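
    The 2nd-moment retrieval follows directly from the Beer-Lambert law with a geometric-optics extinction efficiency of about 2; a minimal sketch (the brightness drop and path length are illustrative numbers, not the paper's data):

```python
import math

def dsd_second_moment(i_ratio, path_m, q_ext=2.0):
    """Second moment of the drop size distribution from laser
    extinction via Beer-Lambert: I/I0 = exp(-q_ext*(pi/4)*M2*L).
    q_ext ~ 2 holds for drops much larger than the wavelength
    (geometric-optics limit)."""
    beta = -math.log(i_ratio) / path_m        # extinction coeff, 1/m
    return beta / (q_ext * math.pi / 4.0)     # M2, units of 1/m

# A 2% brightness drop of the laser spot over a 100 m path
m2 = dsd_second_moment(0.98, 100.0)
print(f"{m2:.3e}")  # ~1.3e-04
```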

  7. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    Science.gov (United States)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

    Clouds play an important role in many aspects of everyday life. They affect both the local weather and the global climate, and are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models, which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. This kind of instrument should also be automated and robust, since it may be deployed in remote places and be subject to adverse weather conditions. Although clouds are very important in environmental systems, they are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, cloud cover and atmospheric visibility that ensure the safety of pilots and planes. Although there are instruments available in the market to measure those parameters, their relatively high cost makes them unavailable in many local aerodromes. In this work we present a new prototype which has been recently developed and deployed in a local aerodrome as a proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of the cloud height from the parallax effect. The new developments consist of a new geometry which allows the simultaneous measurement of cloud base height, wind speed at cloud base height and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry for measuring the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after the measurement of the contrast between a set of dark objects and the background sky. The prototype includes the latest hardware developments that
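
    Under a simple pinhole model with parallel upward-looking cameras, the cloud height follows from the stereo baseline and the disparity of a matched cloud feature; a minimal sketch (the numbers are hypothetical, not the prototype's calibration, and the prototype's actual geometry is more complex):

```python
def cloud_base_height(baseline_m, focal_px, disparity_px):
    """Cloud base height above two upward-looking cameras from the
    parallax (disparity) of the same cloud feature, pinhole model:
    H = B * f / d, with the focal length f expressed in pixels."""
    return baseline_m * focal_px / disparity_px

# Hypothetical setup: 200 m baseline, 2000 px focal length, and a
# cloud feature displaced 400 px between the two images
print(cloud_base_height(200.0, 2000.0, 400.0))  # 1000.0 m
```

    The same relation shows why a longer baseline improves height resolution: for a fixed pixel disparity error, the height error scales with B*f/d^2.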

  8. Neutrosophic Cubic MCGDM Method Based on Similarity Measure

    Directory of Open Access Journals (Sweden)

    Surapati Pramanik

    2017-06-01

    Full Text Available The notion of neutrosophic cubic set originates from the hybridization of the concepts of neutrosophic set and interval-valued neutrosophic set. We define a similarity measure for neutrosophic cubic sets and prove some of its basic properties.

  9. Ground-based spectral measurements of solar radiation, (2)

    International Nuclear Information System (INIS)

    Murai, Keizo; Kobayashi, Masaharu; Goto, Ryozo; Yamauchi, Toyotaro

    1979-01-01

    A newly designed spectro-pyranometer was used for the measurement of the global (direct + diffuse) and the diffuse sky radiation reaching the ground. By subtracting the diffuse component from the global radiation, we obtained the direct radiation component, which leads to the spectral distribution of the optical thickness (extinction coefficient) of the turbid atmosphere. The measurement of the diffuse sky radiation reveals the scattering effect of aerosols, and that of the global radiation allows the estimation of the total attenuation caused by scattering and absorption of aerosols. The effects of the aerosols are represented by the deviation of the measured real atmosphere from the Rayleigh atmosphere. By combining the measured values with those obtained by theoretical calculation for the model atmosphere, we estimated the amount of absorption by the aerosols. Very strong absorption in the ultraviolet region was recognized. (author)

  10. An Antenna Measurement System Based on Optical Feeding

    Directory of Open Access Journals (Sweden)

    Ryohei Hosono

    2013-01-01

    The advantage of the system is demonstrated by measuring an ultra-wideband (UWB) antenna with both the optical and electrical feeding systems and comparing the results with a calculation. Ripples in the radiation pattern due to electrical feeding are successfully suppressed by optical feeding. For example, in a radiation measurement on the azimuth plane at 3 GHz, a ripple amplitude of 1.0 dB that appeared with electrical feeding is reduced to 0.3 dB. In addition, a circularly polarized (CP) antenna is successfully measured by the proposed system, showing that the system is suitable not only for amplitude but also for phase measurements.

  11. TOWARDS MEASURES OF INTELLIGENCE BASED ON SEMIOTIC CONTROL

    Energy Technology Data Exchange (ETDEWEB)

    C. JOSLYN

    2000-08-01

    We address the question of how to identify and measure the degree of intelligence in systems. We define the presence of intelligence as equivalent to the presence of a control relation. We contrast the distinct atomic semiotic definitions of models and controls, and discuss hierarchical and anticipatory control. We conclude with a suggestion about moving towards quantitative measures of the degree of such control in systems.

  12. A Vision-Based Sensor for Noncontact Structural Displacement Measurement

    Science.gov (United States)

    Feng, Dongming; Feng, Maria Q.; Ozer, Ekin; Fukuda, Yoshio

    2015-01-01

    Conventional displacement sensors have limitations in practical applications. This paper develops a vision sensor system for remote measurement of structural displacements. An advanced template matching algorithm, referred to as the upsampled cross correlation, is adopted and further developed into a software package for real-time displacement extraction from video images. By simply adjusting the upsampling factor, better subpixel resolution can be easily achieved to improve the measurement accuracy. The performance of the vision sensor is first evaluated through a laboratory shaking table test of a frame structure, in which the displacements at all the floors are measured by using one camera to track either high-contrast artificial targets or low-contrast natural targets on the structural surface such as bolts and nuts. Satisfactory agreements are observed between the displacements measured by the single camera and those measured by high-performance laser displacement sensors. Then field tests are carried out on a railway bridge and a pedestrian bridge, through which the accuracy of the vision sensor in both time and frequency domains is further confirmed in realistic field environments. Significant advantages of the noncontact vision sensor include its low cost, ease of operation, and flexibility to extract structural displacement at any point from a single measurement. PMID:26184197
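The core of such a vision sensor, template matching with subpixel refinement, can be sketched in a few lines. The snippet below is a simplified 1-D stand-in for the upsampled cross correlation described above: it locates the integer peak of an FFT-based cross-correlation and refines it with a parabolic fit (a different subpixel strategy than upsampling, chosen for brevity).

```python
import numpy as np

def subpixel_shift(ref, cur):
    # Circular cross-correlation via FFT, then parabolic refinement of the
    # integer peak to subpixel resolution.
    n = len(ref)
    corr = np.fft.ifft(np.fft.fft(cur) * np.conj(np.fft.fft(ref))).real
    k = int(np.argmax(corr))
    y0, y1, y2 = corr[(k - 1) % n], corr[k], corr[(k + 1) % n]
    denom = y0 - 2.0 * y1 + y2
    frac = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    shift = k + frac
    return shift - n if shift > n / 2 else shift   # wrap to signed shift

# Hypothetical intensity profile of a tracked target, shifted by 3 samples.
ref = np.exp(-0.5 * ((np.arange(64) - 32.0) / 4.0) ** 2)
measured = subpixel_shift(ref, np.roll(ref, 3))
```

In the 2-D case the same idea is applied along both image axes, and the pixel shift is scaled to physical displacement by the camera calibration.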

  13. An optomechatronic curvature measurement array based on fiber Bragg gratings

    International Nuclear Information System (INIS)

    Chang, Hsing-Cheng; Lin, Shyan-Lung; Hung, San-Shan; Chang, I-Nan; Chen, Ya-Hui; Lin, Jung-Chih; Liu, Wen-Fung

    2014-01-01

    This study investigated an optomechatronic array-integrated signal processing module and a human–machine interface based on fiber Bragg grating sensing elements embedded in an elastic support matrix, which involves using a self-located electromagnetic mechanism for curvature sensing and solid contour reconstruction. Using bilinear interpolation and average calculation methods, the smooth and accurate surface contours of convex and concave lenses are reconstructed in real time. The elastic supporting optical sensing array is self-balanced to reduce operational errors. Compared with our previous single-head sensor, the sensitivity of the proposed array is improved by more than 15%. In the curvature range from −20.15 to +27.09 m−1, the sensitivities are 3.53 pm m for the convex measurement and 2.15 pm m for the concave measurement, with an error rate below 8.89%. The curvature resolutions are 0.283 and 0.465 m−1 for convex and concave lenses, respectively. This array could be applied in the curvature measurement of solar collectors to monitor energy conversion efficiency, or could be used to monitor the wafer-level thin-film fabrication process. (paper)

  14. Microcomputer-based tests for repeated-measures: Metric properties and predictive validities

    Science.gov (United States)

    Kennedy, Robert S.; Baltzley, Dennis R.; Dunlap, William P.; Wilkes, Robert L.; Kuntz, Lois-Ann

    1989-01-01

    A menu of psychomotor and mental acuity tests was refined. Field applications of such a battery are, for example, a study of the effects of toxic agents or exotic environments on performance readiness, or the determination of fitness for duty. The key requirement of these tasks is that they be suitable for repeated-measures applications, so questions of stability and reliability are a continuing, central focus of this work. After the initial (practice) session, seven replications of 14 microcomputer-based performance tests (32 measures) were completed by 37 subjects. Each test in the battery had previously been shown to stabilize in less than five 90-second administrations and to possess retest reliabilities greater than r = 0.707 for three minutes of testing. However, the tests had never been administered together as a battery, nor had they been self-administered. In order to provide predictive validity for intelligence measurement, the Wechsler Adult Intelligence Scale-Revised and the Wonderlic Personnel Test were obtained on the same subjects.

  15. Volume Recovery of Polymeric Glasses: Application of a Capacitance-based Measurement Technique

    Science.gov (United States)

    Sakib, Nazam; Simon, Sindee

    Glasses, including polymeric glasses, are inherently non-equilibrium materials. As a consequence, the volume and enthalpy of a glass evolve towards equilibrium in a process termed structural recovery. Several open questions and new controversies remain in the field. Specifically, the presence of intermediate plateaus during isothermal structural recovery has been reported in recent enthalpy work. In addition, the dependence of the relaxation time on state variables and thermal history is unclear. Dilatometry is particularly useful for structural recovery studies because volume is an absolute quantity and volumetric measurements can be made in-situ. A capillary dilatometer fitted with a linear variable differential transducer was previously used to measure volume recovery of polymeric glass formers in our laboratory. To overcome the limitations of that methodology, including the trade-off between measurement range and sensitivity, a capacitance-based technique has been developed following the work of Richert (2010). The modification converts the glass capillary dilatometer into a cylindrical capacitor. For precision in capacitance data acquisition, an Andeen-Hagerling ultra-precision capacitance bridge (2550A, 1 kHz) is used. The setup will be validated by reproducing the signatures of structural recovery described by Kovacs (1963). Experiments are also planned to address the open questions in the field.

  16. Web-Based Gerontology Courses: How Do They Measure Up?

    Science.gov (United States)

    Hills, William E.; Brallier, Sara A.; Palm, Linda J.; Graham, Jamie M.

    2009-01-01

    This study compared Web-based and lecture-based Gerontology and Psychology of Aging courses in terms of student performance, demographic and academic characteristics of students enrolled in the courses, and extent to which these characteristics differentially predicted outcomes of learning in the two course types. Participants for this study were…

  17. Method and apparatus of a portable imaging-based measurement with self calibration

    Science.gov (United States)

    Chang, Tzyy-Shuh [Ann Arbor, MI]; Huang, Hsun-Hau [Ann Arbor, MI]

    2012-07-31

    A portable imaging-based measurement device is developed to perform 2D projection based measurements on an object that is difficult or dangerous to access. This device is equipped with self calibration capability and built-in operating procedures to ensure proper imaging based measurement.

  18. A correction scheme for thermal conductivity measurement using the comparative cut-bar technique based on 3D numerical simulation

    International Nuclear Information System (INIS)

    Xing, Changhu; Folsom, Charles; Jensen, Colby; Ban, Heng; Marshall, Douglas W

    2014-01-01

    As an important factor affecting the accuracy of thermal conductivity measurement, systematic (bias) error in the guarded comparative axial heat flow (cut-bar) method was mostly neglected in previous research. This bias is primarily due to the thermal conductivity mismatch between the sample and the meter bars (reference), which is common for a sample of unknown thermal conductivity. A correction scheme, based on finite element simulation of the measurement system, is proposed to reduce the magnitude of the overall measurement uncertainty. The scheme was experimentally validated by applying corrections to four types of sample measurements in which the specimen thermal conductivity is much smaller than, slightly smaller than, equal to, and much larger than that of the meter bar. As an alternative to the optimum guarding technique proposed before, the correction scheme can be used to minimize the uncertainty contribution from the measurement system under non-optimal guarding conditions. It is especially necessary for large thermal conductivity mismatches between the sample and the meter bars. (paper)

  19. Challenging previous conceptions of vegetarianism and eating disorders.

    Science.gov (United States)

    Fisak, B; Peterson, R D; Tantleff-Dunn, S; Molnar, J M

    2006-12-01

    The purpose of this study was to replicate and expand upon previous research that has examined the potential association between vegetarianism and disordered eating. Limitations of previous research studies are addressed, including possible low reliability of measures of eating pathology within vegetarian samples, use of only a few dietary restraint measures, and a paucity of research examining potential differences in body image and food choice motives of vegetarians versus nonvegetarians. Two hundred and fifty-six college students completed a number of measures of eating pathology and body image, and a food choice motives questionnaire. Interestingly, no significant differences were found between vegetarians and nonvegetarians in measures of eating pathology or body image. However, significant differences in food choice motives were found. Implications for both researchers and clinicians are discussed.

  20. Bite force measurement based on fiber Bragg grating sensor

    Science.gov (United States)

    Padma, Srivani; Umesh, Sharath; Asokan, Sundarrajan; Srinivas, Talabattula

    2017-10-01

    The maximum level of voluntary bite force, which results from the combined action of the muscles of mastication, joints, and teeth, i.e., the craniomandibular structure, is considered one of the major indicators of the functional state of the masticatory system. Measurement of voluntary bite force provides useful data on jaw muscle function and activity, along with the assessment of prosthetics. This study proposes an in vivo methodology for the dynamic measurement of bite force employing a fiber Bragg grating (FBG) sensor, known as the bite force measurement device (BFMD). The BFMD is a noninvasive intraoral device which transduces the bite force exerted at the occlusal surface into strain variations on a metal plate. These strain variations are acquired by the FBG sensor bonded over it. The BFMD facilitates adjustment of the distance between the biting platforms, which is essential to capture the maximum voluntary bite force at three different teeth positions, namely the incisor, premolar, and molar sites. The clinically relevant bite forces are measured at the incisor, molar, and premolar positions and compared against each other. Furthermore, the bite forces measured for all subjects are segregated by gender and also compared against each other.

  1. Mobile platform of altitude measurement based on a smartphone

    Science.gov (United States)

    Roszkowski, Paweł; Kowalczyk, Marcin

    2016-09-01

    The article presents a low-cost, fully functional meter of altitude and pressure changes in the form of a mobile application running on Android OS. The measurements are possible thanks to the pressure sensor built into the majority of modern mobile phones, known as smartphones. Using their computing capabilities and other components, such as the GPS receiver, in combination with data from the sensor enabled the authors to create a sophisticated handheld measuring platform with many unique features. One of them is an altitude-map drawing mode, in which the user can create maps of altitude changes simply by moving around the examined area. Another is a convenient mode for altitude measurement. The application is also extended with analysis tools which make it possible to compare measured values by displaying the data as plots. The platform includes an external backup server where the user can secure all gathered data. Moreover, the results of the accuracy examination carried out after building the solution are presented. Finally, the realized altitude meter is compared with other popular altimeters currently available on the market.
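The pressure-to-altitude conversion such an app performs is typically the international barometric formula; a minimal sketch, assuming the standard-atmosphere constants (the app itself may calibrate against a user-supplied reference pressure):

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    # International barometric formula (ISA troposphere approximation):
    # h = 44330 * (1 - (p / p0)^(1 / 5.255)), with p0 the sea-level pressure.
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))
```

At sea-level pressure the formula returns 0 m, and around 899 hPa it yields roughly 1000 m, consistent with the resolution a smartphone barometer can support.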

  2. Development of a field measurement instrument for nuclear electromagnetic pulse (NEMP) based on signal transmission through fiber

    International Nuclear Information System (INIS)

    Song Wenwu; Zhang Chuandong; Liu Yi; Chen Jiuchun; Fan Youwen

    2007-01-01

    This paper describes the design principles, development and performance of a field measurement instrument for nuclear electromagnetic pulse (NEMP) based on signal transmission through fiber. To determine the minimum bandwidth the instrument needs, we analyze the cutoff spectrum of a time-domain double-exponential signal using the Fast Fourier Transform (FFT) and obtain its inverse-transform signal. We then design the laser-device circuit and the measuring-device circuit according to this analysis. The instrument meets the requirements of the related regulations, and its specifications meet the requirements of NEMP hazard protection research. (authors)

  3. Distance-Based Functional Diversity Measures and Their Decomposition: A Framework Based on Hill Numbers

    Science.gov (United States)

    Chiu, Chun-Huo; Chao, Anne

    2014-01-01

    Hill numbers (or the “effective number of species”) are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify “the effective number of equally abundant and (functionally) equally distinct species” in an assemblage. We also propose a class of mean functional diversity (per species), which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total) functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity) quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma) can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation) measures, including N-assemblage functional generalizations of

  4. Distance-based functional diversity measures and their decomposition: a framework based on Hill numbers.

    Directory of Open Access Journals (Sweden)

    Chun-Huo Chiu

    Full Text Available Hill numbers (or the "effective number of species") are increasingly used to characterize species diversity of an assemblage. This work extends Hill numbers to incorporate species pairwise functional distances calculated from species traits. We derive a parametric class of functional Hill numbers, which quantify "the effective number of equally abundant and (functionally) equally distinct species" in an assemblage. We also propose a class of mean functional diversity (per species), which quantifies the effective sum of functional distances between a fixed species to all other species. The product of the functional Hill number and the mean functional diversity thus quantifies the (total) functional diversity, i.e., the effective total distance between species of the assemblage. The three measures (functional Hill numbers, mean functional diversity and total functional diversity) quantify different aspects of species trait space, and all are based on species abundance and species pairwise functional distances. When all species are equally distinct, our functional Hill numbers reduce to ordinary Hill numbers. When species abundances are not considered or species are equally abundant, our total functional diversity reduces to the sum of all pairwise distances between species of an assemblage. The functional Hill numbers and the mean functional diversity both satisfy a replication principle, implying the total functional diversity satisfies a quadratic replication principle. When there are multiple assemblages defined by the investigator, each of the three measures of the pooled assemblage (gamma) can be multiplicatively decomposed into alpha and beta components, and the two components are independent. The resulting beta component measures pure functional differentiation among assemblages and can be further transformed to obtain several classes of normalized functional similarity (or differentiation) measures, including N-assemblage functional
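A minimal numerical sketch of the measures described above, following the formulas in Chiu and Chao (2014) for order q ≠ 1 (the q = 1 limit requires a separate expression and is omitted here):

```python
import numpy as np

def functional_hill(p, d, q=2.0):
    # Functional Hill number of order q (q != 1):
    #   qD(Q) = [ sum_ij d_ij (p_i p_j)^q / Q ]^(1 / (2 (1 - q))),
    # where Q = sum_ij d_ij p_i p_j is Rao's quadratic entropy.
    p = np.asarray(p, float)
    d = np.asarray(d, float)
    pp = np.outer(p, p)
    Q = (d * pp).sum()
    return ((d * pp ** q).sum() / Q) ** (1.0 / (2.0 * (1.0 - q)))

def total_functional_diversity(p, d, q=2.0):
    # Total functional diversity FD(Q) = Q * (qD)^2: the effective total
    # distance between species of the assemblage.
    p = np.asarray(p, float)
    d = np.asarray(d, float)
    Q = (d * np.outer(p, p)).sum()
    return Q * functional_hill(p, d, q) ** 2

# Hypothetical 3-species example: equal abundances, arbitrary distances.
p = [1 / 3, 1 / 3, 1 / 3]
d = [[0, 1, 2], [1, 0, 3], [2, 3, 0]]
```

With equal abundances the functional Hill number equals the species richness S (here 3), and the total functional diversity equals the sum of all pairwise distances (here 12), matching the reductions stated in the abstract.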

  5. A Pseudorange Measurement Scheme Based on Snapshot for Base Station Positioning Receivers.

    Science.gov (United States)

    Mo, Jun; Deng, Zhongliang; Jia, Buyun; Bian, Xinmei

    2017-12-01

    The digital multimedia broadcasting signal is a promising wireless positioning signal. This paper mainly studies a multimedia broadcasting technology, China mobile multimedia broadcasting (CMMB), in the context of positioning. Theoretical and practical analysis of the CMMB signal suggests that the existing signal does not have meter-level positioning capability. The CMMB system was therefore modified to achieve meter-level positioning by multiplexing the CMMB signal and pseudo codes in the same frequency band. The time difference of arrival (TDOA) estimation method is used in base station positioning receivers. Due to the influence of a complex fading channel and the limited bandwidth of receivers, regular tracking methods based on pseudo-code ranging have difficulty providing continuous and accurate TDOA estimates. A pseudorange measurement scheme based on snapshots is proposed to solve this problem. The algorithm extracts the TDOA estimate from stored signal fragments and utilizes the Taylor expansion of the autocorrelation function to improve its accuracy. Monte Carlo simulations and real data tests show that the proposed algorithm significantly reduces the TDOA estimation error for base station positioning receivers, and that the modified CMMB system achieves meter-level positioning accuracy.

  6. Toward autonomous measurements of photosynthetic electron transport rates: An evaluation of active fluorescence-based measurements of photochemistry

    NARCIS (Netherlands)

    Silsbe, G.M.; Oxborough, K.; Suggett, D.J.; Forster, R.M.; Ihnken, S.; Komárek, O.; Lawrenz, E.; Prášil, O.; Röttgers, R.; Šicner, M.; Simis, S.G.H.; Van Dijk, M.A.; Kromkamp, J.C.

    2015-01-01

    This study presents a methods evaluation and intercalibration of active fluorescence-based measurements of the quantum yield and the absorption coefficient of photosystem II (PSII) photochemistry. Measurements of these two quantities and of irradiance (E) can be

  7. Measurement properties of performance-based measures to assess physical function in hip and knee osteoarthritis: a systematic review

    NARCIS (Netherlands)

    Dobson, F.; Hinman, R.S.; Leverstein-van Hall, M.A.; Terwee, C.B.; Roos, E.M.; Bennell, K.L.

    2012-01-01

    Objectives: To systematically review the measurement properties of performance-based measures to assess physical function in people with hip and/or knee osteoarthritis (OA). Methods: Electronic searches were performed in MEDLINE, CINAHL, Embase, and PsycINFO up to the end of June 2012. Two reviewers

  8. Ground-based intercomparison of two isoprene measurement techniques

    Directory of Open Access Journals (Sweden)

    E. Leibrock

    2003-01-01

    Full Text Available An informal intercomparison of two isoprene (C5H8) measurement techniques was carried out during the fall of 1998 at a field site located approximately 3 km west of Boulder, Colorado, USA. A new chemical ionization mass spectrometric technique (CIMS) was compared to a well-established gas chromatographic technique (GC). The CIMS technique utilized benzene cation chemistry to ionize isoprene. The isoprene levels measured by the CIMS were often larger than those obtained with the GC. The results indicate that the CIMS technique suffered from an anthropogenic interference associated with air masses from the Denver, CO metropolitan area, as well as an additional interference occurring under clean conditions. However, the CIMS technique is also demonstrated to be sensitive and fast; especially after the introduction of a tandem mass spectrometric technique, it is therefore a candidate for isoprene measurements in remote environments near isoprene sources.

  9. A generalized complexity measure based on Rényi entropy

    Science.gov (United States)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or patterns, far beyond the entropy measures. They are intuitively constructed to be minima at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not minimal for maximal randomness in general. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
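For orientation, the LMC-Rényi family the abstract refers to can be written as C = exp(R_α − R_β) with α < β, built from Rényi entropies. The sketch below illustrates only this baseline family, which equals 1 for both the uniform and the one-point distribution in the discrete case; as the authors note, it is not minimal for maximal randomness in general, which is what their proposed generalization addresses.

```python
import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy R_alpha = ln(sum_i p_i^alpha) / (1 - alpha), alpha != 1.
    p = np.asarray(p, float)
    p = p[p > 0]                      # 0 * log 0 convention
    return np.log((p ** alpha).sum()) / (1.0 - alpha)

def lmc_renyi_complexity(p, alpha=0.5, beta=2.0):
    # LMC-Rényi-type complexity C = exp(R_alpha - R_beta), alpha < beta.
    # Since R_alpha is non-increasing in alpha, C >= 1, with equality for
    # the uniform and the one-point (perfectly ordered) distributions.
    return np.exp(renyi_entropy(p, alpha) - renyi_entropy(p, beta))
```

The choice α = 0.5, β = 2 is only an illustrative pair of orders, not one prescribed by the paper.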

  10. Template measurement for plutonium pit based on neural networks

    International Nuclear Information System (INIS)

    Zhang Changfan; Gong Jian; Liu Suping; Hu Guangchun; Xiang Yongchun

    2012-01-01

    Template measurement for a plutonium pit extracts characteristic data from the γ-ray spectrum and the neutron counts emitted by the plutonium. The characteristic data of a suspicious object are compared with the data of the declared plutonium pit to verify whether they are of the same type. In this paper, neural networks are introduced as the comparison algorithm for template measurement of plutonium pits. Two kinds of neural networks are created, i.e. the BP and LVQ neural networks, and they are applied to different aspects of template measurement and identification. The BP neural network is used for classification of different types of plutonium pits, which is often needed for the management of nuclear materials. The LVQ neural network is used for comparison of an inspected object with the declared one, which is usually applied in the field of nuclear disarmament and verification. (authors)

  11. Micro-Structure Measurement and Imaging Based on Digital Holography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kyeong Suk; Jung, Hyun Chul; Chang, Ho Seob; Akhter, Naseem [Chosun University, Gwangju (Korea, Republic of); Kee, Chang Doo [Chonnam National University, Gwangju (Korea, Republic of)

    2010-06-15

    Advancements in imaging and computing technology have opened the path to digital holography for non-destructive investigation of technical samples, material property measurement, vibration analysis, flow visualization, and stress analysis in the aerospace industry, which has widened the application of digital holography in these fields. In this paper, we demonstrate the application of digital holography to non-destructive investigation and micro-structure measurement of small particles and a biological sample. The paper gives a brief description of the digital holograms recorded with this system and demonstrates them illustratively.

  12. Estimation of piping temperature fluctuations based on external strain measurements

    International Nuclear Information System (INIS)

    Morilhat, P.; Maye, J.P.

    1993-01-01

    Due to the difficulty of carrying out measurements at the inner sides of nuclear reactor piping subjected to thermal transients, temperature and stress variations in the pipe walls are estimated by means of external thermocouples and strain gauges. This inverse problem is solved by spectral analysis. Since the wall harmonic transfer function (the response to a harmonic load) is known, the inner-side signal is obtained by convolution of the inverse transfer function of the system with the strain measurement. This enables detection of internal temperature fluctuations in a frequency range beyond the scope of the thermocouples. (authors). 5 figs., 3 refs

  13. Micro-Structure Measurement and Imaging Based on Digital Holography

    International Nuclear Information System (INIS)

    Kim, Kyeong Suk; Jung, Hyun Chul; Chang, Ho Seob; Akhter, Naseem; Kee, Chang Doo

    2010-01-01

    Advancements in imaging and computing technology have opened the path to digital holography for non-destructive investigation of technical samples, material property measurement, vibration analysis, flow visualization, and stress analysis in the aerospace industry, which has widened the application of digital holography in these fields. In this paper, we demonstrate the application of digital holography to non-destructive investigation and micro-structure measurement of small particles and a biological sample. The paper gives a brief description of the digital holograms recorded with this system and demonstrates them illustratively.

  14. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    Science.gov (United States)

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

    Reports of poor nursing care have focused attention on values-based selection of candidates onto nursing programmes. Values-based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance, and is conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence, and 3) social connection scores on performance and retention in a cohort of first year nursing and midwifery students in Scotland. A longitudinal, quasi-experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish University. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-short form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish University, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 did not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p<0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. Lower scores on the social connection factor were associated

  15. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi; Sun, Ying

    2016-01-01

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widens their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.

  17. Deterministic Predictions of Vessel Responses Based on Past Measurements

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2017-01-01

    The paper deals with a prediction procedure from which global wave-induced responses can be deterministically predicted a short time, 10-50 s, ahead of current time. The procedure relies on the autocorrelation function and takes into account prior measurements only; i.e. knowledge about wave...

  18. Measurement-based local quantum filters and their ability to ...

    Indian Academy of Sciences (India)

    Debmalya Das

    2017-05-30

    Entanglement; local filters; quantum measurement. PACS No. 03.65.

  19. Comet: An internet based platform for education in measurement

    NARCIS (Netherlands)

    Regtien, Paulus P.L.; Halaj, Martin; Kureková, Eva; Gabko, Peter

    2005-01-01

    The project COMET provides a multimedia training package for metrology and measurement. The package is developed by a consortium of 10 institutes from 7 European countries. It consists of 31 modules, each dealing with a particular aspect of metrology, and is available in English, German, French and

  20. COMET: A multimedia internet based platform for education in measurement

    NARCIS (Netherlands)

    Grattan, K.T.V.; Regtien, Paulus P.L.; Halaj, M; Kureková, E.; Gabko, P

    2006-01-01

    The project COMET provides a multimedia training package for metrology and measurement. The package is developed by a consortium of 10 institutes from 7 European countries. It consists of 31 modules, each dealing with a particular aspect of metrology, and is available in English, German, French and

  1. TRIM timber projections: an evaluation based on forest inventory measurements.

    Science.gov (United States)

    John R. Mills

    1989-01-01

    Two consecutive timberland inventories collected from permanent plots in the natural pine type in North Carolina were used to evaluate the timber resource inventory model (TRIM). This study compares model predictions with field measurements and examines the effect of inventory data aggregation on the accuracy of projections. Projections were repeated for two geographic...

  2. Psycho-Pedagogical Measuring Bases of Educational Competences of Students

    Science.gov (United States)

    Kenzhegaliev, Kulush K.; Shayakhmetova, Aisulu A.; Zulkarnayeva, Zhamila A.; Iksatova, Balzhan K.; Shonova, Bakytgul A.

    2016-01-01

    The relevance of the research problem stems from the weak development of measurement and assessment of educational competences at an operational level (the level of actions), and from insufficient application of psycho-pedagogical theories and methods of mathematical statistics. The aim of the work is to develop through teaching experiments the…

  3. Quality measures for HRR alignment based ISAR imaging algorithms

    CSIR Research Space (South Africa)

    Janse van Rensburg, V

    2013-05-01

    Full Text Available Some Inverse Synthetic Aperture Radar (ISAR) algorithms form the image in a two-step process of range alignment and phase conjugation. This paper discusses a comprehensive set of measures used to quantify the quality of range alignment, with the aim...

  4. Measuring Clearance Mechanics Based on Dynamic Leg Length

    Science.gov (United States)

    Khamis, Sam; Danino, Barry; Hayek, Shlomo; Carmeli, Eli

    2018-01-01

    The aim of this study was to quantify clearance mechanics during gait. Seventeen children diagnosed with hemiplegic cerebral palsy underwent a three-dimensional gait analysis evaluation. Dynamic leg lengths were measured from the hip joint center to the heel, to the ankle joint center and to the forefoot throughout the gait cycle. Significant…

  5. Microflown based monopole sound sources for reciprocal measurements

    NARCIS (Netherlands)

    Bree, H.E. de; Basten, T.G.H.

    2008-01-01

    Monopole sound sources (i.e. omnidirectional sound sources with a known volume velocity) are essential for reciprocal measurements used in vehicle interior panel noise contribution analysis. Until recently, these monopole sound sources used a sound pressure transducer as the reference sensor. A

  6. Image-Based Collection and Measurements for Construction Pay Items

    Science.gov (United States)

    2017-07-01

    Prior to each payment to contractors and suppliers, measurements are made to document the actual amount of pay items placed at the site. This manual process has substantial risk for personnel, and could be made more efficient and less prone to human ...

  7. Coordination of two robot manipulators based on position measurements only

    NARCIS (Netherlands)

    Rodriguez Angeles, A.; Nijmeijer, H.

    2001-01-01

    In this note we propose a controller that solves the problem of coordination of two (or more) robots, under a master-slave scheme, in the case when only position measurements are available. The controller consists of a feedback control law, and two non-linear observers. It is shown that the

  8. Height estimations based on eye measurements throughout a gait cycle

    DEFF Research Database (Denmark)

    Yang, Sylvia X M; Larsen, Peter K; Alkjær, Tine

    2014-01-01

    Eye height (EH) measurement, on the other hand, is less prone to concealment. The purpose of the present study was to investigate: (1) how the eye height varies during the gait cycle, and (2) how the eye height changes with head position. The eyes were plotted manually in APAS for 16 test subjects during...

  9. Basing of a complex of design measures for protection against fire

    International Nuclear Information System (INIS)

    Kryuger, V.

    1983-01-01

    Fire impact on NPP radiation safety is analyzed. The general industry requirements for protection systems against fire are shown to be insufficient for NPPs. A complex of protection measures against fire is suggested that should be taken into account in NPP designs.

  10. Simulated BRDF based on measured surface topography of metal

    Science.gov (United States)

    Yang, Haiyue; Haist, Tobias; Gronle, Marc; Osten, Wolfgang

    2017-06-01

    The radiative reflective properties of a calibration-standard rough surface were simulated by ray tracing and the finite-difference time-domain (FDTD) method. The simulation results were used to compute the bidirectional reflectance distribution functions (BRDFs) of metal surfaces and were compared with experimental measurements. The experimental and simulated results are in good agreement.

  11. Recruitment recommendation system based on fuzzy measure and indeterminate integral

    Science.gov (United States)

    Yin, Xin; Song, Jinjie

    2017-08-01

    In this study, we propose a comprehensive evaluation approach based on the indeterminate integral. By introducing the related concepts of the indeterminate integral and its formulas into the recruitment recommendation system, we can calculate the suitability of each job for different applicants, given the defined importance of each criterion listed in the job advertisements, the associations between criteria, and subjective assessments. Recommendations are then made to applicants by ranking the suitability scores of the jobs from high to low. Finally, we exemplify the usefulness and practicality of this system with samples.

  12. Remote measurement of microwave distribution based on optical detection

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Zhong; Ding, Wenzheng; Yang, Sihua; Chen, Qun, E-mail: redrocks-chenqun@hotmail.com, E-mail: xingda@scnu.edu.cn; Xing, Da, E-mail: redrocks-chenqun@hotmail.com, E-mail: xingda@scnu.edu.cn [MOE Key Laboratory of Laser Life Science and Institute of Laser Life Science, South China Normal University, Guangzhou 510631 (China)

    2016-01-04

    In this letter, we present the development of a remote microwave measurement system. This method employs an arc discharge lamp that serves as an energy converter from microwave to visible light, which can propagate without a transmission medium. Observed with a charge-coupled device, the quantitative microwave power distribution can be obtained while the operators and electronic instruments remain at a distance from the high-power region, reducing the potential risk. We perform the experiments using pulsed microwaves, and the results show that the system response depends on the microwave intensity over a certain range. Most importantly, the microwave distribution can be monitored in real time by optical observation of the response of a one-dimensional lamp array. The characteristics of low cost, a wide detection bandwidth, remote measurement, and room-temperature operation make the system a preferred detector for microwave applications.

  13. [Automated measurement of distance vision based on the DIN strategy].

    Science.gov (United States)

    Effert, R; Steinmetz, H; Jansen, W; Rau, G; Reim, M

    1989-07-01

    A method for automated measurement of far vision is described which meets the test requirements laid down in the new DIN standards. The subject sits 5 m from a high-resolution monitor on which either Landolt rings or Snellen's types are generated by a computer. By moving a joystick the subject indicates to the computer whether he can see the critical detail (e.g., the direction of opening of the Landolt ring). Depending on the subject's input and the course of the test so far, the computer generates the next test symbol until the threshold criterion is reached. The sequence of presentation of the symbols and the threshold criterion are also in accordance with the DIN standard. Initial measurements of far vision using this automated system produced similar results to those obtained by conventional methods.
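    The adaptive presentation logic described above (the computer chooses the next optotype based on the responses so far until a threshold criterion is reached) can be illustrated with a toy staircase. This Python sketch is not the DIN procedure itself; the 1-up/1-down rule, the step size, and the deterministic simulated observer are simplifying assumptions.

```python
def staircase(threshold, start=1.0, step=0.1, reversals_needed=6):
    """Simple 1-up/1-down staircase converging on the optotype size at
    which responses flip between seen and not seen (simulated observer)."""
    size = start
    history = []
    reversals = 0
    last_dir = 0
    while reversals < reversals_needed:
        seen = size >= threshold          # deterministic simulated subject
        direction = -1 if seen else +1    # shrink if seen, grow if missed
        if last_dir and direction != last_dir:
            reversals += 1                # a reversal brackets the threshold
        last_dir = direction
        size = max(0.05, size + direction * step)
        history.append(size)
    return sum(history[-4:]) / 4          # average of the last presentations

est = staircase(threshold=0.42)
```

    Real acuity testing would randomize symbol orientation and use probabilistic psychometric responses; the point here is only the converge-to-threshold control flow.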

  14. A new consensus measure based on Pearson correlation coefficient

    OpenAIRE

    Chiclana, Francisco; Gonzalez-Arteaga, Teresa; de Andres Calle, Rocio

    2016-01-01

    Obtaining consensual solutions is an important issue in decision making processes. It depends on several factors such as experts’ opinions, principles, knowledge, experience, etc. In the literature we can find a considerable amount of consensus measurement from different research areas (from a Social Choice perspective: Alcalde-Unzu and Vorsatz [1], Alcantud, de Andres Calle and Cascon [2] and Bosch [3], among others and from Decision Making Theory: Gonzalez-Arteaga, Alcantud and ...
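    The core idea, measuring group consensus through the correlation of individual assessments, can be sketched as follows. This is an illustrative reduction (average pairwise Pearson correlation of the experts' rating vectors), not the exact measure proposed in the paper; the rating matrices are made-up examples.

```python
import numpy as np

def consensus(ratings):
    """Average pairwise Pearson correlation across experts' rating vectors.
    ratings: (n_experts, n_alternatives) array. Returns a value in [-1, 1]."""
    n = ratings.shape[0]
    corr = np.corrcoef(ratings)
    iu = np.triu_indices(n, k=1)          # off-diagonal expert pairs
    return corr[iu].mean()

# three experts rating four alternatives: near-agreement vs. disagreement
agree = np.array([[1, 2, 3, 4], [1, 2, 3, 5], [2, 2, 3, 4]], float)
disagree = np.array([[1, 2, 3, 4], [4, 3, 2, 1], [2, 4, 1, 3]], float)
```

    A group whose members order the alternatives similarly scores near 1, while opposed orderings pull the measure toward -1.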

  15. EPR-based distance measurements at ambient temperature

    Science.gov (United States)

    Krumkacheva, Olesya; Bagryanskaya, Elena

    2017-07-01

    Pulsed dipolar (PD) EPR spectroscopy is a powerful technique allowing for distance measurements between spin labels in the range of 2.5-10.0 nm. It was proposed more than 30 years ago, and nowadays is widely used in biophysics and materials science. Until recently, PD EPR experiments were limited to cryogenic temperatures (T biomolecules, the influence of a linker between the spin probe and biomolecule, and future opportunities.

  16. Comparison of ground-based and Viking Orbiter measurements of Martian water vapor - Variability of the seasonal cycle

    Science.gov (United States)

    Jakosky, B. M.; Barker, E. S.

    1984-01-01

    Earth-based observations of Mars atmospheric water vapor are presented for the 1975-1976, 1977-1978, and 1983 apparitions. Comparisons are made with near-simultaneous spacecraft measurements made from the Viking Orbiter Mars Atmospheric Water Detection experiment during 1976-1978 and with previous earth-based measurements. Differences occur between the behavior in the different years, and may be related to the Mars climate. Measurements during the southern summer in 1969 indicate three times as much water as is present at the same season in other years. This difference may have resulted from the sublimation of water from the south polar residual cap upon removal of most or all of the CO2 ice present; sublimation of all of the CO2 ice during some years could be a result of a greater thermal load being placed on the cap due to the presence of differing amounts of atmospheric dust.

  17. Real-time temperature field measurement based on acoustic tomography

    International Nuclear Information System (INIS)

    Bao, Yong; Jia, Jiabin; Polydorides, Nick

    2017-01-01

    Acoustic tomography can be used to measure the temperature field from the time-of-flight (TOF). In order to capture real-time temperature field changes and accurately yield quantitative temperature images, two improvements to the conventional acoustic tomography system are studied: simultaneous acoustic transmission and TOF collection along multiple ray paths, and an offline iteration reconstruction algorithm. During system operation, all the acoustic transceivers send modulated and filtered wideband Kasami sequences simultaneously to facilitate fast and accurate TOF measurements using cross-correlation detection. For image reconstruction, the iteration process is separated and executed offline beforehand to shorten computation time for online temperature field reconstruction. The feasibility and effectiveness of the developed methods are validated in the simulation study. The simulation results demonstrate that the proposed method can reduce the processing time per frame from 160 ms to 20 ms, while the reconstruction error remains less than 5%. Hence, the proposed method has great potential in the measurement of rapid temperature change with good temporal and spatial resolution. (paper)
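    The TOF-by-cross-correlation step the system relies on can be sketched independently of the acoustic hardware. In this Python example the sampling rate, sequence length, delay, and noise level are arbitrary stand-ins; the paper uses modulated, filtered Kasami sequences, while a white-noise probe is used here for simplicity.

```python
import numpy as np

def tof_crosscorr(tx, rx, fs):
    """Estimate time-of-flight as the lag maximising the cross-correlation
    between the transmitted and received sequences (sampled at fs Hz)."""
    corr = np.correlate(rx, tx, mode="full")
    lag = corr.argmax() - (len(tx) - 1)   # shift from zero-lag position
    return lag / fs

fs = 100_000                      # 100 kHz sampling, an assumed rate
rng = np.random.default_rng(1)
tx = rng.standard_normal(256)     # wideband probe sequence (Kasami-like)
delay = 37                        # true propagation delay in samples
rx = np.concatenate([np.zeros(delay), tx]) + 0.1 * rng.standard_normal(256 + delay)

tof = tof_crosscorr(tx, rx, fs)
```

    Sharp autocorrelation of the probe sequence is what makes the peak detection robust when many transceivers transmit simultaneously.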

  18. Grain bulk density measurement based on wireless network

    Directory of Open Access Journals (Sweden)

    Wu Fangming

    2017-01-01

    Full Text Available To know the accurate quantity of stored grain, grain density sensors must be used to measure the grain’s bulk density. However, multiple sensors should be inserted into the storage facility to quickly collect data during inventory checking of stored grain. In this study, the ability of a coexisting ZigBee and Wi-Fi network to transmit data collected by density sensors was investigated. A system consisting of six sensor nodes, six router nodes, one gateway and one Android Pad was assembled to measure the grain’s bulk density and calculate its quantity. The CC2530 chip with ZigBee technology served as the core of information processing and wireless node detection in the sensor and router nodes. ZigBee operated on a different signal channel from Wi-Fi to avoid interference, and connected to the Wi-Fi module through a UART serial communication interface in the gateway. The Android Pad received the measured data through the gateway and processed this data to calculate the quantity. The system enabled multi-point, real-time parameter detection inside the grain storage. Results show that the system has good expansibility, networking flexibility and convenience.

  19. Patient Specific Dosimetry based in excreted urine measurements

    Energy Technology Data Exchange (ETDEWEB)

    Barquero, R.; Nunez, C.; Ruiz, A.; Valverde, J.; Basurto, F.

    2006-07-01

    One of the limiting factors in utilising therapeutic radiopharmaceuticals in I-131 thyroid therapy is the potential hazard to the bone marrow, kidneys, and other internal organs. In this work, by means of daily dose rate measurements at a point in contact with the can holding the urine excreted by the patient undergoing radio-iodine therapy, activities and associated absorbed doses in the total body are calculated. The urine can is characterised by a geometric and materials model for MC simulation with MCNP. Knowing the conversion factor from activity in urine to dose rate at the measurement point of the can for each filling volume, the urine and patient activity can be obtained at each measurement time. From the fitting of these activities, the time evolution, the effective half-life in the patient and the cumulative whole-body activity are calculated. The emission characteristics of I-131 are then used to estimate the maximum whole-body absorbed dose. The results for 2 hyperthyroidism and 4 carcinoma treatments are presented. The maximum total-body absorbed doses are 673 and 149 Gy for the carcinoma and hyperthyroidism treatments, respectively. The corresponding ranges of T1/2,eff are 0.2 to 2.5 days (carcinoma) and 5.4 to 6.6 days (hyperthyroidism). (Author)
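    The activity-fitting step described above (fit the time evolution of activity, extract the effective half-life, integrate for cumulated activity) reduces to a log-linear regression. This Python sketch uses synthetic data with an assumed initial activity of 3700 MBq and an effective half-life of 2.5 d; it illustrates the fitting idea only, not the authors' MCNP-calibrated workflow.

```python
import numpy as np

def effective_half_life(t_days, activity):
    """Log-linear least-squares fit of A(t) = A0 * exp(-lam * t).
    Returns (A0, effective half-life in days, cumulated activity A0/lam)."""
    slope, intercept = np.polyfit(t_days, np.log(activity), 1)
    lam = -slope
    a0 = np.exp(intercept)
    return a0, np.log(2) / lam, a0 / lam

# synthetic daily measurements with T1/2,eff = 2.5 d (a carcinoma-like value)
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
a = 3700.0 * np.exp(-np.log(2) / 2.5 * t)    # MBq, assumed initial activity

a0, t_half, cumulated = effective_half_life(t, a)
```

    The cumulated activity A0/lam is the time integral of A(t) from zero to infinity, which is the quantity multiplied by the emission characteristics of I-131 to estimate absorbed dose.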

  20. Smartphone-based quantitative measurements on holographic sensors.

    Science.gov (United States)

    Khalili Moghaddam, Gita; Lowe, Christopher Robin

    2017-01-01

    The research reported herein integrates a generic holographic sensor platform and a smartphone-based colour quantification algorithm in order to standardise and improve the determination of the concentration of analytes of interest. The utility of this approach has been exemplified by analysing the replay colour of the captured image of a holographic pH sensor in near real-time. Personalised image encryption followed by a wavelet-based image compression method were applied to secure the image transfer across a bandwidth-limited network to the cloud. The decrypted and decompressed image was processed through four principal steps: Recognition of the hologram in the image with a complex background using a template-based approach, conversion of device-dependent RGB values to device-independent CIEXYZ values using a polynomial model of the camera and computation of the CIEL*a*b* values, use of the colour coordinates of the captured image to segment the image, select the appropriate colour descriptors and, ultimately, locate the region of interest (ROI), i.e. the hologram in this case, and finally, application of a machine learning-based algorithm to correlate the colour coordinates of the ROI to the analyte concentration. Integrating holographic sensors and the colour image processing algorithm potentially offers a cost-effective platform for the remote monitoring of analytes in real time in readily accessible body fluids by minimally trained individuals.

  2. Comparing econometric and survey-based methodologies in measuring offshoring

    DEFF Research Database (Denmark)

    Refslund, Bjarke

    2016-01-01

    such as the national or regional level. Most macro analyses are based on proxies and trade statistics, which have limitations. Drawing on unique Danish survey data, this article demonstrates how survey data can provide important insights into the national scale and impacts of offshoring, including changes of employment...

  3. Modelling of autogenous shrinkage of concrete based on paste measurements

    NARCIS (Netherlands)

    Schlangen, E.; Leegwater, G.; Koenders, E.A.B.

    2006-01-01

    In order to improve concrete modelling based on its constituents, more knowledge is needed about the material behaviour of these constituents. In this research the focus is on the behaviour of hardening concrete; therefore, the properties of hardening cement are of most relevance.

  4. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  5. Intrusion detection method based on nonlinear correlation measure

    NARCIS (Netherlands)

    Ambusaidi, Mohammed A.; Tan, Zhiyuan; He, Xiangjian; Nanda, Priyadarsi; Lu, Liang Fu; Jamdagni, Aruna

    2014-01-01

    Cyber crimes and malicious network activities have posed serious threats to the entire internet and its users. This issue is becoming more critical as network-based services are more widespread and closely related to our daily life. Thus, it has raised a serious concern in individual internet

  6. Underestimation of Severity of Previous Whiplash Injuries

    Science.gov (United States)

    Naqui, SZH; Lovell, SJ; Lovell, ME

    2008-01-01

    INTRODUCTION We noted a report that more significant symptoms may be expressed after second whiplash injuries by a suggested cumulative effect, including degeneration. We wondered if patients were underestimating the severity of their earlier injury. PATIENTS AND METHODS We studied recent medicolegal reports, to assess subjects with a second whiplash injury. They had been asked whether their earlier injury was worse, the same or lesser in severity. RESULTS From the study cohort, 101 patients (87%) felt that they had fully recovered from their first injury and 15 (13%) had not. Seventy-six subjects considered their first injury of lesser severity, 24 worse and 16 the same. Of the 24 that felt the violence of their first accident was worse, only 8 had worse symptoms, and 16 felt their symptoms were mainly the same or less than their symptoms from their second injury. Statistical analysis of the data revealed that the proportion of those claiming a difference who said the previous injury was lesser was 76% (95% CI 66–84%). The observed proportion with a lesser injury was considerably higher than the 50% anticipated. CONCLUSIONS We feel that subjects may underestimate the severity of an earlier injury and associated symptoms. Reasons for this may include secondary gain rather than any proposed cumulative effect. PMID:18201501

  7. [Electronic cigarettes - effects on health. Previous reports].

    Science.gov (United States)

    Napierała, Marta; Kulza, Maksymilian; Wachowiak, Anna; Jabłecka, Katarzyna; Florek, Ewa

    2014-01-01

    Electronic cigarettes (e-cigarettes) have recently become very popular on the tobacco products market. These products are considered to be potentially less harmful compared to traditional tobacco products. However, current reports indicate that producers' statements regarding the composition of the e-liquids are not always sufficient, and consumers often do not have reliable information on the quality of the product they use. This paper contains a review of previous reports on the composition of e-cigarettes and their impact on health. Most of the observed health effects were related to symptoms of the respiratory tract, mouth, throat, neurological complications and sensory organs. Particularly hazardous effects of the e-cigarettes were: pneumonia, congestive heart failure, confusion, convulsions, hypotension, aspiration pneumonia, second-degree facial burns, blindness, chest pain and rapid heartbeat. In the literature there is no information on passive exposure to the aerosols released during e-cigarette smoking. Furthermore, information regarding the long-term use of these products is also unavailable.

  8. Reliability and validity of an internet-based questionnaire measuring lifetime physical activity.

    Science.gov (United States)

    De Vera, Mary A; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek

    2010-11-15

    Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005-2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity.
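    The Spearman coefficients reported in the validity comparison above are simply Pearson correlations computed on ranks. A minimal Python sketch, assuming no tied values (ties would require average, fractional ranks) and with made-up questionnaire scores:

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Assumes no ties; tied values would need average (fractional) ranks."""
    def ranks(v):
        return np.argsort(np.argsort(v))
    return np.corrcoef(ranks(x), ranks(y))[0, 1]

# toy example: two questionnaires ordering the same five subjects identically
lpaq = np.array([12.0, 30.5, 7.2, 45.1, 22.8])
ltpaq = np.array([14.0, 29.0, 9.0, 50.0, 25.0])
rho = spearman(lpaq, ltpaq)
```

    Rank-based correlation is the natural choice here because the two instruments use different scales but should order respondents similarly.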

  9. Two-phase flow measurement based on oblique laser scattering

    Science.gov (United States)

    Vendruscolo, Tiago P.; Fischer, Robert; Martelli, Cícero; Rodrigues, Rômulo L. P.; Morales, Rigoberto E. M.; da Silva, Marco J.

    2015-07-01

    Multiphase flow measurements play a crucial role in monitoring production processes in many industries. To guarantee the safety of processes involving multiphase flows, it is important to detect changes in the flow conditions before they can cause damage, often within fractions of a second. Here we demonstrate how the scattering pattern of a laser beam passing through a two-phase flow at an oblique angle to the flow direction can be used to detect deviations from the desired flow conditions within microseconds. Applying machine-learning techniques to signals obtained from three photo-detectors, we achieve a compact, versatile, low-cost sensor design for safety applications.

  10. Measurement of hepatic steatosis based on magnetic resonance images

    Science.gov (United States)

    Tkaczyk, Adam; Jańczyk, Wojciech; Chełstowska, Sylwia; Socha, Piotr; Mulawka, Jan

    2017-08-01

    The subject of this work is the use of digital image processing to measure hepatic steatosis. Calculating this value manually requires a lot of time and precision from the radiologist. In order to resolve this issue, a C++ application has been created. This paper describes the algorithms that have been used to solve the problem. The next chapter presents the application architecture and introduces the graphical user interface. The last section describes the tests which have been carried out to check the correctness of the results.

  11. A Computer Based Moire Technique To Measure Very Small Displacements

    Science.gov (United States)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

    The accuracy that can be achieved in the measurement of very small displacements by techniques such as moire, holography and speckle is limited by the noise inherent to the optical devices used. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced: an initial system before the load is applied and a final system after the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.

  12. Sensing line effects on PWR-based differential pressure measurements

    International Nuclear Information System (INIS)

    Evans, R.P.; Neff, G.G.

    1982-01-01

    An incorrect configuration of the fluid-filled pressure sensing lines connecting differential pressure transducers to the pressure taps in a pressurized water reactor system can cause errors in the measurement and, during rapid pressure transients, could cause the transducer to fail. Testing was performed in both static and dynamic modes to experimentally determine the effects of sensing lines of various lengths, diameters, and materials. Testing was performed at ambient temperature with absolute line pressures at about 17 MPa, using water as the pressure transmission fluid.

  13. Directed energy deflection laboratory measurements of common space based targets

    Science.gov (United States)

    Brashears, Travis; Lubin, Philip; Hughes, Gary B.; Meinhold, Peter; Batliner, Payton; Motta, Caio; Madajian, Jonathan; Mercer, Whitaker; Knowles, Patrick

    2016-09-01

    We report on laboratory studies of the effectiveness of directed energy planetary defense as a part of the DE-STAR (Directed Energy System for Targeting of Asteroids and exploRation) program. DE-STAR and DE-STARLITE are directed energy "stand-off" and "stand-on" programs, respectively. These systems consist of a modular array of kilowatt-class lasers powered by photovoltaics, and are capable of heating a spot on the surface of an asteroid to the point of vaporization. Mass ejection, as a plume of evaporated material, creates a reactionary thrust capable of diverting the asteroid's orbit. In a series of papers, we have developed a theoretical basis and described numerical simulations for determining the thrust produced by material evaporating from the surface of an asteroid. In the DE-STAR concept, the asteroid itself is used as the deflection "propellant". This study presents results of experiments designed to measure the thrust created by evaporation from a laser directed energy spot. We constructed a vacuum chamber to simulate space conditions, and installed a torsion balance that holds a common space target sample. The sample is illuminated with a fiber-array laser with flux levels up to 60 MW/m², which allows us to simulate a mission-level flux but on a small scale. We use a separate laser as well as a position-sensitive centroid detector to read out the angular motion of the torsion balance and can thus determine the thrust. We compare the measured thrust to the models. Our theoretical models indicate a coupling coefficient well in excess of 100 μN per optical watt, though we assume a more conservative value of 80 μN/W and then degrade this with an optical "encircled energy" efficiency of 0.75 to 60 μN/W in our deflection modeling. Our measurements discussed here yield about 45 μN per absorbed watt as a reasonable lower limit to the thrust per optical watt absorbed. Results vary depending on the material tested and are limited to measurements of 1 axis, so

  14. A comparison between plaque-based and vessel-based measurement for plaque component using volumetric intravascular ultrasound radiofrequency data analysis.

    Science.gov (United States)

    Shin, Eun-Seok; Garcia-Garcia, Hector M; Garg, Scot; Serruys, Patrick W

    2011-04-01

    Although percent plaque components on plaque-based measurement have been used traditionally in previous studies, the impact of vessel-based measurement on percent plaque components has yet to be studied. The purpose of this study was therefore to correlate percent plaque components derived by plaque- and vessel-based measurement using intravascular ultrasound virtual histology (IVUS-VH). The patient cohort comprised 206 patients with de novo coronary artery lesions who were imaged with IVUS-VH. Age ranged from 35 to 88 years, and 124 patients were male. Whole pullback analysis was used to calculate plaque volume, vessel volume, and absolute and percent volumes of fibrous, fibrofatty, necrotic core, and dense calcium. The plaque and vessel volumes were well correlated (r = 0.893), and each percent plaque component on plaque-based measurement was also highly correlated with vessel-based measurement. Therefore, the percent plaque component volume calculated by vessel volume could be used instead of the conventional percent plaque component volume calculated by plaque volume.
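A minimal sketch of the two normalizations compared here; the component and volume numbers are invented for illustration:

```python
# Hypothetical illustration of the two normalizations compared in the study:
# percent component volume relative to plaque volume (conventional) versus
# relative to vessel volume (alternative). Numbers are invented, in mm^3.

def percent_component(component_vol: float, reference_vol: float) -> float:
    return 100.0 * component_vol / reference_vol

necrotic_core, plaque, vessel = 20.0, 100.0, 250.0
print(percent_component(necrotic_core, plaque))  # 20.0  (plaque-based)
print(percent_component(necrotic_core, vessel))  # 8.0   (vessel-based)
```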

  15. Assessing Therapist Competence : Development of a Performance-Based Measure and Its Comparison With a Web-Based Measure

    NARCIS (Netherlands)

    Cooper, Zafra; Doll, Helen; Bailey-Straebler, Suzanne; Bohn, Kristin; de Vries, Dian; Murphy, Rebecca; O'Connor, Marianne E; Fairburn, Christopher G

    2017-01-01

    BACKGROUND: Recent research interest in how best to train therapists to deliver psychological treatments has highlighted the need for rigorous, but scalable, means of measuring therapist competence. There are at least two components involved in assessing therapist competence: the assessment of their

  16. A framework for grouping nanoparticles based on their measurable characteristics.

    Science.gov (United States)

    Sayes, Christie M; Smith, P Alex; Ivanov, Ivan V

    2013-01-01

    There is a need to take a broader look at nanotoxicological studies. Eventually, the field will demand that some generalizations be made. To begin to address this issue, we posed a question: are metal colloids on the nanometer-size scale a homogeneous group? In general, most people can agree that the physicochemical properties of nanomaterials can be linked and related to their induced toxicological responses. The focus of this study was to determine how a set of selected physicochemical properties of five specific metal-based colloidal materials on the nanometer-size scale - silver, copper, nickel, iron, and zinc - could be used as nanodescriptors that facilitate the grouping of these metal-based colloids. The example of the framework pipeline processing provided in this paper shows the utility of specific statistical and pattern recognition techniques in grouping nanoparticles based on experimental data about their physicochemical properties. Interestingly, the results of the analyses suggest that a seemingly homogeneous group of nanoparticles could be separated into sub-groups depending on interdependencies observed in their nanodescriptors. These particles represent an important category of nanomaterials that are currently mass produced. Each has been reputed to induce toxicological and/or cytotoxicological effects. Here, we propose an experimental methodology coupled with mathematical and statistical modeling that can serve as a prototype for a rigorous framework that aids in the ability to group nanomaterials together and to facilitate the subsequent analysis of trends in data based on quantitative modeling of nanoparticle-specific structure-activity relationships. The computational part of the proposed framework is rather general and can be applied to other groups of nanomaterials as well.
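A toy version of the grouping step can be sketched as follows (all descriptor values and the distance threshold are invented for illustration): standardize a few physicochemical descriptors per particle type, then group by pairwise distance.

```python
import numpy as np

# Invented nanodescriptors for the five metal colloids studied:
# columns are size (nm), zeta potential (mV), dissolution rate (a.u.)
names = ["Ag", "Cu", "Ni", "Fe", "Zn"]
D = np.array([[20, -30, 0.8],
              [25, -25, 0.9],
              [60, -10, 0.1],
              [55, -12, 0.2],
              [30, -28, 0.7]], dtype=float)

Z = (D - D.mean(axis=0)) / D.std(axis=0)   # standardized nanodescriptors
# Pairwise Euclidean distances between particle types:
dist = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
# Particles closer than an (arbitrary) threshold fall into the same sub-group:
print((dist < 1.5).astype(int))
```

With these invented values the seemingly homogeneous set splits into sub-groups, mirroring the kind of separation the paper reports from real descriptor data.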

  17. Refractive Index Measurement of Liquids Based on Microstructured Optical Fibers

    Directory of Open Access Journals (Sweden)

    Susana Silva

    2014-12-01

    This review is focused on microstructured optical fiber sensors developed in recent years for liquid RI sensing. The review is divided into three parts: the first section introduces a general view of the most relevant refractometric sensors that have been reported over the last thirty years. Section 2 discusses several microstructured optical fiber designs, namely, suspended-core fiber, photonic crystal fiber, large-core air-clad photonic crystal fiber, and others. This part is also divided into two main groups: the interferometric-based and resonance-based configurations. The sensing methods rely either on full/selective filling of the microstructured fiber air holes with a liquid analyte or by simply immersing the sensing fiber into the liquid analyte. The sensitivities and resolutions are tabled at the end of this section followed by a brief discussion of the obtained results. The last section concludes with some remarks about the microstructured fiber-based configurations developed for RI sensing and their potential for future applications.

  18. Chamber and Diffusive Based Carbon Flux Measurements in an Alaskan Arctic Ecosystem

    Science.gov (United States)

    Wilkman, E.; Oechel, W. C.; Zona, D.

    2013-12-01

    Comprising an area of more than 7 × 10⁶ km² and containing over 11% of the world's organic matter pool, Arctic terrestrial ecosystems are vitally important components of the global carbon cycle, yet their structure and functioning are sensitive to subtle changes in climate, and many of these functional changes can have large effects on the atmosphere and future climate regimes (Callaghan & Maxwell 1995, Chapin et al. 2002). Historically these northern ecosystems have acted as strong C sinks, sequestering large stores of atmospheric C due to photosynthetic dominance in the short summer season and low rates of decomposition throughout the rest of the year as a consequence of cold, nutrient-poor, and generally water-logged conditions. Currently, much of this previously stored carbon is at risk of loss to the atmosphere due to accelerated soil organic matter decomposition in warmer future climates (Grogan & Chapin 2000). Although there have been numerous studies on Arctic carbon dynamics, much of the previous soil flux work has been done at limited time intervals, due to both the harshness of the environment and labor and time constraints. Therefore, in June of 2013 an Ultraportable Greenhouse Gas Analyzer (UGGA - Los Gatos Research Inc.) was deployed in concert with the LI-8100A Automated Soil Flux System (LI-COR Biosciences) in Barrow, AK to gather high temporal frequency soil CO2 and CH4 fluxes from a wet sedge tundra ecosystem. An additional UGGA in combination with diffusive probes, installed in the same location, provides year-round soil and snow CO2 and CH4 concentrations. When used in combination with the recently purchased AlphaGUARD portable radon monitor (Saphymo GmbH), continuous soil and snow diffusivities and fluxes of CO2 and CH4 can be calculated (Lehmann & Lehmann 2000). Of particular note, measuring soil gas concentration over a diffusive gradient in this way allows one to separate both net production and
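For context, a closed-chamber flux of the kind the LI-8100A reports can be sketched from first principles, F = (dC/dt)·V/A; all numbers below are hypothetical, not from the study:

```python
# Closed-chamber flux sketch (all numbers invented): the flux is estimated
# from the rate of headspace concentration rise during chamber closure.

def chamber_flux(dc_dt_umol_m3_s: float, volume_m3: float, area_m2: float) -> float:
    """Flux in umol m^-2 s^-1 from a linear concentration ramp in a closed chamber."""
    return dc_dt_umol_m3_s * volume_m3 / area_m2

# Hypothetical CO2 example: 0.5 umol m^-3 s^-1 ramp, 10 L chamber, 0.03 m^2 collar
print(chamber_flux(0.5, 0.01, 0.03))
```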

  19. Obstructive pulmonary disease in patients with previous tuberculosis ...

    African Journals Online (AJOL)

    Obstructive pulmonary disease in patients with previous tuberculosis: Pathophysiology of a community-based cohort. B.W. Allwood, R Gillespie, M Galperin-Aizenberg, M Bateman, H Olckers, L Taborda-Barata, G.L. Calligaro, Q Said-Hartley, R van Zyl-Smit, C.B. Cooper, E van Rikxoort, J Goldin, N Beyers, E.D. Bateman ...

  20. Generalized flow and determinism in measurement-based quantum computation

    Energy Technology Data Exchange (ETDEWEB)

    Browne, Daniel E [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PU (United Kingdom); Kashefi, Elham [Computing Laboratory and Christ Church College, University of Oxford, Parks Road, Oxford OX1 3QD (United Kingdom); Mhalla, Mehdi [Laboratoire d' Informatique de Grenoble, CNRS - Centre national de la recherche scientifique, Universite de Grenoble (France); Perdrix, Simon [Preuves, Programmes et Systemes (PPS), Universite Paris Diderot, Paris (France)

    2007-08-15

    We extend the notion of quantum information flow defined by Danos and Kashefi (2006 Phys. Rev. A 74 052310) for the one-way model (Raussendorf and Briegel 2001 Phys. Rev. Lett. 86 910) and present a necessary and sufficient condition for stepwise uniformly deterministic computation in this model. The generalized flow also applies in the extended model with measurements in the (X, Y), (X, Z) and (Y, Z) planes. We apply both the measurement calculus and the stabiliser formalism to derive our main theorem, which for the first time gives a full characterization of stepwise uniformly deterministic computation in the one-way model. We present several examples to show how our result improves over the traditional notion of flow, such as geometries (entanglement graphs with inputs and outputs) with no flow but with generalized flow, and we discuss how they lead to an optimal implementation of the unitaries. More importantly, one can also obtain a better quantum computation depth with the generalized flow than with flow. We believe our characterization result is particularly valuable for the study of algorithms and complexity in the one-way model.

  1. Automatic anatomical structures location based on dynamic shape measurement

    Science.gov (United States)

    Witkowski, Marcin; Rapp, Walter; Sitnik, Robert; Kujawinska, Malgorzata; Vander Sloten, Jos; Haex, Bart; Bogaert, Nico; Heitmann, Kjell

    2005-09-01

    New image processing methods and active photonics apparatus have made possible the development of relatively inexpensive optical systems for complex shape and object measurements. We present a dynamic 360° scanning method for the analysis of human lower body biomechanics, with an emphasis on the knee joint. The anatomical structure of high medical interest that can be scanned and analyzed is the patella. Tracking patella position and orientation under dynamic conditions may allow detection of pathological patella movements and help in knee joint disease diagnosis. The processed data are obtained from a dynamic laser triangulation surface measurement system able to capture slow to normal movements with a scan frequency between 15 and 30 Hz. These frequency rates are sufficient to capture controlled movements used, e.g., for medical examination purposes. The purpose of the work presented is to develop surface analysis methods that may support the diagnosis of motoric abilities of the lower limbs. The paper presents the algorithms used to process the acquired lower limb surface data in order to find the position and orientation of the patella. The algorithms implemented include input data preparation, curvature description methods, knee region discrimination, and calculation of the assumed patella position/orientation. Additionally, a method of 4D (3D + time) medical data visualization is proposed, and some exemplary results are presented.

  2. New approach to radiation monitoring: citizen based radiation measurement

    International Nuclear Information System (INIS)

    Kuca, P.; Helebrant, J.

    2016-01-01

    Both the Fukushima Dai-ichi NPP accident in Japan in 2011 and the Chernobyl NPP accident in the USSR in 1986 have shown the necessity of finding ways to improve public confidence in official authorities. This is especially important in the case of severe accidents with significant consequences for large inhabited areas around the damaged NPP. A lack of public confidence in officials was caused mostly by rather poor communication between official authorities and the public, as well as by restricted public access to information. This may have extremely negative impacts on the public's understanding of the actual situation and its possible risks, on public acceptance of necessary protective measures, and on participation of the public in remediation of the affected areas. One possible way to improve the situation is the implementation of citizen radiation monitoring on a voluntary basis. If the official results are shown to be compatible with the public's self-measured ones, the public is likely to have more confidence in them. In the Czech Republic the implementation of such an approach is being tested in the framework of security research funded by the Czech Ministry of the Interior - the research project RAMESIS conducted by SURO. (authors)

  3. Generalized flow and determinism in measurement-based quantum computation

    International Nuclear Information System (INIS)

    Browne, Daniel E; Kashefi, Elham; Mhalla, Mehdi; Perdrix, Simon

    2007-01-01

    We extend the notion of quantum information flow defined by Danos and Kashefi (2006 Phys. Rev. A 74 052310) for the one-way model (Raussendorf and Briegel 2001 Phys. Rev. Lett. 86 910) and present a necessary and sufficient condition for stepwise uniformly deterministic computation in this model. The generalized flow also applies in the extended model with measurements in the (X, Y), (X, Z) and (Y, Z) planes. We apply both the measurement calculus and the stabiliser formalism to derive our main theorem, which for the first time gives a full characterization of stepwise uniformly deterministic computation in the one-way model. We present several examples to show how our result improves over the traditional notion of flow, such as geometries (entanglement graphs with inputs and outputs) with no flow but with generalized flow, and we discuss how they lead to an optimal implementation of the unitaries. More importantly, one can also obtain a better quantum computation depth with the generalized flow than with flow. We believe our characterization result is particularly valuable for the study of algorithms and complexity in the one-way model.

  4. An AFM-based pit-measuring method for indirect measurements of cell-surface membrane vesicles

    International Nuclear Information System (INIS)

    Zhang, Xiaojun; Chen, Yuan; Chen, Yong

    2014-01-01

    Highlights: • Air drying induced the transformation of cell-surface membrane vesicles into pits. • An AFM-based pit-measuring method was developed to measure cell-surface vesicles. • Our method detected at least two populations of cell-surface membrane vesicles. - Abstract: Circulating membrane vesicles, which are shed from many cell types, have multiple functions and have been correlated with many diseases. Although circulating membrane vesicles have been extensively characterized, the status of cell-surface membrane vesicles prior to their release is less understood due to the lack of effective measurement methods. Recently, as a powerful, micro- or nano-scale imaging tool, atomic force microscopy (AFM) has been applied in measuring circulating membrane vesicles. However, it seems very difficult for AFM to directly image/identify and measure cell-bound membrane vesicles due to the similarity of surface morphology between membrane vesicles and cell surfaces. Therefore, until now no AFM studies on cell-surface membrane vesicles have been reported. In this study, we found that air drying can induce the transformation of most cell-surface membrane vesicles into pits that are more readily detectable by AFM. Based on this, we developed an AFM-based pit-measuring method and, for the first time, used AFM to indirectly measure cell-surface membrane vesicles on cultured endothelial cells. Using this approach, we observed and quantitatively measured at least two populations of cell-surface membrane vesicles, a nanoscale population (<500 nm in diameter peaking at ∼250 nm) and a microscale population (from 500 nm to ∼2 μm peaking at ∼0.8 μm), whereas confocal microscopy only detected the microscale population. The AFM-based pit-measuring method is potentially useful for studying cell-surface membrane vesicles and for investigating the mechanisms of membrane vesicle formation/release

  5. Development of a novel diamond based detector for machine induced background and luminosity measurements

    Energy Technology Data Exchange (ETDEWEB)

    Hempel, Maria

    2017-07-15

    The Large Hadron Collider (LHC) is the largest particle accelerator and storage ring in the world, used to investigate fundamentals of particle physics and to develop at the same time the technology of accelerators and detectors. Four main experiments, located around the LHC ring, provide insight into the nature of particles and search for answers to as yet unexplained phenomena in the universe. These four experiments are ATLAS (A Toroidal LHC Apparatus), ALICE (A Large Ion Collider Experiment), CMS (Compact Muon Solenoid) and LHCb (LHC beauty). Two proton or heavy ion beams circulate in the LHC and are brought into collision in the four experiments. The physics potential of each experiment is determined by the luminosity, which is a ratio of the number of the events during a certain time period to the cross section of a physics process. A measurement of the luminosity is therefore essential to determine the cross section of interesting physics processes. In addition, safe and high-quality data-taking requires stable beam conditions with almost no beam losses. So-called beam loss monitors are installed in the LHC rings to monitor beam losses around the LHC. Each experiment has in addition its own detectors to measure beam losses, hereafter called machine induced background. One such detector is installed in CMS, the Fast Beam Condition Monitor (BCM1F). Based on diamond sensors it was designed and built to measure both, the luminosity and the machine induced background. BCM1F ran smoothly during the first LHC running period from 2009-2012 and delivered valuable beam loss and luminosity information to the control rooms of CMS and LHC. At the end of 2012 the LHC was shut down for an upgrade to improve the performance by increasing the proton energy from 4 TeV to 7 TeV and decreasing the proton bunch spacing from 50 ns to 25 ns. Due to the success of BCM1F an upgrade of its sensors and readout components was planned in order to fulfil the new requirements. 
The upgrade

  6. Development of a novel diamond based detector for machine induced background and luminosity measurements

    International Nuclear Information System (INIS)

    Hempel, Maria

    2017-07-01

    The Large Hadron Collider (LHC) is the largest particle accelerator and storage ring in the world, used to investigate fundamentals of particle physics and to develop at the same time the technology of accelerators and detectors. Four main experiments, located around the LHC ring, provide insight into the nature of particles and search for answers to as yet unexplained phenomena in the universe. These four experiments are ATLAS (A Toroidal LHC Apparatus), ALICE (A Large Ion Collider Experiment), CMS (Compact Muon Solenoid) and LHCb (LHC beauty). Two proton or heavy ion beams circulate in the LHC and are brought into collision in the four experiments. The physics potential of each experiment is determined by the luminosity, which is a ratio of the number of the events during a certain time period to the cross section of a physics process. A measurement of the luminosity is therefore essential to determine the cross section of interesting physics processes. In addition, safe and high-quality data-taking requires stable beam conditions with almost no beam losses. So-called beam loss monitors are installed in the LHC rings to monitor beam losses around the LHC. Each experiment has in addition its own detectors to measure beam losses, hereafter called machine induced background. One such detector is installed in CMS, the Fast Beam Condition Monitor (BCM1F). Based on diamond sensors it was designed and built to measure both, the luminosity and the machine induced background. BCM1F ran smoothly during the first LHC running period from 2009-2012 and delivered valuable beam loss and luminosity information to the control rooms of CMS and LHC. At the end of 2012 the LHC was shut down for an upgrade to improve the performance by increasing the proton energy from 4 TeV to 7 TeV and decreasing the proton bunch spacing from 50 ns to 25 ns. Due to the success of BCM1F an upgrade of its sensors and readout components was planned in order to fulfil the new requirements. 
The upgrade

  7. Simple, fast, and low-cost camera-based water content measurement with colorimetric fluorescent indicator

    Science.gov (United States)

    Song, Seok-Jeong; Kim, Tae-Il; Kim, Youngmi; Nam, Hyoungsik

    2018-05-01

    Recently, a simple, sensitive, and low-cost fluorescent indicator has been proposed to determine water contents in organic solvents, drugs, and foodstuffs. The change of water content leads to a change of the indicator's fluorescence color under ultra-violet (UV) light. Whereas the water content values could be estimated from the spectrum obtained by a bulky and expensive spectrometer in the previous research, this paper demonstrates a simple and low-cost camera-based water content measurement scheme with the same fluorescent water indicator. Water content is calculated over the range of 0-30% by quadratic polynomial regression models with color information extracted from the captured images of samples. In particular, several color spaces such as RGB, xyY, L∗a∗b∗, u′v′, HSV, and YCbCr have been investigated to establish the optimal color information features over both linear and nonlinear RGB data given by a camera before and after gamma correction. In the end, a 2nd-order polynomial regression model along with HSV in a linear domain achieves the minimum mean square error of 1.06% for a 3-fold cross-validation method. Additionally, the resultant water content estimation model is implemented and evaluated in an off-the-shelf Android-based smartphone.
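The regression step can be sketched under stated assumptions (synthetic calibration data, and a single hue-like feature standing in for the HSV features used in the paper):

```python
import numpy as np

# Synthetic calibration data (invented for illustration): a hue-like color
# feature extracted from sample images versus known water content in %.
hue = np.array([0.10, 0.15, 0.20, 0.25, 0.30, 0.35])
water = np.array([0.0, 5.5, 11.8, 17.9, 24.1, 30.0])

# Fit the 2nd-order polynomial regression model described in the abstract:
coeffs = np.polyfit(hue, water, deg=2)
predict = np.poly1d(coeffs)

# Estimate water content for a new sample's extracted hue value:
print(float(predict(0.22)))
```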

  8. Measurement of the first Townsend ionization coefficient in a methane-based tissue-equivalent gas

    Energy Technology Data Exchange (ETDEWEB)

    Petri, A.R. [Instituto de Pesquisas Energéticas e Nucleares, Cidade Universitária, 05508-000 São Paulo (Brazil); Gonçalves, J.A.C. [Instituto de Pesquisas Energéticas e Nucleares, Cidade Universitária, 05508-000 São Paulo (Brazil); Departamento de Física, Pontifícia Universidade Católica de São Paulo, 01303-050 São Paulo (Brazil); Mangiarotti, A. [Instituto de Física - Universidade de São Paulo, Cidade Universitária, 05508-080 São Paulo (Brazil); Botelho, S. [Instituto de Pesquisas Energéticas e Nucleares, Cidade Universitária, 05508-000 São Paulo (Brazil); Bueno, C.C., E-mail: ccbueno@ipen.br [Instituto de Pesquisas Energéticas e Nucleares, Cidade Universitária, 05508-000 São Paulo (Brazil)

    2017-03-21

    Tissue-equivalent gases (TEGs), often made of a hydrocarbon, nitrogen, and carbon dioxide, have been employed in microdosimetry for decades. However, data on the first Townsend ionization coefficient (α) in such mixtures are scarce, regardless of the chosen hydrocarbon. In this context, measurements of α in a methane-based tissue-equivalent gas (CH₄ – 64.4%, CO₂ – 32.4%, and N₂ – 3.2%) were performed in a uniform field configuration for density-normalized electric fields (E/N) up to 290 Td. The setup adopted in our previous works was improved for operating at low pressures. The modifications introduced in the apparatus and the experimental technique were validated by comparing our results for the first Townsend ionization coefficient in nitrogen, carbon dioxide, and methane with those from the literature and Magboltz simulations. The behavior of α in the methane-based TEG was consistent with that observed for pure methane. All the experimental results are included in tabular form.
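The measurement principle can be sketched as follows, assuming idealized uniform-field avalanche growth, I(d) = I₀·exp(αd), and invented example currents:

```python
import math

# In a uniform field the avalanche current grows as I(d) = I0 * exp(alpha * d),
# so alpha is the slope of ln(I) versus gap distance d. Currents are invented.

def townsend_alpha(d1: float, i1: float, d2: float, i2: float) -> float:
    """First Townsend coefficient (1/cm) from currents at two gap distances (cm)."""
    return (math.log(i2) - math.log(i1)) / (d2 - d1)

alpha = townsend_alpha(0.5, 1.0e-9, 1.0, 5.0e-9)
print(round(alpha, 3))  # 3.219 ionizations per cm for these example currents
```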

  9. Electrophoresis- and FRET-Based Measures of Serpin Polymerization.

    Science.gov (United States)

    Faull, Sarah V; Brown, Anwen E; Haq, Imran; Irving, James A

    2017-01-01

    Many serpinopathies, including alpha-1 antitrypsin (A1AT) deficiency, are associated with the formation of unbranched polymer chains of mutant serpins. In vivo, this deficiency is the result of mutations that cause kinetic or thermodynamic destabilization of the molecule. However, polymerization can also be induced in vitro from mutant or wild-type serpins under destabilizing conditions. The characteristics of the resulting polymers are dependent upon induction conditions. Due to their relationship to disease, serpin polymers, mainly those formed from A1AT, have been widely studied. Here, we describe Förster resonance energy transfer (FRET) and gel-based approaches for their characterization.
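The distance dependence that makes FRET a sensitive reporter of polymer formation can be illustrated directly; R₀ and the distances below are generic illustrative values, not figures from the protocol:

```python
# FRET efficiency falls off with donor-acceptor distance r as
# E = 1 / (1 + (r/R0)^6), where R0 is the Foerster radius. The steepness of
# this dependence is why FRET reports closely apposed serpin protomers.

def fret_efficiency(r_nm: float, r0_nm: float = 5.0) -> float:
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

print(round(fret_efficiency(5.0), 2))   # 0.5 at the Foerster radius
print(round(fret_efficiency(10.0), 3))  # 0.015 -- efficiency collapses at 2*R0
```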

  10. Modal response of interior mass based upon external measurements

    International Nuclear Information System (INIS)

    Chow, C T; Eli, M; Jorgensen, B R; Woehrle, T.

    1999-01-01

    Modal response testing has been used to predict the motion of interior masses of a system in which only external instrumentation is allowed. Testing of this form may occasionally be necessary in validation of a computer model, but also has potential as a tool for validating individual assemblies in a QA process. Examination of the external frequency response and mode shapes can offer insight into interior response. The interpretation of these results is improved through parallel analytical solutions. A simple, three-mass model has been examined experimentally and analytically to demonstrate modal theory. These results show the limitations of the external measurement in predicting internal response due to transmissibility. A procedure for utilizing external testing is described. The question posed through this research is whether or not modal correlation analysis can be adapted for use in systems for which instrumentation of critical components is missing
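A three-mass model like the one examined can be analyzed in a few lines (masses and stiffnesses invented for illustration): the natural frequencies follow from the eigenvalues of M⁻¹K.

```python
import numpy as np

# Fixed-fixed chain of three equal masses coupled by four equal springs
# (values are hypothetical, chosen only to demonstrate the modal analysis).
m = 1.0      # kg
k = 1000.0   # N/m
M = np.diag([m, m, m])
K = np.array([[ 2*k,  -k,  0.0],
              [ -k,  2*k,  -k ],
              [ 0.0,  -k,  2*k]])

# Natural frequencies (Hz) from the eigenvalues of M^-1 K:
eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
freqs_hz = np.sort(np.sqrt(eigvals.real)) / (2 * np.pi)
print(np.round(freqs_hz, 2))  # three mode frequencies, lowest first
```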

  11. Long distance measurement with a femtosecond laser based frequency comb

    Science.gov (United States)

    Bhattacharya, N.; Cui, M.; Zeitouny, M. G.; Urbach, H. P.; van den Berg, S. A.

    2017-11-01

    Recent advances in the field of ultra-short pulse lasers have led to the development of reliable sources of carrier-envelope-phase-stabilized femtosecond pulses. The pulse train generated by such a source has a frequency spectrum that consists of discrete, regularly spaced lines known as a frequency comb. In this case both the repetition frequency and the carrier-envelope-offset frequency are referenced to a frequency standard, like an atomic clock. As a result the accuracy of the frequency standard is transferred to the optical domain, with the frequency comb as transfer oscillator. These unique properties allow the frequency comb to be applied as a versatile tool, not only for time and frequency metrology, but also in fundamental physics, high-precision spectroscopy, and laser noise characterization. The pulse-to-pulse phase relationship of the light emitted by the frequency comb has opened up new directions for long range highly accurate distance measurement.
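The comb relation underlying this transfer of accuracy is f_n = f_ceo + n·f_rep; the values below are typical orders of magnitude, not values from the paper:

```python
# Each comb line sits at f_n = f_ceo + n * f_rep, so locking the repetition
# rate and the carrier-envelope-offset frequency to an atomic clock pins
# every optical line to the clock's accuracy. Example values only.
f_rep = 250e6  # repetition rate, Hz
f_ceo = 20e6   # carrier-envelope-offset frequency, Hz

def comb_line(n: int) -> float:
    return f_ceo + n * f_rep

print(comb_line(1_000_000))  # an optical frequency near 250 THz
```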

  12. Microscopic oxygen imaging based on fluorescein bleaching efficiency measurements

    DEFF Research Database (Denmark)

    Beutler, Martin; Heisterkamp, Ines M.; Piltz, Bastian

    2014-01-01

    Photobleaching of the fluorophore fluorescein in an aqueous solution is dependent on the oxygen concentration. Therefore, the time-dependent bleaching behavior can be used to measure dissolved oxygen concentrations. The method can be combined with epi-fluorescence microscopy. The molecular states of the fluorophore can be expressed by a three-state energy model. This leads to a set of differential equations which describe the photobleaching behavior of fluorescein. The numerical solution of these equations shows that in a conventional wide-field fluorescence microscope, the fluorescence … by a charge-coupled-device (CCD) camera mounted on a fluorescence microscope allowed a pixelwise estimation of the ratio function in a microscopic image. Use of a microsensor and oxygen-consuming bacteria in a sample chamber enabled the calibration of the system for quantification of absolute oxygen …

  13. Synchrotron radiation-based Mössbauer spectra of ¹⁷⁴Yb measured with internal conversion electrons

    Energy Technology Data Exchange (ETDEWEB)

    Masuda, Ryo, E-mail: masudar@rri.kyoto-u.ac.jp; Kobayashi, Yasuhiro; Kitao, Shinji; Kurokuzu, Masayuki [Research Reactor Institute, Kyoto University, Kumatori-cho, Sennan-gun, Osaka 590-0494 (Japan); Saito, Makina [Beamline Spectroscopy/Scattering Group, Elettra-Sincrotrone Trieste, S. S. 14 Km 163.5, I-34149 Trieste (Italy); Yoda, Yoshitaka [Research and Utilization Division, Japan Synchrotron Radiation Research Institute, Kouto, Sayo-cho, Sayo-gun, Hyogo 679-5198 (Japan); Mitsui, Takaya [Condensed Matter Science Division, Japan Atomic Energy Agency, Kouto, Sayo-cho, Sayo-gun, Hyogo 679-5148 (Japan); Iga, Fumitoshi [College of Science, Ibaraki University, Mito, Ibaraki, 310-8512 (Japan); Seto, Makoto [Research Reactor Institute, Kyoto University, Kumatori-cho, Sennan-gun, Osaka 590-0494 (Japan); Condensed Matter Science Division, Japan Atomic Energy Agency, Kouto, Sayo-cho, Sayo-gun, Hyogo 679-5148 (Japan)

    2014-02-24

    A detection system for synchrotron-radiation (SR)-based Mössbauer spectroscopy was developed to enhance the nuclear resonant scattering counting rate and thus increase the number of available nuclides. In the system, a windowless avalanche photodiode (APD) detector was combined with a vacuum cryostat to detect the internal conversion (IC) electrons and fluorescent X-rays accompanying nuclear de-excitation. As a feasibility study, the SR-based Mössbauer spectrum using the 76.5 keV level of ¹⁷⁴Yb was observed without ¹⁷⁴Yb enrichment of the samples. The counting rate was five times higher than that of our previous system, and the spectrum was obtained within 10 h. This result shows that nuclear resonance events can be more efficiently detected by counting IC electrons for nuclides with high IC coefficients. Furthermore, the windowless detection system enables us to place the sample closer to the APD elements and is advantageous for nuclear resonant inelastic scattering measurements. Therefore, this detection system can not only increase the number of nuclides accessible in SR-based Mössbauer spectroscopy but also allows nuclear resonant inelastic scattering measurements of small single crystals or enzymes with dilute probe nuclides that are difficult to measure with the previous detection system.

  14. Multiple kernel boosting framework based on information measure for classification

    International Nuclear Information System (INIS)

    Qi, Chengming; Wang, Yuping; Tian, Wenjie; Wang, Qun

    2016-01-01

    The performance of kernel-based methods, such as the support vector machine (SVM), is greatly affected by the choice of kernel function. Multiple kernel learning (MKL) is a promising family of machine learning algorithms that has attracted much attention in recent years. MKL combines multiple sub-kernels to seek better results than single kernel learning. In order to improve the efficiency of SVM and MKL, in this paper the Kullback–Leibler kernel function is derived to develop SVM. The proposed method employs an improved ensemble learning framework, named KLMKB, which applies AdaBoost to learning multiple kernel-based classifiers. For the hyperspectral remote sensing image classification experiment, we employ features selected through the Optimum Index Factor (OIF) to classify the satellite image. We extensively examine the performance of our approach in comparison to some relevant and state-of-the-art algorithms on a number of benchmark classification data sets and a hyperspectral remote sensing image data set. Experimental results show that our method has stable behavior and noticeable accuracy for different data sets.
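The kernel-combination step common to MKL methods can be sketched as a weighted sum of sub-kernel matrices (weights and data are invented; the paper's KLMKB framework additionally boosts kernel-based classifiers with AdaBoost, which is not shown here):

```python
import numpy as np

# MKL forms a combined kernel as K = sum_i w_i * K_i over sub-kernels K_i.
# Data and weights below are invented for illustration.

def combined_kernel(kernels, weights):
    return sum(w * k for w, k in zip(weights, kernels))

X = np.array([[0.0], [1.0], [2.0]])          # three 1-D samples
linear = X @ X.T                             # linear sub-kernel
rbf = np.exp(-0.5 * (X - X.T) ** 2)          # RBF sub-kernel, gamma = 0.5
K = combined_kernel([linear, rbf], [0.7, 0.3])
print(K.shape)
```

A convex combination of positive semidefinite sub-kernels is itself a valid kernel, which is what lets the combined matrix be fed to any kernel classifier.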

  15. Integration of a silicon-based microprobe into a gear measuring instrument for accurate measurement of micro gears

    International Nuclear Information System (INIS)

    Ferreira, N; Krah, T; Jeong, D C; Kniel, K; Härtig, F; Metz, D; Dietzel, A; Büttgenbach, S

    2014-01-01

    The integration of silicon micro probing systems into conventional gear measuring instruments (GMIs) allows fully automated measurements of external involute micro spur gears with normal modules smaller than 1 mm. This system, based on a silicon microprobe, has been developed and manufactured at the Institute for Microtechnology of the Technische Universität Braunschweig. The microprobe consists of a silicon sensor element and a stylus oriented perpendicularly to the sensor. The sensor is fabricated by means of silicon bulk micromachining. Its small dimensions of 6.5 mm × 6.5 mm allow compact mounting in a cartridge to facilitate the integration into a GMI. In this way, tactile measurements of 3D microstructures can be realized. To enable three-dimensional measurements with marginal forces, four Wheatstone bridges are built with diffused piezoresistors on the membrane of the sensor. On the reverse side of the membrane, the stylus is glued perpendicularly to the sensor on a boss to transmit the probing forces to the sensor element during measurements. Sphere diameters smaller than 300 µm, shaft lengths of 5 mm, and measurement forces from 10 µN enable the measurement of 3D microstructures. Such micro probing systems can be integrated into universal coordinate measuring machines and also into GMIs to extend their field of application. Practical measurements were carried out at the Physikalisch-Technische Bundesanstalt by qualifying the microprobes on a calibrated reference sphere to determine their sensitivity and their physical dimensions. Following that, profile and helix measurements were carried out on a gear measurement standard with a module of 1 mm. The comparison shows good agreement between the measured values and the calibrated values. This result is a promising basis for the realization of smaller probe diameters for the tactile measurement of micro gears with smaller modules. (paper)

  16. Adaptive Voltage Stability Protection Based on Load Identification Using Phasor Measurement Units

    DEFF Research Database (Denmark)

    Liu, Leo; Bak, Claus Leth; Chen, Zhe

    2011-01-01

    collapse. In this paper, an online load identification method using a measurement-based approach built on Phasor Measurement Units (PMUs) is proposed to evaluate the proximity to voltage instability in order to prevent voltage collapse. In the scenarios of disturbances, the proximity to voltage collapse...... scheme based on PMUs is promising, as it prevented the voltage collapse and minimized the load shedding area....

  17. A survey tool for measuring evidence-based decision making capacity in public health agencies

    Directory of Open Access Journals (Sweden)

    Jacobs Julie A

    2012-03-01

    Full Text Available Abstract Background While increasing attention is placed on using evidence-based decision making (EBDM) to improve public health, there is little research assessing the current EBDM capacity of the public health workforce. Public health agencies serve a wide range of populations with varying levels of resources. Our survey tool allows an individual agency to collect data that reflect its unique workforce. Methods Health department leaders and academic researchers collaboratively developed and conducted cross-sectional surveys in Kansas and Mississippi (USA) to assess EBDM capacity. Surveys were delivered to state- and local-level practitioners and community partners working in chronic disease control and prevention. The core component of the surveys was adopted from a previously tested instrument and measured gaps (importance versus availability) in competencies for EBDM in chronic disease. Other survey questions addressed expectations and incentives for using EBDM, self-efficacy in three EBDM skills, and estimates of EBDM within the agency. Results In both states, participants identified communication with policymakers, use of economic evaluation, and translation of research to practice as top competency gaps. Self-efficacy in developing evidence-based chronic disease control programs was lower than in finding or using data. Public health practitioners estimated that approximately two-thirds of programs in their agency were evidence-based. Mississippi participants indicated that health department leaders' expectations for the use of EBDM were approximately twice those of co-workers, and that the use of EBDM could be increased with training and leadership prioritization. Conclusions The assessment of EBDM capacity in Kansas and Mississippi built upon previous nationwide findings to identify top gaps in core competencies for EBDM in chronic disease and to estimate the percentage of programs in U.S. health departments that are evidence-based

  18. Ranking the Online Documents Based on Relative Credibility Measures

    Directory of Open Access Journals (Sweden)

    Ahmad Dahlan

    2013-09-01

    Full Text Available Information searching is the most popular activity on the Internet. Usually the search engine provides results ranked by relevance. However, for purposes that concern information credibility, particularly citing information for scientific work, another approach to ranking the search engine results is required. This paper presents a study on developing a new ranking method based on the credibility of information. The method is built upon two well-known algorithms, PageRank and Citation Analysis. The experiment used the Spearman rank correlation coefficient to compare the proposed rank (generated by the method) with the standard rank (generated manually by a group of experts); the average coefficient satisfied 0 < rS < critical value. This means the correlation exists but is not statistically significant. Hence the proposed rank does not yet satisfy the standard, but the performance could be improved.
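
The evaluation step described — comparing the credibility-based ranking against the expert ranking with Spearman's coefficient — can be sketched with the tie-free formula rs = 1 − 6·Σd² / (n(n² − 1)):

```python
def spearman_rho(rank_a, rank_b):
    """Spearman rank correlation for two rankings of the same documents (no ties).

    rank_a, rank_b: lists giving each document's rank in the two orderings.
    """
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```

Identical rankings give rs = 1, exactly reversed rankings give rs = −1; the study's observation 0 < rS < critical value corresponds to a weak positive agreement.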

  20. Transformer Temperature Measurement Using Optical Fiber Based Microbend Sensor

    Directory of Open Access Journals (Sweden)

    Deepika YADAV

    2007-10-01

    Full Text Available Breakdown of transformers proves very expensive and inconvenient because their replacement takes a long time, and during a breakdown the industry also incurs heavy losses from stoppage of the production line. A system for monitoring the temperature of transformers is therefore required. Existing sensors cannot be used for this purpose because they are sensitive to electrical signals and can cause sparking, which can trigger a fire given the oil in the transformer's cooling coils. Optical fibers are electrically inert, so an optical system is ideal for this application. This manuscript reports the results of investigations carried out by simulating a configuration of an optical fiber temperature sensor for transformers based on microbending, using Matlab as the simulation tool to evaluate the sensor's effectiveness. The results are presented as graphs of intensity modulation vs. temperature.

  1. Natural texture retrieval based on perceptual similarity measurement

    Science.gov (United States)

    Gao, Ying; Dong, Junyu; Lou, Jianwen; Qi, Lin; Liu, Jun

    2018-04-01

    A typical texture retrieval system performs feature comparison and might not be able to make human-like judgments of image similarity. Meanwhile, it is commonly known that perceptual texture similarity is difficult to describe with traditional image features. In this paper, we propose a new texture retrieval scheme based on perceptual texture similarity. The key idea of the proposed scheme is that perceptual similarity is predicted by learning a non-linear mapping from the image feature space to the perceptual texture space using Random Forest. We test the method on a natural texture dataset and apply it to a new wallpaper dataset. Experimental results demonstrate that the proposed texture retrieval scheme with perceptual similarity improves retrieval performance over traditional image features.
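
The mapping the paper describes — Random Forest regression from image features to perceptual similarity — can be illustrated with a minimal pure-Python forest of depth-1 trees (stumps) with bootstrap aggregation. This is a sketch only: a real system would use a library implementation with deeper trees, and the training data (feature vectors paired with human similarity scores) is assumed.

```python
import random
import statistics

def fit_stump(X, y):
    """Best single-feature threshold split by squared error (a depth-1 tree)."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({x[j] for x in X}):
            left = [yi for x, yi in zip(X, y) if x[j] <= t]
            right = [yi for x, yi in zip(X, y) if x[j] > t]
            if not left or not right:
                continue
            ml, mr = statistics.fmean(left), statistics.fmean(right)
            err = sum((yi - ml) ** 2 for yi in left) + sum((yi - mr) ** 2 for yi in right)
            if best is None or err < best[0]:
                best = (err, j, t, ml, mr)
    if best is None:                      # degenerate sample: fall back to the mean
        m = statistics.fmean(y)
        return lambda x: m
    _, j, t, ml, mr = best
    return lambda x: ml if x[j] <= t else mr

def fit_forest(X, y, n_trees=30, seed=0):
    """Random forest of stumps: bagging plus prediction averaging."""
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in X]
        trees.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return lambda x: statistics.fmean(t(x) for t in trees)
```

For retrieval, one would featurize each (query, candidate) texture pair — e.g. as absolute feature differences — predict a perceptual score with the forest, and rank the gallery by that score instead of by raw feature distance.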

  2. Glucose Monitoring System Based on Osmotic Pressure Measurements

    Directory of Open Access Journals (Sweden)

    Alexandra LEAL

    2011-02-01

    Full Text Available This paper presents the design and development of a prototype sensor unit for implementation in a long-term glucose monitoring system suitable for estimating glucose levels in people suffering from diabetes mellitus. The system utilizes osmotic pressure as the sensing mechanism and consists of a sensor prototype integrated with a pre-amplifier and data acquisition unit for both data recording and processing. The sensor prototype is based on an embedded silicon absolute pressure transducer and a semipermeable nanoporous membrane enclosed in the sensor housing. The glucose monitoring system facilitates the integration of a low-power microcontroller combined with a wireless inductively powered communication link. Experimental verification has shown that the system is capable of tracking osmotic pressure changes using albumin as a model compound, thereby providing a proof of concept for novel long-term tracking of blood glucose from remote sensor nodes.

  3. Estimating spacecraft attitude based on in-orbit sensor measurements

    DEFF Research Database (Denmark)

    Jakobsen, Britt; Lyn-Knudsen, Kevin; Mølgaard, Mathias

    2014-01-01

    of 2014/15. To better evaluate the performance of the payload, it is desirable to couple the payload data with the satellite's orientation. With AAUSAT3 already in orbit it is possible to collect data directly from space in order to evaluate the performance of the attitude estimation. An extended Kalman...... filter (EKF) is used for quaternion-based attitude estimation. A Simulink simulation environment developed for AAUSAT3, containing a "truth model" of the satellite and the orbit environment, is used to test the performance. The performance is tested using different sensor noise parameters obtained both...... from a controlled environment on Earth as well as in-orbit. By using sensor noise parameters obtained on Earth as the expected parameters in the attitude estimation, and simulating the environment using the sensor noise parameters from space, it is possible to assess whether the EKF can be designed...
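
The quaternion propagation at the heart of such an EKF's prediction step can be sketched as below (scalar-first convention; the gyro-bias states and covariance update are omitted, and the function names are illustrative, not AAUSAT3 code):

```python
import math

def quat_mult(q, r):
    """Hamilton product of two quaternions, scalar-first (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def propagate(q, omega, dt):
    """One prediction step: rotate attitude q by the body rate omega (rad/s) over dt."""
    wx, wy, wz = omega
    angle = math.sqrt(wx*wx + wy*wy + wz*wz) * dt
    if angle == 0.0:
        return q
    s = math.sin(angle / 2) / (angle / dt)  # sin(angle/2) / |omega|
    dq = (math.cos(angle / 2), wx * s, wy * s, wz * s)
    return quat_mult(q, dq)
```

Propagating the identity quaternion at ω_z = π/2 rad/s for one second yields a 90° rotation about z, i.e. q ≈ (cos 45°, 0, 0, sin 45°).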

  4. PRETTY: Grazing altimetry measurements based on the interferometric method

    DEFF Research Database (Denmark)

    Høeg, Per; Fragner, Heinrich; Dielacher, Andreas

    2017-01-01

    The exploitation of signals stemming from global navigation systems for passive bistatic radar applications has been proposed and implemented within numerous studies. The fact that such missions do not rely on high power amplifiers and that the need of high gain antennas with large geometrical...... dimensions can be avoided, makes them suitable for small satellite missions. Applications where a continuous high coverage is needed, as for example disaster warning, have the demand for a large number of satellites in orbit, which in turn requires small and relatively low cost satellites. The proposed PRETTY...... (Passive Reflectometry and Dosimetry) mission includes a demonstrator payload for passive reflectometry and scatterometry focusing on very low incidence angles whereby the direct and reflected signal will be received via the same antenna. The correlation of both signals will be done by a specific FPGA based...

  5. Efficient iris texture analysis method based on Gabor ordinal measures

    Science.gov (United States)

    Tajouri, Imen; Aydi, Walid; Ghorbel, Ahmed; Masmoudi, Nouri

    2017-07-01

    With the growing interest in security, iris recognition stands as one of the most versatile techniques for biometric identification and authentication, mainly owing to every individual's unique iris texture. An efficient approach to feature extraction is proposed. First, the iris zigzag "collarette" is extracted from the rest of the image by means of the circular Hough transform, as it includes the most significant regions of the iris texture. Second, the linear Hough transform is used for eyelid detection while the median filter is applied for eyelash removal. Then, a technique combining the richness of Gabor features and the compactness of ordinal measures is implemented for feature extraction, so that a discriminative feature representation for every individual can be achieved. Subsequently, the modified Hamming distance is used for matching. The proposed procedure proves reliable compared to some state-of-the-art approaches, with recognition rates of 99.98%, 98.12%, and 95.02% on the CASIAV1.0, CASIAV3.0, and IIT Delhi V1 iris databases, respectively.
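
The abstract's "modified Hamming distance" is not specified; the standard masked fractional Hamming distance used in iris matching (a baseline the modification presumably builds on) compares only the bits that are valid in both codes:

```python
def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fractional Hamming distance over bits valid in both iris codes.

    Codes and masks are equal-length sequences of 0/1 ints; a mask bit of 1
    means the code bit is usable (not occluded by eyelids or eyelashes).
    """
    valid = disagree = 0
    for a, b, ma, mb in zip(code_a, code_b, mask_a, mask_b):
        if ma and mb:
            valid += 1
            disagree += a ^ b
    return disagree / valid if valid else 1.0
```

Distances near 0 indicate the same iris; two unrelated codes disagree on roughly half their valid bits, giving a distance near 0.5.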

  6. UAV BASED BRDF-MEASUREMENTS OF AGRICULTURAL SURFACES WITH PFIFFIKUS

    Directory of Open Access Journals (Sweden)

    G. J. Grenzdörffer

    2012-09-01

    Full Text Available BRDF is a common problem in remote sensing and also in oblique photogrammetry. Common approaches to BRDF measurement with a field goniometer are costly and rather cumbersome. UAVs may offer an interesting alternative by using a special flight pattern of oblique and converging images. The main part of this paper describes a photogrammetric workflow to determine the anisotropic reflection properties of a given surface. Due to the relatively low flying heights, standard procedures from close range photogrammetry were adopted for outdoor usage. The photogrammetric processing delivered automatic and highly accurate orientation information with the aid of coded targets. The interior orientation of the consumer grade camera is more or less stable. The radiometrically corrected oblique images are converted into ortho photos, and the azimuth and elevation angle of every point may then be computed. The calculated anisotropy of a winter wheat plot is shown. A system of four diagonally-looking cameras ("Four Vision") and an additional nadir-looking camera is under development. The multi-camera system is especially designed for a micro-UAV with a payload of at least 1 kg. The system is composed of five industrial digital frame cameras (1.3 Mpix CCD chips, 15 fps) with fixed lenses. Special problems with the construction of a lightweight housing for the multi-camera solution are also covered in the paper.

  7. Evaluation of the Relationship between Literacy and Mathematics Skills as Assessed by Curriculum-Based Measures

    Science.gov (United States)

    Rutherford-Becker, Kristy J.; Vanderwood, Michael L.

    2009-01-01

    The purpose of this study was to evaluate the extent to which reading performance (as measured by curriculum-based measures [CBM] of oral reading fluency [ORF] and Maze reading comprehension) is related to math performance (as measured by CBM math computation and applied math). Additionally, this study examined which of the two reading measures was a…

  8. An evaluation of IASI-NH3 with ground-based Fourier transform infrared spectroscopy measurements

    Directory of Open Access Journals (Sweden)

    E. Dammers

    2016-08-01

    Full Text Available Global distributions of atmospheric ammonia (NH3) measured with satellite instruments such as the Infrared Atmospheric Sounding Interferometer (IASI) contain valuable information on NH3 concentrations and variability in regions not yet covered by ground-based instruments. Due to their large spatial coverage and (bi-)daily overpasses, the satellite observations have the potential to increase our knowledge of the distribution of NH3 emissions and associated seasonal cycles. However, the observations remain poorly validated, with only a handful of available studies, often using only surface measurements without any vertical information. In this study, we present the first validation of the IASI-NH3 product using ground-based Fourier transform infrared spectroscopy (FTIR) observations. Using a recently developed consistent retrieval strategy, NH3 concentration profiles have been retrieved using observations from nine Network for the Detection of Atmospheric Composition Change (NDACC) stations around the world between 2008 and 2015. We demonstrate the importance of strict spatio-temporal collocation criteria for the comparison. Large differences in the regression results are observed for changing intervals of spatial criteria, mostly due to terrain characteristics and the short lifetime of NH3 in the atmosphere. The seasonal variations of both datasets are consistent for most sites. Correlations are found to be high at sites in areas with considerable NH3 levels, whereas correlations are lower at sites with low atmospheric NH3 levels close to the detection limit of the IASI instrument. A combination of the observations from all sites (Nobs = 547) gives a mean relative difference of −32.4 ± 56.3 %, a correlation r of 0.8, and a slope of 0.73. These results give an improved estimate of the IASI-NH3 product performance compared to the previous upper-bound estimates (−50 to +100 %).
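
The strict spatio-temporal collocation the study emphasizes can be sketched as a filter over satellite pixels near a station. The 50 km / 3 h thresholds below are placeholder values, since the paper varies these intervals, and the tuple layout is an assumption:

```python
import math

def collocate(sat_obs, station, max_km=50.0, max_hours=3.0):
    """Select satellite pixels within a distance and time window of a station.

    sat_obs: iterable of (lat, lon, time_h, nh3) tuples.
    station: (lat, lon, time_h) of the ground-based FTIR measurement.
    """
    def haversine_km(lat1, lon1, lat2, lon2):
        # great-circle distance on a spherical Earth of radius 6371 km
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = p2 - p1, math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    s_lat, s_lon, s_t = station
    return [obs for obs in sat_obs
            if haversine_km(obs[0], obs[1], s_lat, s_lon) <= max_km
            and abs(obs[2] - s_t) <= max_hours]
```

Tightening `max_km` trades sample size against representativeness, which is exactly the sensitivity to collocation intervals the paper reports.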

  9. Detecting concealed information in less than a second: response latency-based measures

    NARCIS (Netherlands)

    Verschuere, B.; de Houwer, J.; Verschuere, B.; Ben-Shakhar, G.; Meijer, E.

    2011-01-01

    Concealed information can be accurately assessed with physiological measures. To overcome the practical limitations of physiological measures, an assessment using response latencies has been proposed. At first sight, research findings on response latency based concealed information tests seem

  10. Statistical shape modeling based renal volume measurement using tracked ultrasound

    Science.gov (United States)

    Pai Raikar, Vipul; Kwartowitz, David M.

    2017-03-01

    Autosomal dominant polycystic kidney disease (ADPKD) is the fourth most common cause of kidney transplant worldwide, accounting for 7-10% of all cases. Although ADPKD usually progresses over many decades, accurate risk prediction is an important task. Identifying patients with progressive disease is vital to providing them with new treatments being developed and enabling them to enter clinical trials for new therapy. Among other factors, total kidney volume (TKV) is a major biomarker predicting the progression of ADPKD. The Consortium for Radiologic Imaging Studies in Polycystic Kidney Disease (CRISP) has shown that TKV is an early and accurate measure of cystic burden and likely growth rate, and that it is strongly associated with loss of renal function. While ultrasound (US) has proven an excellent tool for diagnosing the disease, monitoring short-term changes using ultrasound has been shown to be inaccurate, which is attributed to high operator variability and poor reproducibility compared to tomographic modalities such as CT and MR (the gold standard). Ultrasound has emerged as a standout modality for intra-procedural imaging, and methods for spatial localization have afforded us the ability to track 2D ultrasound in the physical space in which it is used. In addition, the vast amount of recorded tomographic data can be used to generate statistical shape models that allow us to extract clinical value from archived image sets. In this work, we aim to improve the prognostic value of US in managing ADPKD by assessing the accuracy of using statistical shape model augmented US data to predict TKV, with the end goal of monitoring short-term changes.

  11. Customized DSP-based vibration measurement for wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    LaWhite, N.E.; Cohn, K.E. [Second Wind Inc., Somerville, MA (United States)

    1996-12-31

    As part of its Advanced Distributed Monitoring System (ADMS) project funded by NREL, Second Wind Inc. is developing a new vibration measurement system for use with wind turbines. The system uses low-cost accelerometers originally designed for automobile airbag crash-detection coupled with new software executed on a Digital Signal Processor (DSP) device. The system is envisioned as a means to monitor the mechanical "health" of the wind turbine over its lifetime. In addition the system holds promise as a customized emergency vibration detector. The two goals are very different and it is expected that different software programs will be executed for each function. While a fast Fourier transform (FFT) signature under given operating conditions can yield much information regarding turbine condition, the sampling period and processing requirements make it inappropriate for emergency condition monitoring. This paper briefly reviews the development of prototype DSP and accelerometer hardware. More importantly, it reviews our work to design prototype vibration alarm filters. Two-axis accelerometer test data from the experimental FloWind vertical axis wind turbine is analyzed and used as a development guide. Two levels of signal processing are considered. The first uses narrow band pre-processing filters at key fundamental frequencies such as the 1P, 2P and 3P. The total vibration energy in each frequency band is calculated and evaluated as a possible alarm trigger. In the second level of signal processing, the total vibration energy in each frequency band is further decomposed using the two-axis directional information. Directional statistics are calculated to differentiate between linear translations and circular translations. After analyzing the acceleration statistics for normal and unusual operating conditions, the acceleration processing system described could be used in automatic early detection of fault conditions. 9 figs.
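
The first level of signal processing — total vibration energy in a narrow band around a fundamental such as 1P, 2P or 3P — can be sketched with a direct DFT. This is illustrative only: a DSP implementation would use an FFT, and the band edges here are an assumption:

```python
import cmath
import math

def band_energy(samples, fs, center_hz, half_width_hz):
    """Vibration energy in a narrow frequency band via a direct DFT.

    O(N^2), which is fine for short accelerometer frames; fs is the
    sampling rate in Hz.
    """
    n = len(samples)
    lo = max(0, int((center_hz - half_width_hz) * n / fs))
    hi = min(n // 2, int(math.ceil((center_hz + half_width_hz) * n / fs)))
    energy = 0.0
    for k in range(lo, hi + 1):
        x_k = sum(s * cmath.exp(-2j * math.pi * k * i / n) for i, s in enumerate(samples))
        energy += abs(x_k) ** 2
    return energy
```

An alarm trigger would then compare the energy in each monitored band (e.g. around the 1P rotor frequency) against a threshold learned from normal operating conditions.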

  12. An explicit semantic relatedness measure based on random walk

    Directory of Open Access Journals (Sweden)

    HU Sihui

    2016-10-01

    Full Text Available The calculation of semantic relatedness in an open-domain knowledge network is a significant issue. In this paper, a pheromone strategy drawn from the ant colony algorithm is integrated into a random walk, which is taken as the basic framework for calculating the degree of semantic relatedness. The pheromone distribution is taken as a criterion for determining the tightness of semantic relatedness. A method of calculating the semantic relatedness degree based on random walk is proposed, and the exploration process of calculating the semantic relatedness degree is presented in an explicit way. The method mainly contains a Path Select Model (PSM) and a Semantic Relatedness Computing Model (SRCM). PSM is used to simulate the path selection of ants and pheromone release. SRCM is used to calculate the semantic relatedness by utilizing the information returned by ants. The results indicate that the method can complete the semantic relatedness calculation in linear complexity and extends the feasible strategies for semantic relatedness calculation.
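
One illustrative reading of PSM/SRCM (the abstract does not give the models' equations, so everything below is an assumption) is to let random walkers deposit pheromone on the target node, weighting shorter chains more heavily:

```python
import random

def pheromone_relatedness(graph, source, target, n_ants=300, max_steps=6, seed=1):
    """Estimate relatedness as pheromone deposited on the target by random walkers.

    graph: dict node -> list of neighbor nodes. Each ant walks at most
    max_steps; an ant that reaches the target deposits pheromone inversely
    proportional to its path length (shorter chains -> tighter relatedness).
    """
    rng = random.Random(seed)
    pheromone = 0.0
    for _ in range(n_ants):
        node = source
        for step in range(1, max_steps + 1):
            neighbors = graph.get(node, [])
            if not neighbors:
                break
            node = rng.choice(neighbors)
            if node == target:
                pheromone += 1.0 / step
                break
    return pheromone / n_ants
```

Because each ant takes a bounded number of steps, the cost grows linearly in the number of ants, consistent with the linear complexity the paper claims for its method.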

  13. Characterizing GEO Titan IIIC Transtage Fragmentations Using Ground-based and Telescopic Measurements

    Science.gov (United States)

    Cowardin, H.; Anz-Meador, P.; Reyes, J. A.

    In a continued effort to better characterize the geosynchronous orbit (GEO) environment, NASA’s Orbital Debris Program Office (ODPO) utilizes various ground-based optical assets to acquire photometric and spectral data of known debris associated with fragmentations in or near GEO. The Titan IIIC Transtage upper stage is known to have fragmented four times. Two of the four fragmentations were in GEO while the Transtage fragmented a third time in GEO transfer orbit. The fourth fragmentation occurred in low Earth orbit. To better assess and characterize these fragmentations, the NASA ODPO acquired a Titan Transtage test and display article previously in the custody of the 309th Aerospace Maintenance and Regeneration Group (AMARG) in Tucson, Arizona. After initial inspections at AMARG demonstrated that it was of sufficient fidelity to be of interest, the test article was brought to NASA Johnson Space Center (JSC) to continue material analysis and historical documentation. The Transtage has undergone two separate spectral measurement campaigns to characterize the reflectance spectroscopy of historical aerospace materials. These data have been incorporated into the NASA Spectral Database, with the goal of using telescopic data comparisons for potential material identification. A Light Detection and Ranging (LIDAR) system scan also has been completed and a scale model has been created for use in the Optical Measurement Center (OMC) for photometric analysis of an intact Transtage, including bidirectional reflectance distribution function (BRDF) measurements. A historical overview of the Titan IIIC Transtage, the analysis completed to date, and the future work in support of characterizing the GEO and near-GEO orbital debris environment will be discussed in the subsequent presentation.

  14. A Clustering-Oriented Closeness Measure Based on Neighborhood Chain and Its Application in the Clustering Ensemble Framework Based on the Fusion of Different Closeness Measures

    Directory of Open Access Journals (Sweden)

    Shaoyi Liang

    2017-09-01

    Full Text Available Closeness measures are crucial to clustering methods. In most traditional clustering methods, the closeness between data points or clusters is measured by geometric distance alone. These metrics quantify closeness only based on the concerned data points’ positions in the feature space, and they can cause problems when dealing with clustering tasks that have arbitrary cluster shapes and differing cluster densities. In this paper, we first propose a novel Closeness Measure between data points based on the Neighborhood Chain (CMNC). Instead of using geometric distance alone, CMNC measures the closeness between data points by quantifying the difficulty for one data point to reach another through a chain of neighbors. Furthermore, based on CMNC, we also propose a clustering ensemble framework that combines CMNC and geometric-distance-based closeness measures in order to utilize the advantages of both. In this framework, the “bad data points” that are hard to cluster correctly are identified; then different closeness measures are applied to different types of data points to obtain the unified clustering results. With the fusion of different closeness measures, the framework achieves not only better clustering results in complicated clustering tasks, but also higher efficiency.
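
CMNC itself is defined in the paper; a simplified reading of its core idea — quantifying "the difficulty for one data point to reach another through a chain of neighbors" as hop count on a k-nearest-neighbor graph — can be sketched as:

```python
from collections import deque

def knn_graph(points, k):
    """Directed k-nearest-neighbor graph over points (Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    edges = {}
    for i, p in enumerate(points):
        others = sorted((j for j in range(len(points)) if j != i),
                        key=lambda j: dist2(p, points[j]))
        edges[i] = others[:k]
    return edges

def chain_closeness(points, i, j, k=2):
    """Closeness as the negated number of neighbor hops needed to reach j from i.

    Unreachable pairs get -inf: they are maximally hard to chain together.
    """
    edges = knn_graph(points, k)
    seen, queue = {i}, deque([(i, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == j:
            return -hops
        for nxt in edges[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return float("-inf")
```

Unlike raw Euclidean distance, this makes two points in the same dense cluster closer than two points the same distance apart but separated by a gap, which is exactly the behavior needed for arbitrary cluster shapes.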

  15. THE MEASUREMENT BASES AND THE ANALYSIS OF THOSE FOR QUALITATIVE CHARACTERISTICS OF FINANCIAL STATEMENTS

    OpenAIRE

    Hikmet Ulusan

    2008-01-01

    The measurement bases of assets and liabilities for financial reporting basically include historical cost, replacement cost, net realizable value, value in use, deprival value and fair value. The first part of this study deals with the measurement bases of assets and liabilities for financial reporting. In the second part, the measurement bases for the qualitative characteristics that determine the usefulness of information provided in financial statements are analyzed.

  16. A Laser-Based Measuring System for Online Quality Control of Car Engine Block

    Directory of Open Access Journals (Sweden)

    Xing-Qiang Li

    2016-11-01

    Full Text Available For online quality control of car engine production, the pneumatic measurement instrument plays an irreplaceable role in measuring diameters inside the engine block because of its portability and high accuracy. Owing to its measuring principle, however, the working space between the pneumatic device and the measured surface is so small that manual operation is required. This lowers the measuring efficiency and is an obstacle to automatic measurement. In this article, a high-speed, automatic measuring system is proposed to take the place of pneumatic devices by using a laser-based measuring unit. The measuring unit is considered as a set of several measuring modules, each of which acts like a single bore gauge and is made of four laser triangulation sensors (LTSs) installed at different positions and in opposite directions. The spatial relationship among these LTSs was calibrated before measurement. Sampling points from the measured shaft holes can be collected by the measuring unit. A unified mathematical model was established for both calibration and measurement. Based on the established model, the relative pose between the measuring unit and the measured workpiece does not affect the measuring accuracy. This frees the measuring unit from accurate positioning or adjustment, and makes fast, automatic measurement possible. The proposed system and method were validated by experiments.
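
With four LTSs mounted in opposite directions, a bore diameter along one axis follows from two opposing range readings plus a calibrated baseline between the sensors' reference points. This simplified geometry (the paper's unified calibration model is more involved, and the function names are illustrative) can be sketched as:

```python
def bore_diameter(baseline_mm, d1_mm, d2_mm):
    """Diameter along one axis from two opposing laser triangulation sensors.

    baseline_mm: calibrated distance between the two sensors' reference points.
    d1_mm, d2_mm: each sensor's measured range to the bore wall.
    """
    return baseline_mm + d1_mm + d2_mm

def bore_measurement(baseline_mm, readings):
    """Combine the two perpendicular axes of a four-LTS measuring module."""
    (a1, a2), (b1, b2) = readings
    da = bore_diameter(baseline_mm, a1, a2)
    db = bore_diameter(baseline_mm, b1, b2)
    # mean diameter plus a rough out-of-roundness indicator
    return (da + db) / 2, abs(da - db)
```

Because only the sum of the two ranges enters the result, a small lateral offset of the module inside the bore largely cancels, which mirrors the paper's claim that the relative pose does not affect accuracy.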

  17. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    Science.gov (United States)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  18. Repeat immigration: A previously unobserved source of heterogeneity?

    Science.gov (United States)

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  19. A comparison of manual anthropometric measurements with Kinect-based scanned measurements in terms of precision and reliability.

    Science.gov (United States)

    Bragança, Sara; Arezes, Pedro; Carvalho, Miguel; Ashdown, Susan P; Castellucci, Ignacio; Leão, Celina

    2018-01-01

    Collecting anthropometric data for real-life applications demands a high degree of precision and reliability, so it is important to test new equipment that will be used for data collection. OBJECTIVE: Compare two anthropometric data-gathering techniques - manual methods and a Kinect-based 3D body scanner - to understand which of them gives more precise and reliable results. The data were collected using a measuring tape and a Kinect-based 3D body scanner, and were evaluated in terms of precision by considering the regular and relative Technical Error of Measurement, and in terms of reliability by using the Intraclass Correlation Coefficient, Reliability Coefficient, Standard Error of Measurement and Coefficient of Variation. The results obtained showed that both methods presented better results for reliability than for precision. Both methods showed relatively good results for these two variables; however, manual methods had better results for some body measurements. Despite being considered sufficiently precise and reliable for certain applications (e.g. the apparel industry), the 3D scanner tested showed, for almost every anthropometric measurement, a different result than the manual technique. Many companies design their products based on data obtained from 3D scanners; hence, understanding the precision and reliability of the equipment used is essential to obtain feasible results.
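The absolute and relative Technical Error of Measurement named in this record follow standard formulas; a short sketch with invented repeated measurements:

```python
import math

def tem(first, second):
    """Absolute Technical Error of Measurement for two repeated trials:
    TEM = sqrt( sum(d_i^2) / (2n) ), with d_i the between-trial difference."""
    n = len(first)
    ss = sum((a - b) ** 2 for a, b in zip(first, second))
    return math.sqrt(ss / (2 * n))

def relative_tem(first, second):
    """TEM expressed as a percentage of the grand mean (%TEM)."""
    grand_mean = (sum(first) + sum(second)) / (2 * len(first))
    return 100.0 * tem(first, second) / grand_mean

# Invented waist-circumference repeats (cm) for five subjects
trial1 = [72.1, 85.4, 90.0, 66.8, 78.2]
trial2 = [72.5, 85.0, 90.6, 66.4, 78.0]
print(round(tem(trial1, trial2), 3), round(relative_tem(trial1, trial2), 2))  # -> 0.297 0.38
```

The relative form lets differently sized measurements (e.g. waist vs. wrist girth) be compared on a common percentage scale, which is why both variants are reported in studies like this one.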

  20. Automatic calibration system of the temperature instrument display based on computer vision measuring

    Science.gov (United States)

    Li, Zhihong; Li, Jinze; Bao, Changchun; Hou, Guifeng; Liu, Chunxia; Cheng, Fang; Xiao, Nianxin

    2010-07-01

    With the development of computers and of techniques for image processing and computer-based optical measurement, various measuring techniques built on optical image processing have gradually matured and come into practical use. On this basis, drawing on many years of experience in temperature measurement and computer vision measurement, and on practical needs, we propose a fully automatic calibration approach for temperature instrument displays that integrates computer vision measuring techniques. It synchronizes the acquisition of displayed readings with the theoretical temperature values and improves calibration efficiency. Based on the least-squares fitting principle, it integrates data processing with optimization theory to realize rapid and accurate automatic acquisition and calibration of temperature readings.
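The least-squares fitting step at the heart of such a calibration can be sketched as follows, with invented reference/display pairs (in the actual system the displayed value would be read off by the vision subsystem):

```python
import numpy as np

# Invented calibration pairs: reference temperature (C) versus the value
# read off the instrument display by the vision system.
reference = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
displayed = np.array([0.4, 25.1, 50.3, 74.8, 99.6])

# Least-squares straight-line fit: displayed ~ a * reference + b
a, b = np.polyfit(reference, displayed, 1)
residuals = displayed - (a * reference + b)
print(round(a, 4), round(b, 4), round(float(np.max(np.abs(residuals))), 3))  # -> 0.9924 0.42 0.26
```

The fitted slope and offset characterise the instrument's systematic error, while the largest residual flags the calibration point that deviates most from linear behaviour.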

  1. Utility of the Canadian Occupational Performance Measure as an admission and outcome measure in interdisciplinary community-based geriatric rehabilitation

    DEFF Research Database (Denmark)

    Larsen, Anette Enemark; Carlsson, Gunilla

    2012-01-01

    In a community-based geriatric rehabilitation project, the Canadian Occupational Performance Measure (COPM) was used to develop a coordinated, interdisciplinary, and client-centred approach focusing on occupational performance. The purpose of this study was to evaluate the utility of the COPM as ...... physician, home care, occupational therapy, physiotherapy...

  2. Testing for Distortions in Performance Measures: An Application to Residual Income Based Measures like Economic Value Added

    NARCIS (Netherlands)

    Sloof, R.; van Praag, M.

    2015-01-01

    Distorted performance measures in compensation contracts elicit suboptimal behavioral responses that may even prove to be dysfunctional (gaming). This paper applies the empirical test developed by Courty and Marschke (2008) to detect whether the widely used class of Residual Income based performance

  3. 10 CFR Appendix W to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Medium Base Compact Fluorescent Lamps

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Uniform Test Method for Measuring the Energy Consumption of Medium Base Compact Fluorescent Lamps W Appendix W to Subpart B of Part 430 Energy DEPARTMENT OF... the previous step. Round the final energy consumption value, as applicable, to the nearest decimal...

  4. Measurement of left ventricular torsion using block-matching-based speckle tracking for two-dimensional echocardiography

    Science.gov (United States)

    Sun, Feng-Rong; Wang, Xiao-Jing; Wu, Qiang; Yao, Gui-Hua; Zhang, Yun

    2013-01-01

    Left ventricular (LV) torsion is a sensitive and global index of LV systolic and diastolic function, but measuring it noninvasively is challenging. Two-dimensional echocardiography and a block-matching-based speckle tracking method were used to measure LV torsion. The main advantages of the proposed method over previous ones are as follows: (1) the method is automatic, except for manually selecting some endocardium points on the end-diastolic frame in the initialization step; (2) a diamond search strategy is applied, with a spatial smoothness constraint introduced into the sum-of-absolute-differences matching criterion, and the reference frame during the search is determined adaptively; (3) the method is capable of removing abnormal measurement data automatically. The proposed method was validated against measurements using Doppler tissue imaging, and some preliminary clinical experimental studies are presented to illustrate the clinical value of the proposed method.
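A toy version of sum-of-absolute-differences block matching with a displacement smoothness penalty (omitting the paper's diamond search and adaptive reference-frame selection) might look like this; all names, data, and the penalty weight are illustrative:

```python
import numpy as np

def sad_match(ref_block, frame, top_left, search=4, smooth=0.5):
    """Find the displacement of ref_block in frame by exhaustive SAD search.

    A quadratic penalty on displacement magnitude acts as a simple spatial
    smoothness constraint added to the matching cost.
    """
    h, w = ref_block.shape
    r0, c0 = top_left
    best, best_d = None, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = r0 + dr, c0 + dc
            if r < 0 or c < 0 or r + h > frame.shape[0] or c + w > frame.shape[1]:
                continue  # candidate window would fall outside the frame
            cand = frame[r:r + h, c:c + w]
            cost = np.abs(cand - ref_block).sum() + smooth * (dr * dr + dc * dc)
            if best is None or cost < best:
                best, best_d = cost, (dr, dc)
    return best_d

rng = np.random.default_rng(0)
frame0 = rng.random((64, 64))
# Next "frame": the whole speckle pattern shifted down 2 px and right 3 px
frame1 = np.roll(np.roll(frame0, 2, axis=0), 3, axis=1)
block = frame0[20:28, 20:28]
print(sad_match(block, frame1, (20, 20)))  # -> (2, 3)
```

Real speckle tracking applies this per tracked point per frame pair; the diamond search in the paper reduces the exhaustive candidate set above to a handful of evaluations per step.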

  5. Measurement of Negativity Bias in Personal Narratives Using Corpus-Based Emotion Dictionaries

    Science.gov (United States)

    Cohen, Shuki J.

    2011-01-01

    This study presents a novel methodology for the measurement of negativity bias using positive and negative dictionaries of emotion words applied to autobiographical narratives. At odds with the cognitive theory of mood dysregulation, previous text-analytical studies have failed to find significant correlation between emotion dictionaries and…

  6. An efficient binomial model-based measure for sequence comparison and its application.

    Science.gov (United States)

    Liu, Xiaoqing; Dai, Qi; Li, Lihua; He, Zerong

    2011-04-01

    Sequence comparison is one of the major tasks in bioinformatics, as it can serve as evidence of structural and functional conservation, as well as of evolutionary relations. There are several similarity/dissimilarity measures for sequence comparison, but challenges remain. This paper presents a binomial model-based measure to analyze biological sequences. With the help of a random indicator, the occurrence of a word at any position of a sequence can be regarded as a Bernoulli random variable, and the distribution of the sum of word occurrences is well known to be binomial. Using a recursive formula, we computed the binomial probability of the word count and proposed a binomial model-based measure based on the relative entropy. The proposed measure was tested in extensive experiments, including classification of HEV genotypes and phylogenetic analysis, and further compared with alignment-based and alignment-free measures. The results demonstrate that the proposed measure based on the binomial model is more efficient.
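A minimal sketch of the idea, under the simplifying assumption that each k-word occurs at any position with probability (1/4)^k and ignoring the dependence between overlapping positions (the paper's recursive computation and exact normalisation may differ):

```python
import math
from itertools import product

def word_count_probs(seq, k, alphabet="ACGT"):
    """Binomial probability of each k-word's observed count in seq.

    Treats the occurrence of a word at any of the n = len(seq)-k+1
    positions as a Bernoulli trial with p = (1/|alphabet|)^k, so the
    count is modelled as Binomial(n, p). Dependence between overlapping
    positions is ignored -- a deliberate simplification.
    """
    n = len(seq) - k + 1
    p = (1.0 / len(alphabet)) ** k
    probs = []
    for word in ("".join(w) for w in product(alphabet, repeat=k)):
        c = sum(seq[i:i + k] == word for i in range(n))
        probs.append(math.comb(n, c) * p**c * (1.0 - p) ** (n - c))
    total = sum(probs)
    return [q / total for q in probs]  # normalise into a distribution

def sym_relative_entropy(pa, pb, eps=1e-12):
    """Symmetrised Kullback-Leibler divergence between two distributions."""
    kl = lambda u, v: sum(ui * math.log((ui + eps) / (vi + eps))
                          for ui, vi in zip(u, v))
    return kl(pa, pb) + kl(pb, pa)

s1 = "ACGTACGTACGGTACA"
s2 = "ACGTTTTTACGGGGGA"
d_self = sym_relative_entropy(word_count_probs(s1, 2), word_count_probs(s1, 2))
d_other = sym_relative_entropy(word_count_probs(s1, 2), word_count_probs(s2, 2))
print(d_self == 0.0, d_other > 0.0)  # -> True True
```

The measure is zero for a sequence against itself and positive for sequences with differing word-count profiles, which is the property a dissimilarity measure for phylogenetic analysis needs.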

  7. Electron beam based transversal profile measurements of intense ion beams

    International Nuclear Information System (INIS)

    El Moussati, Said

    2014-01-01

    A non-invasive diagnostic method for the experimental determination of the transverse profile of an intense ion beam has been developed and investigated, theoretically as well as experimentally, within the framework of the present work. The method is based on the deflection of electrons passing through the electromagnetic field of an ion beam. To achieve this, an electron beam with a specifically prepared transverse profile is employed. This distinguishes the method from similar ones that use thin electron beams to scan the electromagnetic field [Roy et al. 2005; Blockland10]. The diagnostic method presented in this work will subsequently be called ''Electron-Beam-Imaging'' (EBI). First of all, the influence of the electromagnetic field of the ion beam on the electrons has been analyzed theoretically. It was found that the magnetic field causes only a shift of the electrons along the ion beam axis, while the electric field causes a shift only in a plane transverse to the ion beam. Moreover, in the non-relativistic case the magnetic force is significantly smaller than the Coulomb force, so the electrons merely undergo a shift due to the magnetic field and continue to move parallel to their initial trajectory. Under the influence of the electric field, the electrons move away from the ion beam axis, and their resulting trajectory makes a definite angle with the original direction. This deflection angle depends practically only on the electric field of the ion beam; thus the magnetic field has been neglected when analysing the experimental data. The theoretical model provides a relationship between the deflection angle of the electrons and the charge distribution in the cross section of the ion beam. The model, however, can only be applied for small deflection angles. This implies a relationship between the line-charge density of the ion beam and the initial kinetic energy of the electrons. Numerical investigations have been carried out to clarify the

  8. THE OPTIMIZATION OF TECHNOLOGICAL MINING PARAMETERS IN QUARRY FOR DIMENSION STONE BLOCKS QUALITY IMPROVEMENT BASED ON PHOTOGRAMMETRIC TECHNIQUES OF MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Ruslan Sobolevskyi

    2018-01-01

    Full Text Available This research focuses on patterns of change in the quality of commercial dimension stone blocks, based on previously identified and measured geometrical parameters of natural cracks, and on modelling and planning the final dimensions of stone products and finished products using the proposed digital photogrammetric techniques. The optimal parameters of surveying are investigated, and the influence of the surveying distance on measured crack length and area is estimated. Rational technological parameters of dimension stone block production are taken into account.

  9. How Do Undergraduate Students Conceptualize Acid-Base Chemistry? Measurement of a Concept Progression

    Science.gov (United States)

    Romine, William L.; Todd, Amber N.; Clark, Travis B.

    2016-01-01

    We developed and validated a new instrument, called "Measuring Concept progressions in Acid-Base chemistry" (MCAB) and used it to better understand the progression of undergraduate students' understandings about acid-base chemistry. Items were developed based on an existing learning progression for acid-base chemistry. We used the Rasch…

  10. DNA methylation-based measures of biological age: meta-analysis predicting time to death

    Science.gov (United States)

    Chen, Brian H.; Marioni, Riccardo E.; Colicino, Elena; Peters, Marjolein J.; Ward-Caviness, Cavin K.; Tsai, Pei-Chien; Roetker, Nicholas S.; Just, Allan C.; Demerath, Ellen W.; Guan, Weihua; Bressler, Jan; Fornage, Myriam; Studenski, Stephanie; Vandiver, Amy R.; Moore, Ann Zenobia; Tanaka, Toshiko; Kiel, Douglas P.; Liang, Liming; Vokonas, Pantel; Schwartz, Joel; Lunetta, Kathryn L.; Murabito, Joanne M.; Bandinelli, Stefania; Hernandez, Dena G.; Melzer, David; Nalls, Michael; Pilling, Luke C.; Price, Timothy R.; Singleton, Andrew B.; Gieger, Christian; Holle, Rolf; Kretschmer, Anja; Kronenberg, Florian; Kunze, Sonja; Linseisen, Jakob; Meisinger, Christine; Rathmann, Wolfgang; Waldenberger, Melanie; Visscher, Peter M.; Shah, Sonia; Wray, Naomi R.; McRae, Allan F.; Franco, Oscar H.; Hofman, Albert; Uitterlinden, André G.; Absher, Devin; Assimes, Themistocles; Levine, Morgan E.; Lu, Ake T.; Tsao, Philip S.; Hou, Lifang; Manson, JoAnn E.; Carty, Cara L.; LaCroix, Andrea Z.; Reiner, Alexander P.; Spector, Tim D.; Feinberg, Andrew P.; Levy, Daniel; Baccarelli, Andrea; van Meurs, Joyce; Bell, Jordana T.; Peters, Annette; Deary, Ian J.; Pankow, James S.; Ferrucci, Luigi; Horvath, Steve

    2016-01-01

    Estimates of biological age based on DNA methylation patterns, often referred to as “epigenetic age” or “DNAm age”, have been shown to be robust biomarkers of age in humans. We previously demonstrated that independent of chronological age, epigenetic age assessed in blood predicted all-cause mortality in four human cohorts. Here, we expanded our original observation to 13 different cohorts for a total sample size of 13,089 individuals, including three racial/ethnic groups. In addition, we examined whether incorporating information on blood cell composition into the epigenetic age metrics improves their predictive power for mortality. All considered measures of epigenetic age acceleration were predictive of mortality (p≤8.2×10−9), independent of chronological age, even after adjusting for additional risk factors (p<5.4×10−4), and within the racial/ethnic groups that we examined (non-Hispanic whites, Hispanics, African Americans). Epigenetic age estimates that incorporated information on blood cell composition led to the smallest p-values for time to death (p=7.5×10−43). Overall, this study a) strengthens the evidence that epigenetic age predicts all-cause mortality above and beyond chronological age and traditional risk factors, and b) demonstrates that epigenetic age estimates that incorporate information on blood cell counts lead to highly significant associations with all-cause mortality. PMID:27690265
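Epigenetic age acceleration of the kind used in these analyses is commonly defined as the residual from regressing DNAm age on chronological age, which by construction is uncorrelated with chronological age itself. A sketch with invented values:

```python
import numpy as np

# Invented cohort: chronological age and DNA-methylation-based age estimates
chron = np.array([45.0, 52.0, 60.0, 61.0, 70.0, 74.0])
dnam = np.array([47.2, 50.1, 63.5, 59.0, 73.8, 72.5])

# Epigenetic age acceleration: the residual of DNAm age regressed on
# chronological age, so it carries no chronological-age information.
slope, intercept = np.polyfit(chron, dnam, 1)
accel = dnam - (slope * chron + intercept)

# Least-squares residuals sum to ~0 and are orthogonal to the regressor
print(abs(float(accel.sum())) < 1e-8, abs(float(np.dot(accel, chron))) < 1e-5)  # -> True True
```

A positive `accel` value marks an individual whose methylation-based age runs ahead of their chronological age; it is this residual, not raw DNAm age, that the meta-analysis relates to time to death.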

  11. Odour-based context reinstatement effects with indirect measures of memory: the curious case of rosemary.

    Science.gov (United States)

    Ball, Linden J; Shoker, Jaswinder; Miles, Jeremy N V

    2010-11-01

    Previous studies examining environmental context-dependent memory (ECDM) effects using indirect measures of memory have produced inconsistent findings. We report three experiments that examined ECDM in an indirect memory paradigm (word-fragment completion) using ambient odours as environmental contexts. Expt 1 manipulated the odour present at learning and testing (rosemary or lemon) to produce reinstated-context or switched-context conditions. Reinstating rosemary led to a striking ECDM effect, indicating that indirect memory testing can be sensitive to ECDM manipulations. Odour ratings also indicated that rosemary induced a more unpleasant mood in participants than lemon. Expt 2 assessed the influence on indirect retrieval of odour-based mood induction as well as odour distinctiveness, and indicated that rosemary's capacity to promote ECDM effects appears to arise from an additive combination of its unpleasantness-inducing properties and its distinctiveness. Expt 3 partially supported these proposals. Overall, our findings indicate that some odours are capable of producing ECDM effects using indirect testing procedures. Moreover, it appears that it is the inherent properties of odours on dimensions such as unpleasantness and distinctiveness that mediate the emergence of ECDM effects, thereby explaining the particular potency of rosemary's mnemonic influence when it is reinstated.

  12. Do Performance-based Health Measures Reflect Differences in Frailty Among Immigrants Age 50+ in Europe?

    Science.gov (United States)

    Brothers, Thomas D; Theou, Olga; Rockwood, Kenneth

    2014-09-01

    Life course influences, including country of residence and country of birth, are associated with frailty index scores. We investigated these associations using performance-based health measures. Among 33,745 participants age 50+ (mean age 64.8 ± 10.1; 55% women) in the Survey of Health, Ageing, and Retirement in Europe, grip strength, delayed word recall, and semantic verbal fluency were assessed. Participants were grouped by country of residence (Northern/Western Europe or Southern/Eastern Europe), and by country of birth (native-born, immigrants born in low- and middle-income countries [LMICs], or immigrants born in high-income countries [HICs]). Participants in Southern/Eastern Europe had lower mean test scores than those in Northern/Western Europe, and their scores did not differ by country of birth group. In Northern/Western Europe, compared with native-born participants, LMIC-born immigrants demonstrated lower mean grip strength (32.8 ± 7.6 kg vs. 35.7 ± 7.7 kg), delayed recall (2.9 ± 1.9 vs. 3.6 ± 1.9), and verbal fluency scores (16.0 ± 6.9 vs. 20.3 ± 7.0). HIC-born immigrants had mean scores higher than LMIC-born immigrants, but lower than native-born participants. Performance-based health measures thus reflected the national income levels of both country of residence and country of birth. This was similar to previously observed differences in frailty index scores.

  13. Software development based on high speed PC oscilloscope for automated pulsed magnetic field measurement system

    International Nuclear Information System (INIS)

    Sun Yuxiang; Shang Lei; Li Ji; Ge Lei

    2011-01-01

    This paper introduces the development of software, based on a high-speed PC oscilloscope, for an automated pulsed magnetic field measurement system. The design improves on the previous one, and a high-speed virtual oscilloscope has been used in this field for the first time. In the design, automatic data acquisition, data processing, data analysis and storage have been realized. Automated point checking reduces the workload, and the use of a precise motion bench increases the positioning accuracy. The software gets the data from the PC oscilloscope by calling DLLs and includes the functions of an oscilloscope, such as trigger, range and sample-rate settings. Spline interpolation and a bandstop filter are used to denoise the signals. The core of the software is a state machine which controls the motion of the stepper motors and the data acquisition, and stores the data automatically. NI Vision Acquisition Software and the Database Connectivity Toolkit make video surveillance of the laboratory and MySQL database connectivity available. The raw signal and the processed signal are compared in this paper; the waveform is greatly improved by the signal processing. (authors)

  14. Evaluation of stress gradient by x-ray stress measurement based on change in angle phi

    International Nuclear Information System (INIS)

    Sasaki, Toshihiko; Kuramoto, Makoto; Yoshioka, Yasuo.

    1985-01-01

    A new principle of X-ray stress evaluation for a sample with a steep stress gradient has been proposed. The feature of this method is that the stress is determined using the so-called phi-method, based on the change of the phi-angle, which has no effect on the penetration depth of the X-rays. The procedure is as follows: first, an average stress within the penetration depth of the X-rays is determined by changing only the phi-angle under a fixed psi-angle; then, a distribution of the average stress versus the penetration depth of the X-rays is obtained by repeating the same procedure at different psi-angles. The following conclusions were drawn from residual stress measurements on a carbon steel of type S 55 C polished with emery paper. The method is practical enough to use for a plane stress problem, and the assumption of a linear stress gradient adopted in the authors' previous investigations is valid. In the case of a triaxial stress analysis, the method is effective for the solution of the three shearing stresses; however, the three normal stresses cannot be solved completely except at particular psi-angles. (author)

  15. α-Cut method based importance measure for criticality analysis in fuzzy probability – Based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and the area defuzzification technique. •Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability – based fault tree analysis (FPFTA) has recently been developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applied in nuclear power plant probabilistic safety assessment, FPFTA needs its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked against the results generated by four well-known importance measures in conventional fault tree analysis. The results
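The α-cut operations named in the highlights can be illustrated for a two-event AND gate with triangular fuzzy probabilities; the defuzzifier below is a crude area-style average and the numbers are invented, so this is a sketch of the arithmetic involved, not the paper's exact importance measure:

```python
import numpy as np

def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tfn
    return a + alpha * (m - a), b - alpha * (b - m)

def interval_mul(x, y):
    """alpha-cut multiplication: extreme products of the interval endpoints."""
    p = (x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1])
    return min(p), max(p)

def defuzzify(cuts):
    """Crude area-style defuzzifier: mean interval midpoint over the alpha grid."""
    return float(np.mean([(lo + hi) / 2.0 for lo, hi in cuts]))

# Two basic events of an AND gate, given as triangular fuzzy probabilities
e1, e2 = (0.01, 0.02, 0.03), (0.001, 0.002, 0.003)
alphas = np.linspace(0.0, 1.0, 101)
top_cuts = [interval_mul(alpha_cut(e1, a), alpha_cut(e2, a)) for a in alphas]
top_crisp = defuzzify(top_cuts)
print(top_crisp)  # slightly above the nominal product 0.02 * 0.002 = 4e-5
```

Propagating the α-cut intervals through the gate and then defuzzifying yields a crisp top-event probability; an importance measure of the kind proposed in the paper would compare such defuzzified values with and without each basic event's contribution.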

  16. Quantum authentication based on the randomness of measurement bases in BB84

    International Nuclear Information System (INIS)

    Dang Minh Dung; Bellot, P.; Alleaume, R.

    2005-01-01

    Full text: The establishment of a secret key between two legitimate end points of a communication link, let us name them Alice and Bob, using quantum key distribution (QKD) is unconditionally secure thanks to the laws of quantum physics. However, the various QKD protocols do not provide authentication of the end points: Alice cannot be sure that she is communicating with Bob, and vice versa. Therefore, these protocols are subject to various attacks. The most obvious is the man-in-the-middle attack, in which an eavesdropper, let us name her Eve, stands in the middle of the communication link. Alice communicates with Eve while she thinks she is communicating with Bob, and Bob communicates with Eve while he thinks he is communicating with Alice. Eve, acting as a relay, can read all the communications between Alice and Bob and retransmit them. To prevent this kind of attack, the solution is to authenticate the two end points of the communication link. One solution is for Alice and Bob to share an authentication key prior to the communication. To improve security, Alice and Bob must share a set of one-time authentication keys. One-time means that each key is used only once, because each time a key is used the eavesdropper Eve can gain some information about it; re-using the same key many times would eventually reveal it to Eve. However, Eve can simulate the authentication process with Alice many times. Each time Eve simulates the authentication process, one of the pre-positioned keys is depleted, leading to the exhaustion of the set of pre-positioned keys. This type of attack is named a denial-of-service attack. In this work, we propose to use the randomness of the measurement bases in BB84 to build an authentication scheme based on the existence of a pre-positioned authentication key. This authentication scheme can be used with BB84 but also with any other quantum key distribution protocol.
    It is protected against the Denial of
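For background, the basis randomness this scheme builds on can be seen in a toy classical simulation of BB84 basis choice and sifting (no eavesdropper and no authentication step; the quantum channel is reduced to its measurement statistics):

```python
import random

random.seed(7)
n = 2000

# Alice's random bits and random basis choices (0 = rectilinear, 1 = diagonal)
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
# Bob measures each qubit in his own randomly chosen basis
bob_bases = [random.randint(0, 1) for _ in range(n)]
# Matching basis -> Bob recovers Alice's bit; otherwise his outcome is random
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only the positions where the bases happened to agree
sifted = [(a, b) for a, ab, bb, b in zip(alice_bits, alice_bases, bob_bases, bob_bits)
          if ab == bb]
errors = sum(a != b for a, b in sifted)
print(len(sifted), errors)  # about half the positions survive; no errors without Eve
```

Because roughly half the basis choices disagree and are discarded, any party who can predict or bias those choices gains an advantage; tying the basis sequence to a pre-positioned key is the kind of leverage the proposed authentication scheme exploits.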

  17. Research on Water Velocity Measurement of Reservoir Based on Pressure Sensor

    Directory of Open Access Journals (Sweden)

    Xiaoqiang Zhao

    2014-11-01

    Full Text Available To address the problem that a pressure sensor can normally only measure the liquid level in a reservoir, we designed a reservoir current-velocity measurement system based on a pressure sensor, analyzed the errors of the velocity measurement system, and proposed an error-processing method and the corresponding program. Several tests and experimental results show that in this measurement system the standard deviation of the liquid level measurement is no more than 0.01 cm, and the standard deviation of the current velocity measurement is no more than 0.35 mL/s, which demonstrates that the pressure sensor can measure both liquid level and current velocity synchronously.
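The record does not give the pressure-to-velocity relation it uses; a textbook inference of this kind rests on Bernoulli's equation for dynamic pressure, sketched below under that assumption:

```python
import math

def velocity_from_dp(delta_p, rho=1000.0):
    """Flow speed from dynamic pressure via Bernoulli: delta_p = rho * v**2 / 2.

    delta_p in pascals, rho in kg/m^3 (water by default); returns m/s.
    This is the standard relation such a pressure-based inference rests
    on, not necessarily the exact model used in the paper.
    """
    return math.sqrt(2.0 * delta_p / rho)

v = velocity_from_dp(50.0)  # 50 Pa of dynamic pressure in water
print(round(v, 4))  # -> 0.3162 m/s
```

Because velocity scales with the square root of the pressure difference, small pressure-measurement errors are attenuated at higher flow speeds but amplified near zero flow, which is why the system's error analysis matters.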

  18. Model-based wear measurements in total knee arthroplasty : development and validation of novel radiographic techniques

    NARCIS (Netherlands)

    IJsseldijk, van E.A.

    2016-01-01

    The primary aim of this work was to develop novel model-based mJSW measurement methods using a 3D reconstruction and compare the accuracy and precision of these methods to conventional mJSW measurement. This thesis contributed to the development, validation and clinical application of model-based

  19. Measuring individual significant change on the Beck Depression Inventory-II through IRT-based statistics.

    NARCIS (Netherlands)

    Brouwer, D.; Meijer, R.R.; Zevalkink, D.J.

    2013-01-01

    Several researchers have emphasized that item response theory (IRT)-based methods should be preferred over classical approaches in measuring change for individual patients. In the present study we discuss and evaluate the use of IRT-based statistics to measure statistically significant individual

  20. NO2 DOAS measurements from ground and space: comparison of ground based measurements and OMI data in Mexico City

    Science.gov (United States)

    Rivera, C.; Stremme, W.; Grutter, M.

    2012-04-01

    The combination of satellite data and ground based measurements can provide valuable information about atmospheric chemistry and air quality. In this work we present a comparison between measured ground based NO2 differential columns at the Universidad Nacional Autónoma de México (UNAM) in Mexico City, using the Differential Optical Absorption Spectroscopy (DOAS) technique and NO2 total columns measured by the Ozone Monitoring Instrument (OMI) onboard the Aura satellite using the same measurement technique. From these data, distribution maps of average NO2 above the Mexico basin were constructed and hot spots inside the city could be identified. In addition, a clear footprint was detected from the Tula industrial area, ~50 km northwest of Mexico City, where a refinery, a power plant and other industries are located. A less defined footprint was identified in the Cuernavaca basin, South of Mexico City, and the nearby cities of Toluca and Puebla do not present strong enhancements in the NO2 total columns. With this study we expect to cross-validate space and ground measurements and provide useful information for future studies.

  1. Alcohol consumption trends in Australia: Comparing surveys and sales-based measures.

    Science.gov (United States)

    Livingston, Michael; Callinan, Sarah; Raninen, Jonas; Pennay, Amy; Dietze, Paul M

    2018-04-01

    Survey data remain a crucial means for monitoring alcohol consumption, but there has been limited work done to ensure that surveys adequately capture changes in per-capita consumption in Australia. In this study, we explore how trends in consumption from two major Australian surveys compare with an official measure of per-capita consumption between 2001 and 2014 and examine age-specific trends in drinking. Data were from five waves of the cross-sectional National Health Survey (total n = 113 279) and 12 waves of the longitudinal Household Income and Labour Dynamics in Australia Study (average n = 12 347). Overall and age-specific estimates of annual alcohol consumption were derived and compared with official per-capita consumption and previous analyses of the National Drug Strategy Household Survey. In terms of overall consumption, both surveys broadly reflected trends in per-capita consumption, especially the decline that has been observed since 2007/2008. Age-specific trends were broadly similar, with the recent decline in consumption clearly concentrated among teenagers and young adults. The main Australian monitoring surveys remain useful monitoring tools for alcohol consumption in Australia. There is consistent evidence that the recent declines in Australian per-capita consumption have been driven by sharp falls in drinking among young people, a trend that requires further study. [Livingston M, Callinan S, Raninen J, Pennay A, Dietze PM. Alcohol consumption trends in Australia: Comparing surveys and sales-based measures. Drug Alcohol Rev 2017;00:000-000]. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  2. Effectiveness of Variable-Gain Kalman Filter Based on Angle Error Calculated from Acceleration Signals in Lower Limb Angle Measurement with Inertial Sensors

    Science.gov (United States)

    Watanabe, Takashi

    2013-01-01

    The wearable sensor system developed by our group, which measures lower limb angles using a Kalman-filtering-based method, was suggested to be useful in the evaluation of gait function for rehabilitation support. However, it was desirable to reduce the variation of its measurement errors. In this paper, a variable-Kalman-gain method based on the angle error calculated from acceleration signals is proposed to improve measurement accuracy. The proposed method was tested against a fixed-gain Kalman filter and a variable-Kalman-gain method based on acceleration magnitude used in previous studies. First, in angle measurement during treadmill walking, the proposed method measured lower limb angles with the highest accuracy and significantly improved foot inclination angle measurement, while slightly improving shank and thigh inclination angle measurement. The variable-gain method based on acceleration magnitude was not effective for our Kalman filter system. Then, in angle measurement of a rigid body model, the proposed method showed measurement accuracy similar to or higher than results in other studies that used markers of a camera-based motion measurement system fixed on a rigid plate together with a sensor, or directly on the sensor. The proposed method was found to be effective in angle measurement with inertial sensors. PMID:24282442
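The paper's exact gain schedule is not given in the record; the sketch below shows the general idea of shrinking a Kalman-style gain as the accelerometer-derived angle error (the innovation) grows, with an invented gain law and invented sensor data:

```python
import math

def variable_gain_kalman(gyro_rates, accel_angles, dt=0.01, k0=0.1, c=4.0):
    """1-D inclination estimator with an error-dependent Kalman-style gain.

    Prediction integrates the gyro rate; the accelerometer-derived angle
    serves as the measurement. The gain shrinks as the innovation (angle
    error from the acceleration signal) grows, discounting measurements
    corrupted by motion acceleration. The schedule k = k0 / (1 + c*|err|)
    is an invented illustration, not the paper's formulation.
    """
    angle = accel_angles[0]
    estimates = []
    for rate, meas in zip(gyro_rates, accel_angles):
        angle += rate * dt                 # predict from gyro integration
        innovation = meas - angle          # accelerometer-based angle error
        gain = k0 / (1.0 + c * abs(innovation))
        angle += gain * innovation         # correct with the adaptive gain
        estimates.append(angle)
    return estimates

# Constant 10-degree tilt; biased gyro, oscillating accelerometer reading
gyro = [0.5] * 500                         # deg/s bias despite zero motion
accel = [10.0 + 0.8 * math.sin(0.3 * i) for i in range(500)]
est = variable_gain_kalman(gyro, accel)
print(round(est[-1], 2))                   # settles near the true 10 degrees
```

The adaptive gain suppresses the gyro's drift while weighting the accelerometer less whenever its reading swings far from the prediction, which is the trade-off the proposed method tunes.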

  3. A Lab Assembled Microcontroller-Based Sensor Module for Continuous Oxygen Measurement in Portable Hypoxia Chambers

    Science.gov (United States)

    Mathupala, Saroj P.; Kiousis, Sam; Szerlip, Nicholas J.

    2016-01-01

    Background Hypoxia-based cell culture experiments are routine and essential components of in vitro cancer research. Most laboratories use low-cost portable modular chambers to achieve hypoxic conditions for cell cultures, where the sealed chambers are purged with a gas mixture of preset O2 concentration. Studies are conducted under the assumption that hypoxia remains unaltered throughout the 48 to 72 hour duration of such experiments. Since these chambers lack any sensor or detection system to monitor gas-phase O2, the cell-based data tend to be non-uniform due to the ad hoc nature of the experimental setup. Methodology With the availability of low-cost open-source microcontroller-based electronic project kits, it is now possible for researchers to program these with easy-to-use software, link them to sensors, and place them in basic scientific apparatus to monitor and record experimental parameters. We report here the design and construction of a small-footprint kit for continuous measurement and recording of O2 concentration in modular hypoxia chambers. The low-cost assembly (US$135) consists of an Arduino-based microcontroller, data-logging freeware, and a factory pre-calibrated miniature O2 sensor. A small, intuitive software program was written by the authors to control the data input and output. The basic nature of the kit will enable any student in biology with minimal experience in hobby-electronics to assemble the system and edit the program parameters to suit individual experimental conditions. Results/Conclusions We show the kit’s utility and stability of data output via a series of hypoxia experiments. The studies also demonstrated the critical need to monitor and adjust gas-phase O2 concentration during hypoxia-based experiments to prevent experimental errors or failure due to partial loss of hypoxia. Thus, incorporating the sensor-microcontroller module to a portable hypoxia chamber provides a researcher a capability that was previously available
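The monitoring requirement described above, detecting when gas-phase O2 drifts away from the preset concentration, reduces to a simple out-of-band check on logged readings. A minimal sketch in Python (the paper's kit runs on an Arduino; the setpoint, tolerance, and function name here are invented for illustration):

```python
def check_o2(readings, setpoint=1.0, tolerance=0.2):
    """Return (index, value) pairs for samples where gas-phase O2
    (%O2) drifts out of the allowed band around the setpoint --
    the failure mode a hypoxia-chamber logger needs to catch.
    Setpoint and tolerance are illustrative, not from the paper."""
    alerts = []
    for i, o2 in enumerate(readings):
        if abs(o2 - setpoint) > tolerance:
            alerts.append((i, o2))
    return alerts
```

In the real kit this logic would run against the pre-calibrated sensor's stream so the researcher can re-purge the chamber before hypoxia is lost.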

  4. A Lab Assembled Microcontroller-Based Sensor Module for Continuous Oxygen Measurement in Portable Hypoxia Chambers.

    Directory of Open Access Journals (Sweden)

    Saroj P Mathupala

Full Text Available Hypoxia-based cell culture experiments are routine and essential components of in vitro cancer research. Most laboratories use low-cost portable modular chambers to achieve hypoxic conditions for cell cultures, where the sealed chambers are purged with a gas mixture of preset O2 concentration. Studies are conducted under the assumption that hypoxia remains unaltered throughout the 48 to 72 hour duration of such experiments. Since these chambers lack any sensor or detection system to monitor gas-phase O2, the cell-based data tend to be non-uniform due to the ad hoc nature of the experimental setup. With the availability of low-cost open-source microcontroller-based electronic project kits, it is now possible for researchers to program these with easy-to-use software, link them to sensors, and place them in basic scientific apparatus to monitor and record experimental parameters. We report here the design and construction of a small-footprint kit for continuous measurement and recording of O2 concentration in modular hypoxia chambers. The low-cost assembly (US$135) consists of an Arduino-based microcontroller, data-logging freeware, and a factory pre-calibrated miniature O2 sensor. A small, intuitive software program was written by the authors to control the data input and output. The basic nature of the kit will enable any student in biology with minimal experience in hobby-electronics to assemble the system and edit the program parameters to suit individual experimental conditions. We show the kit's utility and stability of data output via a series of hypoxia experiments. The studies also demonstrated the critical need to monitor and adjust gas-phase O2 concentration during hypoxia-based experiments to prevent experimental errors or failure due to partial loss of hypoxia. Thus, incorporating the sensor-microcontroller module to a portable hypoxia chamber provides a researcher a capability that was previously available only to labs with access to

  5. A Lab Assembled Microcontroller-Based Sensor Module for Continuous Oxygen Measurement in Portable Hypoxia Chambers.

    Science.gov (United States)

    Mathupala, Saroj P; Kiousis, Sam; Szerlip, Nicholas J

    2016-01-01

    Hypoxia-based cell culture experiments are routine and essential components of in vitro cancer research. Most laboratories use low-cost portable modular chambers to achieve hypoxic conditions for cell cultures, where the sealed chambers are purged with a gas mixture of preset O2 concentration. Studies are conducted under the assumption that hypoxia remains unaltered throughout the 48 to 72 hour duration of such experiments. Since these chambers lack any sensor or detection system to monitor gas-phase O2, the cell-based data tend to be non-uniform due to the ad hoc nature of the experimental setup. With the availability of low-cost open-source microcontroller-based electronic project kits, it is now possible for researchers to program these with easy-to-use software, link them to sensors, and place them in basic scientific apparatus to monitor and record experimental parameters. We report here the design and construction of a small-footprint kit for continuous measurement and recording of O2 concentration in modular hypoxia chambers. The low-cost assembly (US$135) consists of an Arduino-based microcontroller, data-logging freeware, and a factory pre-calibrated miniature O2 sensor. A small, intuitive software program was written by the authors to control the data input and output. The basic nature of the kit will enable any student in biology with minimal experience in hobby-electronics to assemble the system and edit the program parameters to suit individual experimental conditions. We show the kit's utility and stability of data output via a series of hypoxia experiments. The studies also demonstrated the critical need to monitor and adjust gas-phase O2 concentration during hypoxia-based experiments to prevent experimental errors or failure due to partial loss of hypoxia. 
Thus, incorporating the sensor-microcontroller module to a portable hypoxia chamber provides a researcher a capability that was previously available only to labs with access to sophisticated (and

  6. The Relationship between Video Game Use and a Performance-Based Measure of Persistence

    Science.gov (United States)

    Ventura, Matthew; Shute, Valerie; Zhao, Weinan

    2013-01-01

    An online performance-based measure of persistence was developed using anagrams and riddles. Persistence was measured by recording the time spent on unsolved anagrams and riddles. Time spent on unsolved problems was correlated to a self-report measure of persistence. Additionally, frequent video game players spent longer times on unsolved problems…
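Relating time-on-task to a self-report score, as described in this record, amounts to computing a correlation coefficient across participants. A dependency-free Pearson-r sketch (the data in the test are invented):

```python
def pearson_r(x, y):
    """Pearson correlation between two equal-length samples,
    e.g. seconds spent on unsolved problems vs. self-reported
    persistence scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```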

  7. The Beast of Aggregating Cognitive Load Measures in Technology-Based Learning

    Science.gov (United States)

    Leppink, Jimmie; van Merriënboer, Jeroen J. G.

    2015-01-01

    An increasing part of cognitive load research in technology-based learning includes a component of repeated measurements, that is: participants are measured two or more times on the same performance, mental effort or other variable of interest. In many cases, researchers aggregate scores obtained from repeated measurements to one single sum or…

  8. Comparison of a New Cobinamide-Based Method to a Standard Laboratory Method for Measuring Cyanide in Human Blood

    Science.gov (United States)

    Swezey, Robert; Shinn, Walter; Green, Carol; Drover, David R.; Hammer, Gregory B.; Schulman, Scott R.; Zajicek, Anne; Jett, David A.; Boss, Gerry R.

    2013-01-01

    Most hospital laboratories do not measure blood cyanide concentrations, and samples must be sent to reference laboratories. A simple method is needed for measuring cyanide in hospitals. The authors previously developed a method to quantify cyanide based on the high binding affinity of the vitamin B12 analog, cobinamide, for cyanide and a major spectral change observed for cyanide-bound cobinamide. This method is now validated in human blood, and the findings include a mean inter-assay accuracy of 99.1%, precision of 8.75% and a lower limit of quantification of 3.27 µM cyanide. The method was applied to blood samples from children treated with sodium nitroprusside and it yielded measurable results in 88 of 172 samples (51%), whereas the reference laboratory yielded results in only 19 samples (11%). In all 19 samples, the cobinamide-based method also yielded measurable results. The two methods showed reasonable agreement when analyzed by linear regression, but not when analyzed by a standard error of the estimate or paired t-test. Differences in results between the two methods may be because samples were assayed at different times on different sample types. The cobinamide-based method is applicable to human blood, and can be used in hospital laboratories and emergency rooms. PMID:23653045
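The spectral-change assay described above implies a linear calibration step: fit the spectral response against known cyanide standards, then invert the fit for unknown samples. A sketch under the assumption of a linear (Beer-Lambert-like) response; the calibration numbers are made up, not taken from the paper:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = m*x + b, no external deps."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return m, my - m * mx

# Hypothetical calibration: absorbance change vs. known cyanide (uM)
conc = [0.0, 5.0, 10.0, 20.0, 40.0]
absb = [0.00, 0.11, 0.22, 0.44, 0.88]   # invented, perfectly linear

m, b = fit_line(conc, absb)

def cyanide_from_absorbance(a):
    """Invert the calibration for an unknown sample."""
    return (a - b) / m
```

Real assays would also respect the method's lower limit of quantification (3.27 uM here) rather than report arbitrarily small values.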

  9. Optimizing a micro-computed tomography-based surrogate measurement of bone-implant contact.

    Science.gov (United States)

    Meagher, Matthew J; Parwani, Rachna N; Virdi, Amarjit S; Sumner, Dale R

    2018-03-01

    Histology and backscatter scanning electron microscopy (bSEM) are the current gold standard methods for quantifying bone-implant contact (BIC), but are inherently destructive. Microcomputed tomography (μCT) is a non-destructive alternative, but attempts to validate μCT-based assessment of BIC in animal models have produced conflicting results. We previously showed in a rat model using a 1.5 mm diameter titanium implant that the extent of the metal-induced artefact precluded accurate measurement of bone sufficiently close to the interface to assess BIC. Recently introduced commercial laboratory μCT scanners have smaller voxels and improved imaging capabilities, possibly overcoming this limitation. The goals of the present study were to establish an approach for optimizing μCT imaging parameters and to validate μCT-based assessment of BIC. In an empirical parametric study using a 1.5 mm diameter titanium implant, we determined 90 kVp, 88 µA, 1.5 μm isotropic voxel size, 1600 projections/180°, and 750 ms integration time to be optimal. Using specimens from an in vivo rat experiment, we found significant correlations between bSEM and μCT for BIC with the manufacturer's automated analysis routine (r = 0.716, p = 0.003) or a line-intercept method (r = 0.797, p = 0.010). Thus, this newer generation scanner's improved imaging capability reduced the extent of the metal-induced artefact zone enough to permit assessment of BIC. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:979-986, 2018. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  10. Response to health insurance by previously uninsured rural children.

    Science.gov (United States)

    Tilford, J M; Robbins, J M; Shema, S J; Farmer, F L

    1999-08-01

To examine the healthcare utilization and costs of previously uninsured rural children. Four years of claims data from a school-based health insurance program located in the Mississippi Delta. All children who were not Medicaid-eligible or were uninsured were eligible for limited benefits under the program. The 1987 National Medical Expenditure Survey (NMES) was used to compare utilization of services. The study represents a natural experiment in the provision of insurance benefits to a previously uninsured population. Premiums for the claims cost were set with little or no information on expected use of services. Claims from the insurer were used to form a panel data set. Mixed-model logistic and linear regressions were estimated to determine the response to insurance for several categories of health services. The use of services increased over time and approached the level of utilization in the NMES. Conditional medical expenditures also increased over time. Actuarial estimates of claims cost greatly exceeded actual claims cost. The provision of a limited medical, dental, and optical benefit package cost approximately $20-$24 per member per month in claims paid. An important uncertainty in providing health insurance to previously uninsured populations is whether a pent-up demand exists for health services. Evidence of a pent-up demand for medical services was not found in this study of rural school-age children. States considering partnerships with private insurers to implement the State Children's Health Insurance Program could lower premium costs by assembling basic data on previously uninsured children.

  11. Coupon Test of an Elbow Component by Using Vision-based Measurement System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Wan; Jeon, Bub Gyu; Choi, Hyoung Suk; Kim, Nam Sik [Pusan National University, Busan (Korea, Republic of)

    2016-05-15

Among the various methods to overcome this shortcoming, vision-based methods to measure the strain of a structure have been proposed, and many studies are being conducted on them. The vision-based measurement method is a noncontact method for measuring the displacement and strain of objects by comparing images before and after deformation. This method offers such advantages as no limitations on the surface condition, temperature, or shape of objects, the possibility of full-field measurement, and the possibility of measuring the distribution of stress or defects of structures based on maps of the measured displacement and strain. The strains were measured with various image-based methods in a coupon test and the measurements were compared. In the future, the validity of the algorithm will be checked against strain gauge and clip gauge measurements, and based on the results, the physical properties of materials will be measured using a vision-based measurement system. This will contribute to the evaluation of the reliability and effectiveness required for investigating local damage.

  12. Coupon Test of an Elbow Component by Using Vision-based Measurement System

    International Nuclear Information System (INIS)

    Kim, Sung Wan; Jeon, Bub Gyu; Choi, Hyoung Suk; Kim, Nam Sik

    2016-01-01

Among the various methods to overcome this shortcoming, vision-based methods to measure the strain of a structure have been proposed, and many studies are being conducted on them. The vision-based measurement method is a noncontact method for measuring the displacement and strain of objects by comparing images before and after deformation. This method offers such advantages as no limitations on the surface condition, temperature, or shape of objects, the possibility of full-field measurement, and the possibility of measuring the distribution of stress or defects of structures based on maps of the measured displacement and strain. The strains were measured with various image-based methods in a coupon test and the measurements were compared. In the future, the validity of the algorithm will be checked against strain gauge and clip gauge measurements, and based on the results, the physical properties of materials will be measured using a vision-based measurement system. This will contribute to the evaluation of the reliability and effectiveness required for investigating local damage.

  13. Measuring sustainability by Energy Efficiency Analysis for Korean Power Companies: A Sequential Slacks-Based Efficiency Measure

    Directory of Open Access Journals (Sweden)

    Ning Zhang

    2014-03-01

Full Text Available Improving energy efficiency has been widely regarded as one of the most cost-effective ways to improve sustainability and mitigate climate change. This paper presents a sequential slacks-based efficiency measure (SSBM) application to model total-factor energy efficiency with undesirable outputs. This approach simultaneously takes into account the sequential environmental technology, total input slacks, and undesirable outputs for energy efficiency analysis. We conduct an empirical analysis of energy efficiency incorporating greenhouse gas emissions of Korean power companies during 2007–2011. The results indicate that most of the power companies are not performing at high energy efficiency. Sequential technology has a significant effect on the energy efficiency measurements. Some policy suggestions based on the empirical results are also presented.

  14. Advanced measurement systems based on digital processing techniques for superconducting LHC magnets

    CERN Document Server

    Masi, Alessandro; Cennamo, Felice

    The Large Hadron Collider (LHC), a particle accelerator aimed at exploring deeper into matter than ever before, is currently being constructed at CERN. Beam optics of the LHC, requires stringent control of the field quality of about 8400 superconducting magnets, including 1232 main dipoles and 360 main quadrupoles to assure the correct machine operation. The measurement challenges are various: accuracy on the field strength measurement up to 50 ppm, harmonics in the ppm range, measurement equipment robustness, low measurement times to characterize fast field phenomena. New magnetic measurement systems, principally based on analog solutions, have been developed at CERN to achieve these goals. This work proposes the introduction of digital technologies to improve measurement performance of three systems, aimed at different measurement target and characterized by different accuracy levels. The high accuracy measurement systems, based on rotating coils, exhibit high performance in static magnetic field. With vary...

  15. Study on Method of Ultrasonic Gas Temperature Measure Based on FPGA

    Energy Technology Data Exchange (ETDEWEB)

    Wen, S H; Xu, F R [Institute of Electrical Engineering, Yanshan University, Qinhuangdao, 066004 (China)

    2006-10-15

It has always been a problem to measure the instantaneous temperature of high-temperature, high-pressure gas. Conventional temperature-measurement methods struggle to measure quickly and exactly: their precision is low and their anti-jamming ability is poor. This article therefore introduces a method of measuring burning gas temperature with ultrasound, based on a Field-Programmable Gate Array (FPGA). The mathematical model for measuring temperature is built from the relation between the propagation velocity of ultrasound and the Kelvin temperature of an ideal gas. The temperature can be derived by measuring the ultrasonic frequency difference Δf. An FPGA is introduced and a high-precision data acquisition system based on digital phase-shift technology is designed. The feasibility of the proposed method is further confirmed by timely measurement of the burning gas pressure. Experimental results demonstrate that the error is less than 12.. and the precision is improved to 0.8%.
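The underlying physics, sound speed in an ideal gas depending only on its Kelvin temperature, can be written down directly: c = sqrt(γRT/M), so T = c²M/(γR). A sketch assuming dry air and a single straight propagation path of known length (the paper instead works from a frequency difference Δf, which this simplification omits):

```python
import math

GAMMA = 1.4        # heat-capacity ratio, assumed diatomic gas
R = 8.314          # universal gas constant, J/(mol*K)
M_AIR = 0.02897    # molar mass of dry air, kg/mol (assumption)

def speed_of_sound(T):
    """Ideal-gas sound speed c = sqrt(gamma * R * T / M)."""
    return math.sqrt(GAMMA * R * T / M_AIR)

def temperature_from_path(length, transit_time):
    """Invert the relation: estimate c from a known path length (m)
    and a measured ultrasonic transit time (s), then
    T = c^2 * M / (gamma * R)."""
    c = length / transit_time
    return c * c * M_AIR / (GAMMA * R)
```

For combustion gases the effective γ and M differ from dry air, which is one reason a practical system needs calibration rather than these textbook constants.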

  16. Quantitative Sasang Constitution Diagnosis Method for Distinguishing between Tae-eumin and Soeumin Types Based on Elasticity Measurements of the Skin of the Human Hand

    OpenAIRE

    Song, Han Wook; Lee, SungJun; Park, Yon Kyu; Woo, Sam Yong

    2009-01-01

    The usefulness of constitutional diagnoses based on skin measurements has been established in oriental medicine. However, it is very difficult to standardize traditional diagnosis methods. According to Sasang constitutional medicine, humans can be distinguished based on properties of the skin, including its texture, roughness, hardness and elasticity. The elasticity of the skin was previously used to distinguish between people with Tae-eumin (TE) and Soeumin (SE) constitutions. The present st...

  17. A nuclear radiation multi-parameter measurement system based on pulse-shape sampling

    International Nuclear Information System (INIS)

    Qiu Xiaolin; Fang Guoming; Xu Peng; Di Yuming

    2007-01-01

In this paper, a nuclear radiation multi-parameter measurement system based on pulse-shape sampling is introduced, including the system's characteristics, composition, operating principle, experimental data, and analysis. Compared with conventional nuclear measuring apparatus, it has some remarkable advantages, such as synchronous detection using multi-parameter measurement on the same measurement platform and general analysis of signal data by user-defined programs. (authors)

  18. Medida da espessura do segmento uterino inferior em gestantes com cesárea prévia: análise da reprodutibilidade intra- e interobservador por ultra-sonografia bi- e tridimensional Lower uterine segment thickness measurement in pregnant women with previous caesarean section: intra- and interobserver reliability analysis using bi- and tridimensional ultrasonography

    Directory of Open Access Journals (Sweden)

    Daniela de Abreu Barra

    2008-03-01

PURPOSE: to compare the intra- and interobserver reproducibility of the total thickness measurement of the inferior uterine segment (IUS) through the abdominal route, and of the muscle layer measurement through the vaginal route, using bi- and tridimensional ultrasonography. METHODS: the IUS thickness measurements of 30 women, between the 36th and 39th weeks of gestation with previous caesarean section, performed by two observers, were studied. Abdominal ultrasonography was performed with the patient in both the supine and the lithotomy position. In the sagittal section, the IUS was identified and four bidimensional images and two tridimensional blocks of the total thickness were collected through the abdominal route, and the same for the muscle layer through the vaginal route. Tridimensional acquisitions were manipulated in the multiplanar mode. The time was measured with a chronometer. Reproducibility was evaluated by computing the absolute difference between measurements, the proportion of differences smaller than 1 mm, the intraclass correlation coefficient (ICC), and the Bland and Altman concordance limits. RESULTS: the average bidimensional measurement of IUS thickness was 7.4 mm through the abdominal and 2.7 mm through the vaginal route; the tridimensional measurement was 6.9 mm through the abdominal and 5.1 mm through the vaginal route. Intra- and interobserver reproducibility of the vaginal versus the abdominal route: smaller absolute difference (0.2-0.4 mm versus 0.8-1.5 mm), greater proportion of differences smaller than 1 mm (85.8-97.8% versus 48.7-72.8%, with p<0.05), and similar lower concordance limits (-3.8 to 3.4 versus -3.6 to 4 mm for tridimensional ultrasonography) and ICC (0.6-0.9 versus 0.7-0.9). CONCLUSIONS: the measurement of the IUS muscle layer through the vaginal route using tridimensional ultrasonography is more reproducible.
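The Bland and Altman concordance limits used in this record are straightforward to compute from paired observer measurements. A minimal sketch (the sample data in the test are invented):

```python
import statistics

def bland_altman_limits(a, b):
    """Bland-Altman 95% limits of agreement between two raters'
    paired measurements of the same quantity. Returns
    (lower limit, bias, upper limit)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample SD of the differences
    return bias - 1.96 * sd, bias, bias + 1.96 * sd
```

The limits bound the disagreement expected for 95% of paired readings, which is why a narrower band (as found for the vaginal route here) indicates better reproducibility.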

  19. The robustness and accuracy of in vivo linear wear measurements for knee prostheses based on model-based RSA.

    Science.gov (United States)

    van Ijsseldijk, E A; Valstar, E R; Stoel, B C; Nelissen, R G H H; Reiber, J H C; Kaptein, B L

    2011-10-13

Accurate in vivo measurement methods for wear in total knee arthroplasty are required for timely detection of excessive wear and to assess new implant designs. Component separation measurements based on model-based Roentgen stereophotogrammetric analysis (RSA), in which 3-dimensional reconstruction methods are used, have shown promising results, yet the robustness of these measurements is unknown. In this study, the accuracy and robustness of this measurement for clinical usage was assessed. The validation experiments were conducted in an RSA setup with a phantom of a knee in a vertical orientation. 72 RSA images were created using different variables for knee orientation, two prosthesis types (fixed-bearing Duracon knee and fixed-bearing Triathlon knee) and accuracies of the reconstruction models. The measurement error was determined for absolute and relative measurements, and the effects of knee positioning and true separation distance were determined. The measurement method overestimated the separation distance by 0.1 mm on average. The precision of the method was 0.10 mm (2*SD) for the Duracon prosthesis and 0.20 mm for the Triathlon prosthesis. A slight difference in error was found between the measurements with 0° and 10° anterior tilt (difference = 0.08 mm, p = 0.04). An accuracy of 0.1 mm and a precision of 0.2 mm can be achieved for linear wear measurements based on model-based RSA, which is more than adequate for clinical applications. The measurement is robust in clinical settings. Although anterior tilt seems to influence the measurement, the size of this influence is small and clinically irrelevant. Copyright © 2011 Elsevier Ltd. All rights reserved.
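The accuracy and precision figures quoted in this record (mean signed error, and 2*SD of the errors) can be reproduced from repeated phantom measurements against a known ground truth. A sketch with invented numbers:

```python
import statistics

def accuracy_and_precision(measured, truth):
    """Accuracy = mean signed error; precision = 2 * sample SD of the
    errors, mirroring the convention used for RSA wear measurements."""
    errors = [m - t for m, t in zip(measured, truth)]
    return statistics.mean(errors), 2 * statistics.stdev(errors)
```

A positive accuracy value corresponds to the systematic overestimation reported above; the precision term captures the scatter between repeated acquisitions.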

  20. Note: A resonating reflector-based optical system for motion measurement in micro-cantilever arrays

    International Nuclear Information System (INIS)

    Sathishkumar, P.; Punyabrahma, P.; Sri Muthu Mrinalini, R.; Jayanth, G. R.

    2015-01-01

    A robust, compact optical measurement unit for motion measurement in micro-cantilever arrays enables development of portable micro-cantilever sensors. This paper reports on an optical beam deflection-based system to measure the deflection of micro-cantilevers in an array that employs a single laser source, a single detector, and a resonating reflector to scan the measurement laser across the array. A strategy is also proposed to extract the deflection of individual cantilevers from the acquired data. The proposed system and measurement strategy are experimentally evaluated and demonstrated to measure motion of multiple cantilevers in an array