WorldWideScience

Sample records for spatial precision hypothesis

  1. Synchronization and phonological skills: precise auditory timing hypothesis (PATH)

    Adam Tierney

    2014-11-01

    Full Text Available Phonological skills are enhanced by music training, but the mechanisms enabling this cross-domain enhancement remain unknown. To explain this cross-domain transfer, we propose a precise auditory timing hypothesis (PATH) whereby entrainment practice is the core mechanism underlying enhanced phonological abilities in musicians. Both rhythmic synchronization and language skills such as consonant discrimination, detection of word and phrase boundaries, and conversational turn-taking rely on the perception of extremely fine-grained timing details in sound. Auditory-motor timing is an acoustic feature which meets all five of the pre-conditions necessary for cross-domain enhancement to occur (Patel 2011, 2012, 2014). There is overlap between the neural networks that process timing in the context of both music and language. Entrainment to music demands more precise timing sensitivity than does language processing. Moreover, auditory-motor timing integration captures the emotion of the trainee, is repeatedly practiced, and demands focused attention. The precise auditory timing hypothesis predicts that musical training emphasizing entrainment will be particularly effective in enhancing phonological skills.

  2. Biodiversity, productivity, and the spatial insurance hypothesis revisited

    Shanafelt, David W.; Dieckmann, Ulf; Jonas, Matthias; Franklin, Oskar; Loreau, Michel; Perrings, Charles

    2015-01-01

    Accelerating rates of biodiversity loss have led ecologists to explore the effects of species richness on ecosystem functioning and the flow of ecosystem services. One explanation of the relationship between biodiversity and ecosystem functioning lies in the spatial insurance hypothesis, which centers on the idea that productivity and stability increase with biodiversity in a temporally varying, spatially heterogeneous environment. However, there has been little work on the impact of dispersal where environmental risks are more or less spatially correlated, or where dispersal rates are variable. In this paper, we extend the original Loreau model to consider stochastic temporal variation in resource availability, which we refer to as “environmental risk,” and heterogeneity in species dispersal rates. We find that asynchronies across communities and species provide community-level stabilizing effects on productivity, despite varying levels of species richness. Although intermediate dispersal rates play a role in mitigating risk, they are less effective in insuring productivity against global (metacommunity-level) than local (individual community-level) risks. These results are particularly interesting given the emergence of global sources of risk such as climate change or the closer integration of world markets. Our results offer deeper insights into the Loreau model and new perspectives on the effectiveness of spatial insurance in the face of environmental risks. PMID:26100182

  3. The Threshold Hypothesis Applied to Spatial Skill and Mathematics

    Freer, Daniel

    2017-01-01

    This cross-sectional study assessed the relation between spatial skills and mathematics in 854 participants across kindergarten, third grade, and sixth grade. Specifically, the study probed for a threshold for spatial skills when performing mathematics, above which spatial scores and mathematics scores would be significantly less related. This…

  4. The direct perception hypothesis: perceiving the intention of another’s action hinders its precise imitation

    Froese, Tom; Leavens, David A.

    2014-01-01

    We argue that imitation is a learning response to unintelligible actions, especially to social conventions. Various strands of evidence are converging on this conclusion, but further progress has been hampered by an outdated theory of perceptual experience. Comparative psychology continues to be premised on the doctrine that humans and non-human primates only perceive others’ physical “surface behavior,” while mental states are perceptually inaccessible. However, a growing consensus in social cognition research accepts the direct perception hypothesis: primarily we see what others aim to do; we do not infer it from their motions. Indeed, physical details are overlooked – unless the action is unintelligible. On this basis we hypothesize that apes’ propensity to copy the goal of an action, rather than its precise means, is largely dependent on its perceived intelligibility. Conversely, children copy means more often than adults and apes because, uniquely, much adult human behavior is completely unintelligible to unenculturated observers due to the pervasiveness of arbitrary social conventions, as exemplified by customs, rituals, and languages. We expect the propensity to imitate to be inversely correlated with the familiarity of cultural practices, as indexed by age and/or socio-cultural competence. The direct perception hypothesis thereby helps to parsimoniously explain the most important findings of imitation research, including children’s over-imitation and other species-typical and age-related variations. PMID:24600413

  5. Precise Mapping Of A Spatially Distributed Radioactive Source

    Beck, A.; Caras, I.; Piestum, S.; Sheli, E.; Melamud, Y.; Berant, S.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    Spatial distribution measurement of radioactive sources is a routine task in the nuclear industry. The precision of each measurement depends upon the specific application. However, the technological edge of this precision is motivated by the production of standards for calibration. Within this definition, the most demanding field is the calibration of standards for medical equipment. In this paper, a semi-empirical method for controlling the measurement precision is demonstrated, using a relatively simple laboratory apparatus. The spatial distribution of the source radioactivity is measured as part of the quality assurance tests, during the production of flood sources. These sources are further used in calibration of medical gamma cameras. A typical flood source is a 40 × 60 cm² plate with an activity of 10 mCi (or more) of the ⁵⁷Co isotope. The measurement set-up is based on a single NaI(Tl) scintillator with a photomultiplier tube, moving on an X-Y table which scans the flood source. In this application the source is required to have a uniform activity distribution over its surface.

  6. Human short-term spatial memory: precision predicts capacity.

    Banta Lavenex, Pamela; Boujon, Valérie; Ndarugendamwo, Angélique; Lavenex, Pierre

    2015-03-01

    Here, we aimed to determine the capacity of human short-term memory for allocentric spatial information in a real-world setting. Young adults were tested on their ability to learn, on a trial-unique basis, and remember over a 1-min interval the location(s) of 1, 3, 5, or 7 illuminating pads, among 23 pads distributed in a 4 m × 4 m arena surrounded by curtains on three sides. Participants had to walk to and touch the pads with their foot to illuminate the goal locations. In contrast to the predictions from classical slot models of working memory capacity limited to a fixed number of items, i.e., Miller's magical number 7 or Cowan's magical number 4, we found that the number of visited locations to find the goals was consistently about 1.6 times the number of goals, whereas the number of correct choices before erring and the number of errorless trials varied with memory load even when memory load was below the hypothetical memory capacity. In contrast to resource models of visual working memory, we found no evidence that memory resources were evenly distributed among unlimited numbers of items to be remembered. Instead, we found that memory for even one individual location was imprecise, and that memory performance for one location could be used to predict memory performance for multiple locations. Our findings are consistent with a theoretical model suggesting that the precision of the memory for individual locations might determine the capacity of human short-term memory for spatial information. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Multiple Vehicle Cooperative Localization with Spatial Registration Based on a Probability Hypothesis Density Filter

    Feihu Zhang

    2014-01-01

    Full Text Available This paper studies the problem of multiple vehicle cooperative localization with spatial registration in the formulation of the probability hypothesis density (PHD) filter. Assuming vehicles are equipped with proprioceptive and exteroceptive sensors (with biases) to cooperatively localize positions, a simultaneous solution for joint spatial registration and state estimation is proposed. For this, we rely on the sequential Monte Carlo implementation of the PHD filter. Compared to other methods, the concept of multiple vehicle cooperative localization with spatial registration is first proposed under Random Finite Set Theory. In addition, the proposed solution also addresses the challenges for multiple vehicle cooperative localization, e.g., the communication bandwidth issue and data association uncertainty. The simulation result demonstrates its reliability and feasibility in large-scale environments.
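
    The sequential Monte Carlo (particle) implementation mentioned above follows the standard SMC-PHD recursion; the sketch below shows only the generic measurement-update step for position-only particles with Gaussian likelihoods, and omits the sensor-bias (spatial registration) terms that are the paper's contribution. Names and parameter values are illustrative assumptions.

    import numpy as np

    def phd_update(particles, weights, measurements, p_detect=0.95,
                   meas_std=1.0, clutter_density=1e-4):
        """particles: (N,2) positions, weights: (N,), measurements: (M,2)."""
        # Gaussian likelihood g(z|x) of every measurement for every particle
        diff = measurements[:, None, :] - particles[None, :, :]          # (M, N, 2)
        g = np.exp(-0.5 * np.sum(diff ** 2, axis=-1) / meas_std ** 2) \
            / (2.0 * np.pi * meas_std ** 2)                              # (M, N)

        # Per-measurement normalizer C(z) = sum_j p_D * g(z|x_j) * w_j
        C = (p_detect * g * weights[None, :]).sum(axis=1)                # (M,)

        # PHD update: missed-detection term plus one term per measurement
        new_w = (1.0 - p_detect) * weights
        new_w = new_w + (p_detect * g * weights[None, :]
                         / (clutter_density + C)[:, None]).sum(axis=0)
        return new_w   # the sum of the updated weights estimates the expected number of vehicles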

  8. Variability in the Precision of Children’s Spatial Working Memory

    Elena M. Galeano Weber

    2018-02-01

    Full Text Available Cognitive modeling studies in adults have established that visual working memory (WM) capacity depends on the representational precision, as well as its variability from moment to moment. By contrast, visuospatial WM performance in children has been typically indexed by response accuracy—a binary measure that provides less information about the precision with which items are stored. Here, we aimed at identifying whether and how children's WM performance depends on spatial precision and its variability over time in real-world contexts. Using smartphones, 110 Grade 3 and Grade 4 students performed a spatial WM updating task three times a day in school and at home for four weeks. Measures of spatial precision (i.e., Euclidean distance between presented and reported location) were used for hierarchical modeling to estimate variability of spatial precision across different time scales. Results demonstrated considerable within-person variability in spatial precision across items within trials, from trial to trial, from occasion to occasion within days, and from day to day. In particular, item-to-item variability increased systematically with memory load and decreased with higher grade. Further, children with higher precision variability across items scored lower in measures of fluid intelligence. These findings emphasize the important role of transient changes in spatial precision for the development of WM.
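
    The precision measure described above (the Euclidean distance between the presented and the reported location) and a simple stand-in for its item-to-item variability can be computed as sketched below; the column names are hypothetical, not taken from the study.

    import numpy as np
    import pandas as pd

    def add_spatial_error(df):
        """df columns (hypothetical): x_true, y_true, x_resp, y_resp, subject, day, occasion."""
        out = df.copy()
        out["error"] = np.hypot(out["x_resp"] - out["x_true"],
                                out["y_resp"] - out["y_true"])   # lower = more precise
        return out

    def item_to_item_variability(df):
        # Trial-to-trial SD of the spatial error within each subject-by-occasion cell,
        # a simple stand-in for the lowest-level variance component of the hierarchical model.
        return (add_spatial_error(df)
                .groupby(["subject", "day", "occasion"])["error"]
                .std()
                .rename("sd_error_within_occasion"))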

  9. Testing the Environmental Kuznets Curve Hypothesis for Biodiversity Risk in the US: A Spatial Econometric Approach

    Robert P. Berrens

    2011-11-01

    Full Text Available This study investigates whether the environmental Kuznets curve (EKC) relationship is supported for a measure of biodiversity risk and economic development across the United States (US). Using state-level data for all 48 contiguous states, biodiversity risk is measured using a Modified Index (MODEX). This index is an adaptation of a comprehensive National Biodiversity Risk Assessment Index. The MODEX differs from other measures in that it takes into account the impact of human activities and conservation measures. The econometric approach includes corrections for spatial autocorrelation effects, which are present in the data. Model estimation results do not support the EKC hypothesis for biodiversity risk in the US. This finding is robust over ordinary least squares, spatial error, and spatial lag models, where the latter is shown to be the preferred model. Results from the spatial lag regression show that a 1% increase in human population density is associated with about a 0.19% increase in biodiversity risk. Spatial dependence in this case study explains 30% of the variation, as risk in one state spills over into adjoining states. From a policy perspective, this latter result supports the need for coordinated efforts at state and federal levels to address the problem of biodiversity loss.
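
    For reference, the spatial lag and spatial error specifications compared above, written in standard textbook notation (W is the row-standardized spatial weights matrix); this is the generic formulation, not reproduced from the paper itself:

    y = \rho W y + X\beta + \varepsilon                    \quad\text{(spatial lag)}
    y = X\beta + u, \qquad u = \lambda W u + \varepsilon   \quad\text{(spatial error)}

    With the variables in logarithms, the coefficient on population density in the lag model reads as an elasticity, which is how the reported 0.19% figure is naturally interpreted; whether the study used a log specification is not stated in the abstract.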

  10. Convergence Hypothesis: Evidence from Panel Unit Root Test with Spatial Dependence

    Lezheng Liu

    2006-10-01

    Full Text Available In this paper we test the convergence hypothesis by using a revised 4-step procedure of panel unit root testing suggested by Evans and Karras (1996). We use output data for 24 OECD countries spanning 40 years. Whether the convergence, if any, is conditional or absolute is also examined. Following a proposition by Baltagi, Bresson, and Pirotte (2005), we incorporate a spatial autoregressive error into a fixed-effect panel model to account not only for the heterogeneous panel structure, but also for spatial dependence, which might reduce the statistical power of conventional panel unit root tests. Our empirical results indicate that output is converging among OECD countries. However, convergence is characterized as conditional. The results also report a relatively lower convergence speed compared to conventional panel studies.
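
    As a hedged sketch of the framework referred to above: the Evans and Karras (1996) procedure is commonly written as a unit-root regression on deviations from the cross-country mean, and the spatial extension lets the disturbances follow a spatial autoregressive process (the lag orders and the exact specification estimated in the paper may differ from this generic form):

    \Delta(y_{it} - \bar{y}_t) = \delta_i + \rho_i\,(y_{i,t-1} - \bar{y}_{t-1}) + \sum_{k=1}^{p_i} \phi_{ik}\,\Delta(y_{i,t-k} - \bar{y}_{t-k}) + u_{it}, \qquad u_t = \lambda W u_t + \varepsilon_t

    Convergence requires \rho_i < 0 for every country; absolute convergence additionally requires \delta_i = 0, so significantly nonzero intercepts indicate conditional convergence, as reported.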

  11. High spatial precision nano-imaging of polarization-sensitive plasmonic particles

    Liu, Yunbo; Wang, Yipei; Lee, Somin Eunice

    2018-02-01

    Precise polarimetric imaging of polarization-sensitive nanoparticles is essential for resolving their accurate spatial positions beyond the diffraction limit. However, conventional technologies currently suffer from beam deviation errors which cannot be corrected beyond the diffraction limit. To overcome this issue, we experimentally demonstrate a spatially stable nano-imaging system for polarization-sensitive nanoparticles. In this study, we show that by integrating a voltage-tunable imaging variable polarizer with optical microscopy, we are able to suppress beam deviation errors. We expect that this nano-imaging system should allow for acquisition of accurate positional and polarization information from individual nanoparticles in applications where real-time, high precision spatial information is required.

  12. The role of spatial memory and frames of reference in the precision of angular path integration.

    Arthur, Joeanna C; Philbeck, John W; Kleene, Nicholas J; Chichka, David

    2012-09-01

    Angular path integration refers to the ability to maintain an estimate of self-location after a rotational displacement by integrating internally-generated (idiothetic) self-motion signals over time. Previous work has found that non-sensory inputs, namely spatial memory, can play a powerful role in angular path integration (Arthur et al., 2007, 2009). Here we investigated the conditions under which spatial memory facilitates angular path integration. We hypothesized that the benefit of spatial memory is particularly likely in spatial updating tasks in which one's self-location estimate is referenced to external space. To test this idea, we administered passive, non-visual body rotations (ranging 40°-140°) about the yaw axis and asked participants to use verbal reports or open-loop manual pointing to indicate the magnitude of the rotation. Prior to some trials, previews of the surrounding environment were given. We found that when participants adopted an egocentric frame of reference, the previously-observed benefit of previews on within-subject response precision was not manifested, regardless of whether remembered spatial frameworks were derived from vision or spatial language. We conclude that the powerful effect of spatial memory is dependent on one's frame of reference during self-motion updating. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Spatial working memory for locations specified by vision and audition: testing the amodality hypothesis.

    Loomis, Jack M; Klatzky, Roberta L; McHugh, Brendan; Giudice, Nicholas A

    2012-08-01

    Spatial working memory can maintain representations from vision, hearing, and touch, representations referred to here as spatial images. The present experiment addressed whether spatial images from vision and hearing that are simultaneously present within working memory retain modality-specific tags or are amodal. Observers were presented with short sequences of targets varying in angular direction, with the targets in a given sequence being all auditory, all visual, or a sequential mixture of the two. On two thirds of the trials, one of the locations was repeated, and observers had to respond as quickly as possible when detecting this repetition. Ancillary detection and localization tasks confirmed that the visual and auditory targets were perceptually comparable. Response latencies in the working memory task showed small but reliable costs in performance on trials involving a sequential mixture of auditory and visual targets, as compared with trials of pure vision or pure audition. These deficits were statistically reliable only for trials on which the modalities of the matching location switched from the penultimate to the final target in the sequence, indicating a switching cost. The switching cost for the pair in immediate succession means that the spatial images representing the target locations retain features of the visual or auditory representations from which they were derived. However, there was no reliable evidence of a performance cost for mixed modalities in the matching pair when the second of the two did not immediately follow the first, suggesting that more enduring spatial images in working memory may be amodal.

  14. Spatial distribution of soil moisture in precision farming using integrated soil scanning and field telemetry data

    Kalopesas, Charalampos; Galanis, George; Kalopesa, Eleni; Katsogiannos, Fotis; Kalafatis, Panagiotis; Bilas, George; Patakas, Aggelos; Zalidis, George

    2015-04-01

    Mapping the spatial variation of soil moisture content is vital for precision agriculture techniques. The aim of this study was to examine the correlation of soil moisture and electrical conductivity (EC) data obtained through scanning techniques with field telemetry data, and to spatially separate the field into discrete irrigation management zones. Using the Veris MSP3 system, geo-referenced electrical conductivity and organic matter data were collected and preliminary maps were produced for a pilot kiwifruit field in Chrysoupoli, Kavala. Data from 15 stratified sampling points were used in order to produce the corresponding soil maps. Fusion of the Veris-produced maps (OM, pH, ECa) resulted in the delineation of the field into three zones of specific management interest. An appropriate pedotransfer function was used in order to estimate a capacity soil indicator, the saturated volumetric water content (θs), for each zone, and the relationship between ECs and ECa was established for each zone. The uniformity of the three management zones was validated by measuring specific electrical conductivity (ECs) along a transect in each zone and computing the corresponding semivariograms for ECs within each zone. Near real-time data produced by a telemetric network of soil moisture and electrical conductivity sensors were used in order to integrate the temporal component of the specific management zones, enabling the calculation of time-specific volumetric water contents at 10-minute intervals, an intensity soil indicator needed to spatially differentiate the irrigation strategies for each zone. This study emphasizes the benefits yielded by fusing near real-time telemetric data with soil scanning data and spatial interpolation techniques, enhancing the precision and validity of the desired results. Furthermore, the use of telemetric data in combination with modern database management and geospatial software leads to timely produced operational results.
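
    The abstract does not state which pedotransfer function was used; as an illustration only, the saturated volumetric water content is often approximated from total porosity, with texture- and organic-matter-based corrections supplied by the chosen pedotransfer function.

    # Illustrative approximation only; the study's actual pedotransfer function is not given.
    def theta_s_estimate(bulk_density_g_cm3, particle_density_g_cm3=2.65):
        """Approximate saturated volumetric water content (theta_s) as total porosity."""
        return 1.0 - bulk_density_g_cm3 / particle_density_g_cm3

    # e.g. theta_s_estimate(1.35) -> ~0.49 cm3/cm3 for a typical loamy topsoil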

  15. The precision of spatial selection into the focus of attention in working memory.

    Souza, Alessandra S; Thalmann, Mirko; Oberauer, Klaus

    2018-04-23

    Attention helps manage the information held in visual working memory (vWM). Perceptual attention selects the stimuli to be represented in vWM, whereas internal attention prioritizes information already in vWM. In the present study we assessed the spatial precision of perceptual and internal attention in vWM. Participants encoded eight colored dots for a local-recognition test. To manipulate attention, a cue indicated the item most likely to be tested (~65% validity). The cue appeared either before the onset of the memory array (precue) or during the retention interval (retrocue). The precue guides perceptual attention to gate encoding into vWM, whereas the retrocue guides internal attention to prioritize the cued item within vWM. If attentional selection is spatially imprecise, attention should be preferentially allocated to the cued location, with a gradual drop-off of attention over space to nearby uncued locations. In this case, memory for uncued locations should vary as a function of their distance from the cued location. As compared to a no-cue condition, memory was better for validly cued items but worse for uncued items. The spatial distance between the uncued and cued locations modulated the cuing costs: Items close in space to the cued location were insulated from cuing costs. The extension of this spatial proximity effect was larger for precues than for retrocues, mostly because the benefits of attention were larger for precues. These results point to similar selection principles between perceptual and internal attention and to a critical role of spatial distance in the selection of visual representations.

  16. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of an important geotechnical parameter, the compression modulus Es, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on differential settlement of large continuous structure foundations. These analyses should be derived using an assimilation approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian-maximum entropy method. This method integrates rigorously and efficiently multi-precision information from different geotechnical investigations and sources of uncertainty. Single CPT soundings were modeled as rational probability density curves by maximum entropy theory. A spatial prior multivariate probability density function (PDF) and a likelihood PDF were built from the borehole experiments, the CPT positions, and the potential value at the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. Differences between normally distributed single CPT samplings and the simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, whereas more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum, and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision
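
    In outline, and in generic form rather than the paper's exact notation: the maximum-entropy step fits each CPT sounding with the least-informative density consistent with a set of moment constraints, and the prediction step is a Bayesian update of the prior spatial PDF by the data likelihood.

    f_{\mathrm{ME}}(x) = \frac{1}{Z}\,\exp\!\Big(\sum_k \lambda_k\, g_k(x)\Big), \quad \text{with } \lambda_k \text{ chosen so that } \int g_k(x)\, f_{\mathrm{ME}}(x)\, dx = \mu_k

    f_{\mathrm{post}}(x_0 \mid \text{data}) \propto f_{\mathrm{prior}}(x_0)\; L(\text{data} \mid x_0)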

  17. Spatially explicit genetic structure in the freshwater sponge Ephydatia fluviatilis (Linnaeus, 1759) within the framework of the monopolisation hypothesis

    Livia Lucentini

    2013-02-01

    Full Text Available An apparent paradox is known for crustaceans, rotifers and bryozoans living in inland small water bodies: a potential for wide distribution due to the presence of resting stages is coupled with marked genetic differences between nearby water bodies, with enclave distributions masking clear phylogeographic patterns. According to the monopolisation hypothesis, this is due to the accumulation of resting stages, monopolising each water body. Freshwater sponges could represent a useful system to assess the generality of the monopolisation hypothesis: these organisms (i) live in the same habitats as crustaceans, rotifers and bryozoans, (ii) produce resting stages that can accumulate, and (iii) have indeed a wide distribution. Currently, no studies on spatially explicit genetic differentiation in freshwater sponges are available. The aim of the present study is to provide additional empirical evidence in support of the generality of the scenario for small aquatic animals with resting stages by analysing genetic diversity at different spatial scales for an additional model system, the freshwater sponge Ephydatia fluviatilis (Linnaeus, 1759). We expected that genetic variability in this system would follow enclave distributions, no clear phylogeographical patterns would be present, and nearby unconnected water bodies would show markedly different populations for this new model too. We analysed the ribosomal internal transcribed spacer regions 5.8S-ITS2-28S, the D3 domain of the 28S subunit, the mitochondrial cytochrome c oxidase I (COI) and ten specific microsatellite markers in nine Italian and one Hungarian populations. Mitochondrial and nuclear sequences showed no or very low genetic polymorphism, whereas high levels of differentiation among populations and a significant polymorphism were observed using microsatellites. Microsatellite loci also showed a high proportion of private alleles for each population and an overall correlation between geographic and genetic

  18. Correlated cryo-fluorescence and cryo-electron microscopy with high spatial precision and improved sensitivity

    Schorb, Martin; Briggs, John A.G.

    2014-01-01

    Performing fluorescence microscopy and electron microscopy on the same sample allows fluorescent signals to be used to identify and locate features of interest for subsequent imaging by electron microscopy. To carry out such correlative microscopy on vitrified samples appropriate for structural cryo-electron microscopy it is necessary to perform fluorescence microscopy at liquid-nitrogen temperatures. Here we describe an adaptation of a cryo-light microscopy stage to permit use of high-numerical aperture objectives. This allows high-sensitivity and high-resolution fluorescence microscopy of vitrified samples. We describe and apply a correlative cryo-fluorescence and cryo-electron microscopy workflow together with a fiducial bead-based image correlation procedure. This procedure allows us to locate fluorescent bacteriophages in cryo-electron microscopy images with an accuracy on the order of 50 nm, based on their fluorescent signal. It will allow the user to precisely and unambiguously identify and locate objects and events for subsequent high-resolution structural study, based on fluorescent signals. - Highlights: • Workflow for correlated cryo-fluorescence and cryo-electron microscopy. • Cryo-fluorescence microscopy setup incorporating a high numerical aperture objective. • Fluorescent signals located in cryo-electron micrographs with 50 nm spatial precision
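
    A minimal sketch of the kind of fiducial-bead-based image correlation described above: a least-squares 2-D affine transform mapping fluorescence-image bead coordinates onto electron-microscopy coordinates. The variable names and the choice of an affine model are illustrative assumptions, not the authors' exact procedure.

    import numpy as np

    def fit_affine(fm_xy, em_xy):
        """Least-squares 2-D affine transform from matched bead positions.
        fm_xy, em_xy: (N, 2) arrays of corresponding fluorescence / EM coordinates."""
        A = np.hstack([fm_xy, np.ones((len(fm_xy), 1))])     # homogeneous coordinates (N, 3)
        M, *_ = np.linalg.lstsq(A, em_xy, rcond=None)        # solve A @ M = em_xy, M is (3, 2)
        return M.T                                           # (2, 3) affine matrix

    def apply_affine(M, xy):
        return np.hstack([xy, np.ones((len(xy), 1))]) @ M.T  # map points into EM coordinates

    # Residual bead error, a rough analogue of the ~50 nm registration accuracy quoted above:
    # np.linalg.norm(apply_affine(M, fm_xy) - em_xy, axis=1).mean()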

  19. Correlated cryo-fluorescence and cryo-electron microscopy with high spatial precision and improved sensitivity

    Schorb, Martin [Structural and Computational Biology Unit, European Molecular Biology Laboratory, D-69117 Heidelberg (Germany); Briggs, John A.G., E-mail: john.briggs@embl.de [Structural and Computational Biology Unit, European Molecular Biology Laboratory, D-69117 Heidelberg (Germany); Cell Biology and Biophysics Unit, European Molecular Biology Laboratory, D-69117 Heidelberg (Germany)

    2014-08-01

    Performing fluorescence microscopy and electron microscopy on the same sample allows fluorescent signals to be used to identify and locate features of interest for subsequent imaging by electron microscopy. To carry out such correlative microscopy on vitrified samples appropriate for structural cryo-electron microscopy it is necessary to perform fluorescence microscopy at liquid-nitrogen temperatures. Here we describe an adaptation of a cryo-light microscopy stage to permit use of high-numerical aperture objectives. This allows high-sensitivity and high-resolution fluorescence microscopy of vitrified samples. We describe and apply a correlative cryo-fluorescence and cryo-electron microscopy workflow together with a fiducial bead-based image correlation procedure. This procedure allows us to locate fluorescent bacteriophages in cryo-electron microscopy images with an accuracy on the order of 50 nm, based on their fluorescent signal. It will allow the user to precisely and unambiguously identify and locate objects and events for subsequent high-resolution structural study, based on fluorescent signals. - Highlights: • Workflow for correlated cryo-fluorescence and cryo-electron microscopy. • Cryo-fluorescence microscopy setup incorporating a high numerical aperture objective. • Fluorescent signals located in cryo-electron micrographs with 50 nm spatial precision.

  20. Gelatin-based laser direct-write technique for the precise spatial patterning of cells.

    Schiele, Nathan R; Chrisey, Douglas B; Corr, David T

    2011-03-01

    Laser direct-writing provides a method to pattern living cells in vitro, to study various cell-cell interactions, and to build cellular constructs. However, the materials typically used may limit its long-term application. By utilizing gelatin coatings on the print ribbon and growth surface, we developed a new approach for laser cell printing that overcomes the limitations of Matrigel™. Gelatin is free of growth factors and extraneous matrix components that may interfere with cellular processes under investigation. Gelatin-based laser direct-write was able to successfully pattern human dermal fibroblasts with high post-transfer viability (91% ± 3%) and no observed double-strand DNA damage. As seen with atomic force microscopy, gelatin offers a unique benefit in that it is present temporarily to allow cell transfer, but melts and is removed with incubation to reveal the desired application-specific growth surface. This provides unobstructed cellular growth after printing. Monitoring cell location after transfer, we show that melting and removal of gelatin does not affect cellular placement; cells maintained registry within 5.6 ± 2.5 μm to the initial pattern. This study demonstrates the effectiveness of gelatin in laser direct-writing to create spatially precise cell patterns with the potential for applications in tissue engineering, stem cell, and cancer research.

  1. Strong spatial genetic structure in five tropical Piper species: should the Baker–Fedorov hypothesis be revived for tropical shrubs?

    Lasso, E; Dalling, J W; Bermingham, E

    2011-01-01

    Fifty years ago, Baker and Fedorov proposed that the high species diversity of tropical forests could arise from the combined effects of inbreeding and genetic drift leading to population differentiation and eventually to sympatric speciation. Decades of research, however, have failed to support the Baker–Fedorov hypothesis (BFH), and it has now been discarded in favor of a paradigm where most trees are self-incompatible or strongly outcrossing, and where long-distance pollen dispersal prevents population drift. Here, we propose that several hyper-diverse genera of tropical herbs and shrubs, including Piper (>1,000 species), may provide an exception. Species in this genus often have aggregated, high-density populations with self-compatible breeding systems; characteristics which the BFH would predict lead to high local genetic differentiation. We test this prediction for five Piper species on Barro Colorado Island, Panama, using Amplified Fragment Length Polymorphism (AFLP) markers. All species showed strong genetic structure at both fine and large spatial scales. Over short distances (200–750 m) populations showed significant genetic differentiation (Fst 0.11–0.46, P < 0.05), with values of spatial genetic structure that exceed those reported for other tropical tree species (Sp = 0.03–0.136). This genetic structure probably results from the combined effects of limited seed and pollen dispersal, clonal spread, and selfing. These processes are likely to have facilitated the diversification of populations in response to local natural selection or genetic drift and may explain the remarkable diversity of this rich genus. PMID:22393518

  2. Spatial Distribution and Sampling Plans With Fixed Level of Precision for Citrus Aphids (Hom., Aphididae) on Two Orange Species.

    Kafeshani, Farzaneh Alizadeh; Rajabpour, Ali; Aghajanzadeh, Sirous; Gholamian, Esmaeil; Farkhari, Mohammad

    2018-04-02

    Aphis spiraecola Patch, Aphis gossypii Glover, and Toxoptera aurantii Boyer de Fonscolombe are three important aphid pests of citrus orchards. In this study, the spatial distributions of the aphids on two orange species, Satsuma mandarin and Thomson navel, were evaluated using Taylor's power law and Iwao's patchiness regression. In addition, a fixed-precision sequential sampling plan was developed for each species on each host plant using Green's model at precision levels of 0.25 and 0.1. The results revealed that the spatial distribution parameters, and therefore the sampling plan, differed significantly with aphid and host plant species. Taylor's power law provided a better fit to the data than Iwao's patchiness regression. Except for T. aurantii on Thomson navel orange, the spatial distribution patterns of the aphids were aggregated on both citrus species; T. aurantii had a regular dispersion pattern on Thomson navel orange. The optimum sample size varied from 30 to 2,061 shoots on Satsuma mandarin and from 1 to 1,622 shoots on Thomson navel orange, depending on aphid species and the desired precision level. The calculated stop lines of the aphid species on Satsuma mandarin and Thomson navel orange ranged from 0.48 to 19 and from 0.19 to 80.4 aphids per 24 shoots, according to aphid species and desired precision level. The performance of the sampling plan was validated by resampling analysis using the Resampling for Validation of Sampling Plans (RVSP) software. This sampling program is useful for IPM programs targeting these aphids in citrus orchards.
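
    For reference, the models named above in their usual forms: Taylor's power law relates the sample variance to the mean, Iwao's patchiness regression relates mean crowding to the mean, and Green's fixed-precision stop line follows from setting the relative standard error of the mean equal to the desired precision D.

    s^2 = a\,\bar{m}^{\,b} \qquad\qquad m^{*} = \alpha + \beta\,\bar{m} \qquad\qquad T_n = \left(\frac{D^2}{a}\right)^{1/(b-2)} n^{(b-1)/(b-2)}

    Here T_n is the cumulative count after n sample units (shoots); sampling stops once the running total crosses the stop line for the chosen precision level (0.25 or 0.1 above).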

  3. Transitions between central and peripheral vision create spatial/temporal distortions: a hypothesis concerning the perceived break of the curveball.

    Arthur Shapiro

    2010-10-01

    Full Text Available The human visual system does not treat all parts of an image equally: the central segments of an image, which fall on the fovea, are processed with a higher resolution than the segments that fall in the visual periphery. Even though the differences between foveal and peripheral resolution are large, these differences do not usually disrupt our perception of seamless visual space. Here we examine a motion stimulus in which the shift from foveal to peripheral viewing creates a dramatic spatial/temporal discontinuity. The stimulus consists of a descending disk (global motion) with an internal moving grating (local motion). When observers view the disk centrally, they perceive both global and local motion (i.e., observers see the disk's vertical descent and the internal spinning). When observers view the disk peripherally, the internal portion appears stationary, and the disk appears to descend at an angle. The angle of perceived descent increases as the observer views the stimulus from further in the periphery. We examine the first- and second-order information content in the display with the use of a three-dimensional Fourier analysis and show how our results can be used to describe perceived spatial/temporal discontinuities in real-world situations. The perceived shift of the disk's direction in the periphery is consistent with a model in which foveal processing separates first- and second-order motion information while peripheral processing integrates first- and second-order motion information. We argue that the perceived distortion may influence real-world visual observations. To this end, we present a hypothesis and analysis of the perception of the curveball and rising fastball in the sport of baseball. The curveball is a physically measurable phenomenon: the imbalance of forces created by the ball's spin causes the ball to deviate from a straight line and to follow a smooth parabolic path. However, the curveball is also a perceptual puzzle

  4. Transitions between central and peripheral vision create spatial/temporal distortions: a hypothesis concerning the perceived break of the curveball.

    Shapiro, Arthur; Lu, Zhong-Lin; Huang, Chang-Bing; Knight, Emily; Ennis, Robert

    2010-10-13

    The human visual system does not treat all parts of an image equally: the central segments of an image, which fall on the fovea, are processed with a higher resolution than the segments that fall in the visual periphery. Even though the differences between foveal and peripheral resolution are large, these differences do not usually disrupt our perception of seamless visual space. Here we examine a motion stimulus in which the shift from foveal to peripheral viewing creates a dramatic spatial/temporal discontinuity. The stimulus consists of a descending disk (global motion) with an internal moving grating (local motion). When observers view the disk centrally, they perceive both global and local motion (i.e., observers see the disk's vertical descent and the internal spinning). When observers view the disk peripherally, the internal portion appears stationary, and the disk appears to descend at an angle. The angle of perceived descent increases as the observer views the stimulus from further in the periphery. We examine the first- and second-order information content in the display with the use of a three-dimensional Fourier analysis and show how our results can be used to describe perceived spatial/temporal discontinuities in real-world situations. The perceived shift of the disk's direction in the periphery is consistent with a model in which foveal processing separates first- and second-order motion information while peripheral processing integrates first- and second-order motion information. We argue that the perceived distortion may influence real-world visual observations. To this end, we present a hypothesis and analysis of the perception of the curveball and rising fastball in the sport of baseball. The curveball is a physically measurable phenomenon: the imbalance of forces created by the ball's spin causes the ball to deviate from a straight line and to follow a smooth parabolic path. However, the curveball is also a perceptual puzzle because batters often

  5. Generating a taxonomy of spatially cued attention for visual discrimination: Effects of judgment precision and set size on attention

    Hetley, Richard; Dosher, Barbara Anne; Lu, Zhong-Lin

    2014-01-01

    Attention precues improve the performance of perceptual tasks in many but not all circumstances. These spatial attention effects may depend upon display set size or workload, and have been variously attributed to external noise filtering, stimulus enhancement, contrast gain, or response gain, or to uncertainty or other decision effects. In this study, we document systematically different effects of spatial attention in low- and high-precision judgments, with and without external noise, and in different set sizes in order to contribute to the development of a taxonomy of spatial attention. An elaborated perceptual template model (ePTM) provides an integrated account of a complex set of effects of spatial attention with just two attention factors: a set-size dependent exclusion or filtering of external noise and a narrowing of the perceptual template to focus on the signal stimulus. These results are related to the previous literature by classifying the judgment precision and presence of external noise masks in those experiments, suggesting a taxonomy of spatially cued attention in discrimination accuracy. PMID:24939234
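
    For context, the basic perceptual template model underlying the elaborated version (ePTM) expresses discriminability as signal-to-noise after template filtering and nonlinear transduction; the set-size-dependent attention factors added in the ePTM (external-noise exclusion and template retuning) are not written out here, and the symbols follow the common formulation rather than the paper's exact notation:

    d' = \frac{(\beta c)^{\gamma}}{\sqrt{N_{\mathrm{ext}}^{2\gamma} + N_{\mathrm{mul}}^{2}(\beta c)^{2\gamma} + N_{\mathrm{add}}^{2}}}

    where c is the signal contrast, \beta the template gain to the signal, \gamma the transducer nonlinearity, N_{\mathrm{ext}} the external noise contrast, and N_{\mathrm{mul}}, N_{\mathrm{add}} the multiplicative and additive internal noise; external-noise filtering by attention acts by shrinking the effective N_{\mathrm{ext}}.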

  6. Generating a taxonomy of spatially cued attention for visual discrimination: effects of judgment precision and set size on attention.

    Hetley, Richard; Dosher, Barbara Anne; Lu, Zhong-Lin

    2014-11-01

    Attention precues improve the performance of perceptual tasks in many but not all circumstances. These spatial attention effects may depend upon display set size or workload, and have been variously attributed to external noise filtering, stimulus enhancement, contrast gain, or response gain, or to uncertainty or other decision effects. In this study, we document systematically different effects of spatial attention in low- and high-precision judgments, with and without external noise, and in different set sizes in order to contribute to the development of a taxonomy of spatial attention. An elaborated perceptual template model (ePTM) provides an integrated account of a complex set of effects of spatial attention with just two attention factors: a set-size dependent exclusion or filtering of external noise and a narrowing of the perceptual template to focus on the signal stimulus. These results are related to the previous literature by classifying the judgment precision and presence of external noise masks in those experiments, suggesting a taxonomy of spatially cued attention in discrimination accuracy.

  7. The Role of Spatial Memory and Frames of Reference in the Precision of Angular Path Integration

    Arthur, Joeanna C.; Philbeck, John W.; Kleene, Nicholas J.; Chichka, David

    2012-01-01

    Angular path integration refers to the ability to maintain an estimate of self-location after a rotational displacement by integrating internally-generated (idiothetic) self-motion signals over time. Previous work has found that non-sensory inputs, namely spatial memory, can play a powerful role in angular path integration (Arthur et al., 2007, 2009). Here we investigated the conditions under which spatial memory facilitates angular path integration. We hypothesized that the benefit of spatia...

  8. Precision Viticulture : is it relevant to manage the vineyard according to the within field spatial variability of the environment ?

    Tisseyre, Bruno

    2015-04-01

    For more than 15 years, research projects have been conducted in the area of precision viticulture (PV) around the world. These projects have provided new insights into within-field variability in viticulture. Indeed, access to high spatial resolution data (remote sensing, embedded sensors, etc.) changes the knowledge we have of vineyard fields. In particular, the field, which was until now considered a homogeneous management unit, actually presents high spatial variability in terms of yield, vigour and quality. This knowledge will lead (and is already leading) to changes in how the vineyard and the quality of the harvest are managed at the within-field scale. Drawing on experimental results obtained in various countries of the world, the goal of the presentation is to provide figures on: - the spatial variability of the main parameters (yield, vigour, quality), and how this variability is organized spatially, - the temporal stability of the observed spatial variability and the potential link with environmental parameters like soil, topography, soil water availability, etc., - the information sources available at high spatial resolution conventionally used in precision agriculture that are likely to highlight this spatial variability (multi-spectral images, soil electrical conductivity, etc.) and the limitations that these information sources are likely to present in viticulture. Several strategies are currently being developed to take into account within-field variability in viticulture. They are based on the development of specific equipment, sensors, actuators and site-specific strategies with the aim of adapting vineyard operations at the within-field level. These strategies will be presented briefly in two ways: - site-specific operations (fertilization, pruning, thinning, irrigation, etc.) in order to counteract the effects of the environment and to obtain a final product with a controlled and consistent wine quality, - differential harvesting with the

  9. The Focus of Spatial Attention Determines the Number and Precision of Face Representations in Working Memory.

    Towler, John; Kelly, Maria; Eimer, Martin

    2016-06-01

    The capacity of visual working memory for faces is extremely limited, but the reasons for these limitations remain unknown. We employed event-related brain potential measures to demonstrate that individual faces have to be focally attended in order to be maintained in working memory, and that attention is allocated to only a single face at a time. When 2 faces have to be memorized simultaneously in a face identity-matching task, the focus of spatial attention during encoding predicts which of these faces can be successfully maintained in working memory and matched to a subsequent test face. We also show that memory representations of attended faces are maintained in a position-dependent fashion. These findings demonstrate that the limited capacity of face memory is directly linked to capacity limits of spatial attention during the encoding and maintenance of individual face representations. We suggest that the capacity and distribution of selective spatial attention is a dynamic resource that constrains the capacity and fidelity of working memory for faces. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Spatial Precision in Magnetic Resonance Imaging–Guided Radiation Therapy: The Role of Geometric Distortion

    Weygand, Joseph, E-mail: jw2899@columbia.edu [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Fuller, Clifton David [The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Ibbott, Geoffrey S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Mohamed, Abdallah S.R. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Department of Clinical Oncology and Nuclear Medicine, Alexandria University, Alexandria (Egypt); Ding, Yao [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Yang, Jinzhong [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Hwang, Ken-Pin [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Wang, Jihong [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States)

    2016-07-15

    Because magnetic resonance imaging–guided radiation therapy (MRIgRT) offers exquisite soft tissue contrast and the ability to image tissues in arbitrary planes, the interest in this technology has increased dramatically in recent years. However, intrinsic geometric distortion stemming from both the system hardware and the magnetic properties of the patient affects MR images and compromises the spatial integrity of MRI-based radiation treatment planning, given that for real-time MRIgRT, precision within 2 mm is desired. In this article, we discuss the causes of geometric distortion, describe some well-known distortion correction algorithms, and review geometric distortion measurements from 12 studies, while taking into account relevant imaging parameters. Eleven of the studies reported phantom measurements quantifying system-dependent geometric distortion, while 2 studies reported simulation data quantifying magnetic susceptibility–induced geometric distortion. Of the 11 studies investigating system-dependent geometric distortion, 5 reported maximum measurements less than 2 mm. The simulation studies demonstrated that magnetic susceptibility–induced distortion is typically smaller than system-dependent distortion but still nonnegligible, with maximum distortion ranging from 2.1 to 2.6 mm at a field strength of 1.5 T. As expected, anatomic landmarks containing interfaces between air and soft tissue had the largest distortions. The evidence indicates that geometric distortion reduces the spatial integrity of MRI-based radiation treatment planning and likely diminishes the efficacy of MRIgRT. Better phantom measurement techniques and more effective distortion correction algorithms are needed to achieve the desired spatial precision.

  11. Spatial Precision in Magnetic Resonance Imaging–Guided Radiation Therapy: The Role of Geometric Distortion

    Weygand, Joseph; Fuller, Clifton David; Ibbott, Geoffrey S.; Mohamed, Abdallah S.R.; Ding, Yao; Yang, Jinzhong; Hwang, Ken-Pin; Wang, Jihong

    2016-01-01

    Because magnetic resonance imaging–guided radiation therapy (MRIgRT) offers exquisite soft tissue contrast and the ability to image tissues in arbitrary planes, the interest in this technology has increased dramatically in recent years. However, intrinsic geometric distortion stemming from both the system hardware and the magnetic properties of the patient affects MR images and compromises the spatial integrity of MRI-based radiation treatment planning, given that for real-time MRIgRT, precision within 2 mm is desired. In this article, we discuss the causes of geometric distortion, describe some well-known distortion correction algorithms, and review geometric distortion measurements from 12 studies, while taking into account relevant imaging parameters. Eleven of the studies reported phantom measurements quantifying system-dependent geometric distortion, while 2 studies reported simulation data quantifying magnetic susceptibility–induced geometric distortion. Of the 11 studies investigating system-dependent geometric distortion, 5 reported maximum measurements less than 2 mm. The simulation studies demonstrated that magnetic susceptibility–induced distortion is typically smaller than system-dependent distortion but still nonnegligible, with maximum distortion ranging from 2.1 to 2.6 mm at a field strength of 1.5 T. As expected, anatomic landmarks containing interfaces between air and soft tissue had the largest distortions. The evidence indicates that geometric distortion reduces the spatial integrity of MRI-based radiation treatment planning and likely diminishes the efficacy of MRIgRT. Better phantom measurement techniques and more effective distortion correction algorithms are needed to achieve the desired spatial precision.

  12. High-precision spatial localization of mouse vocalizations during social interaction.

    Heckman, Jesse J; Proville, Rémi; Heckman, Gert J; Azarfar, Alireza; Celikel, Tansu; Englitz, Bernhard

    2017-06-07

    Mice display a wide repertoire of vocalizations that varies with age, sex, and context. Especially during courtship, mice emit ultrasonic vocalizations (USVs) of high complexity, whose detailed structure is poorly understood. As animals of both sexes vocalize, the study of social vocalizations requires attributing single USVs to individuals. The state of the art in sound localization for USVs allows spatial localization at centimeter resolution; however, animals interact at closer ranges, involving tactile, snout-snout exploration. Hence, improved algorithms are required to reliably assign USVs. We develop multiple solutions to USV localization, and derive an analytical solution for arbitrary vertical microphone positions. The algorithms are compared on wideband acoustic noise and single-mouse vocalizations, and applied to social interactions with optically tracked mouse positions. A novel, (frequency-)envelope-weighted generalised cross-correlation outperforms classical cross-correlation techniques. It achieves a median error of ~1.4 mm for noise and ~4-8.5 mm for vocalizations. Using these algorithms in combination with a level criterion, we can improve the assignment of USVs to interacting mice. We report significant differences in mean USV properties between CBA mice of different sexes during social interaction. Hence, the improved USV attribution to individuals lays the basis for a deeper understanding of social vocalizations, in particular sequences of USVs.
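
    As background for the family of methods used above, the simplest cross-correlation estimate of the time difference of arrival (TDOA) between two microphone channels is sketched below; this is plain cross-correlation, not the envelope-weighted generalised cross-correlation the study proposes, and all names are illustrative.

    import numpy as np

    def tdoa(sig_a, sig_b, fs):
        """Delay of sig_b relative to sig_a, in seconds (positive if sig_b arrives later)."""
        cc = np.correlate(sig_b, sig_a, mode="full")          # plain cross-correlation
        lag = int(np.argmax(cc)) - (len(sig_a) - 1)           # zero lag sits at index len(sig_a)-1
        return lag / fs

    # Multiplying each pairwise TDOA by the speed of sound and intersecting the resulting
    # hyperbolae over several microphone pairs yields the source position estimate.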

  13. Hippocampal structure and human cognition: key role of spatial processing and evidence supporting the efficiency hypothesis in females

    Colom, Roberto; Stein, Jason L.; Rajagopalan, Priya; Martínez, Kenia; Hermel, David; Wang, Yalin; Álvarez-Linera, Juan; Burgaleta, Miguel; Quiroga, MªÁngeles; Shih, Pei Chun; Thompson, Paul M.

    2014-01-01

    Here we apply a method for automated segmentation of the hippocampus in 3D high-resolution structural brain MRI scans. One hundred and four healthy young adults completed twenty one tasks measuring abstract, verbal, and spatial intelligence, along with working memory, executive control, attention, and processing speed. After permutation tests corrected for multiple comparisons across vertices (p related to hippocampal structural differences. PMID:25632167

  14. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g., fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provide a priori unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, their application to non-Gaussian data introduces difficulties into the analysis, together with reduced robustness or unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates, and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.
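
    The first step of the intrinsic-geostatistics workflow described above is an empirical (omnidirectional) semivariogram of the acoustic sample values; a minimal sketch follows, with illustrative names (the study's binning, transformations and model fitting are more involved).

    import numpy as np

    def empirical_variogram(xy, values, bin_edges):
        """xy: (N, 2) sample positions; values: (N,) e.g. log-transformed densities."""
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)   # pairwise distances
        g = 0.5 * (values[:, None] - values[None, :]) ** 2             # pairwise semivariances
        iu = np.triu_indices(len(values), k=1)                         # keep each pair once
        d, g = d[iu], g[iu]
        gamma = []
        for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
            mask = (d >= lo) & (d < hi)
            gamma.append(g[mask].mean() if mask.any() else np.nan)
        return np.array(gamma)   # fit a spherical/exponential model to this curve next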

  15. Predicting recovery from acid rain using the micro-spatial heterogeneity of soil columns downhill the infiltration zone of beech stemflow: introduction of a hypothesis.

    Berger, Torsten W; Muras, Alexander

    Release of stored sulfur may delay the recovery of soil pH from acid rain. It is hypothesized that analyzing the micro-spatial heterogeneity of soil columns downhill of a beech stem enables predictions of soil recovery as a function of historic acid loads and time. We demonstrated, in a very simplified approach, how these two factors may be untangled from each other using synthetic data. Thereafter, we evaluated the stated hypothesis based upon chemical soil data with increasing distance from the stems of beech trees. It is predicted that the top soil will recover from acid deposition, as already recorded in the infiltration zone of stemflow near the base of the stem. However, in the areas between trees, and especially in deeper soil horizons, recovery may be greatly delayed.

  16. Close but no cigar: Spatial precision deficits following medial temporal lobe lesions provide novel insight into theoretical models of navigation and memory.

    Kolarik, Branden S; Baer, Trevor; Shahlaie, Kiarash; Yonelinas, Andrew P; Ekstrom, Arne D

    2018-01-01

    Increasing evidence suggests that the human hippocampus contributes to a range of different behaviors, including episodic memory, language, short-term memory, and navigation. A novel theoretical framework, the Precision and Binding Model, accounts for these phenomena by describing a role for the hippocampus in high-resolution, complex binding. Other theories like Cognitive Map Theory, in contrast, predict a specific role for the hippocampus in allocentric navigation, while Declarative Memory Theory predicts a specific role in delay-dependent conscious memory. Navigation provides a unique venue for testing these predictions, with past results from research with humans providing inconsistent findings regarding the role of the human hippocampus in spatial navigation. Here, we tested five patients with lesions primarily restricted to the hippocampus and those extending out into the surrounding medial temporal lobe cortex on a virtual water maze task. Consistent with the Precision and Binding Model, we found partially intact allocentric memory in all patients, with impairments in the spatial precision of their searches for a hidden target. We found similar impairments at both immediate and delayed testing. Our findings are consistent with the Precision and Binding Model of hippocampal function, arguing for its role across domains in high-resolution, complex binding. Remembering goal locations in one's environment is a critical skill for survival. How this information is represented in the brain is still not fully understood, but is believed to rely in some capacity on structures in the medial temporal lobe. Contradictory findings from studies of both humans and animals have been difficult to reconcile with regard to the role of the MTL, specifically the hippocampus. By assessing impairments observed during navigation to a goal in patients with medial temporal lobe damage, we can better understand the role these structures play in such behavior. Utilizing virtual reality

  17. Using experimental design and spatial analyses to improve the precision of NDVI estimates in upland cotton field trials

    Controlling for spatial variability is important in high-throughput phenotyping studies that enable large numbers of genotypes to be evaluated across time and space. In the current study, we compared the efficacy of different experimental designs and spatial models in the analysis of canopy spectral...

  18. Regionalisation of a distributed method for flood quantiles estimation: Revaluation of local calibration hypothesis to enhance the spatial structure of the optimised parameter

    Odry, Jean; Arnaud, Patrick

    2016-04-01

    The SHYREG method (Aubert et al., 2014) associates a stochastic rainfall generator and a rainfall-runoff model to produce rainfall and flood quantiles on a 1 km2 mesh covering the whole French territory. The rainfall generator is based on the description of rainy events by descriptive variables following probability distributions and is characterised by high stability. This stochastic generator is fully regionalised, and the rainfall-runoff transformation is calibrated with a single parameter. Thanks to the stability of the approach, calibration can be performed against only the flood quantiles associated with observed frequencies, which can be extracted from relatively short time series. The aggregation of SHYREG flood quantiles to the catchment scale is performed using an areal reduction factor technique that is unique over the whole territory. Past studies demonstrated the accuracy of SHYREG flood quantile estimation for catchments where flow data are available (Arnaud et al., 2015). Nevertheless, the parameter of the rainfall-runoff model is independently calibrated for each target catchment. As a consequence, this parameter plays a corrective role and compensates for approximations and modelling errors, which makes it difficult to identify its proper spatial pattern. It is an inherent objective of the SHYREG approach to be completely regionalised in order to provide a complete and accurate flood quantile database throughout France. Consequently, it appears necessary to identify the model configuration in which the calibrated parameter can be regionalised with acceptable performance. The re-evaluation of some of the method's hypotheses is a necessary step before regionalisation. In particular, the inclusion or modification of the spatial variability of imposed parameters (such as production and transfer reservoir sizes, base-flow addition and the quantile aggregation function) should lead to more realistic values of the only calibrated parameter. The objective of the work presented

  19. Progressive impairment of directional and spatially precise trajectories by TgF344-AD Rats in the Morris Water Task

    Thompson, Shannon; Harvey, Ryan; Clark, Benjamin; Drake, Emma; Berkowitz, Laura

    2018-01-01

    Spatial navigation is impaired in early stages of Alzheimer's disease (AD), and may be a defining behavioral marker of preclinical AD. Nevertheless, limitations of diagnostic criteria for AD and within animal models of AD make characterization of preclinical AD difficult. A new rat model (TgF344-AD) of AD overcomes many of these limitations, though spatial navigation has not been comprehensively assessed. Using the hidden and cued platform variants of the Morris water task, a longitudinal asse...

  20. A Precision-Positioning Method for a High-Acceleration Low-Load Mechanism Based on Optimal Spatial and Temporal Distribution of Inertial Energy

    Xin Chen

    2015-09-01

    Full Text Available High-speed and precision positioning are fundamental requirements for high-acceleration low-load mechanisms in integrated circuit (IC) packaging equipment. In this paper, we derive the transient nonlinear dynamic-response equations of high-acceleration mechanisms, which reveal that stiffness, frequency, damping, and driving frequency are the primary factors. Therefore, we propose a new structural optimization and velocity-planning method for the precision positioning of a high-acceleration mechanism based on optimal spatial and temporal distribution of inertial energy. For structural optimization, we first reviewed the commonly used flexible multibody dynamic optimization based on the equivalent static loads method (ESLM), and then selected a modified ESLM for optimal spatial distribution of inertial energy; hence, not only the stiffness but also the inertia and frequency of the real modal shapes are considered. For velocity planning, we developed a new velocity-planning method based on nonlinear dynamic-response optimization with varying motion conditions. Our method was verified on a high-acceleration die bonder. The amplitude of residual vibration could be decreased by more than 20% via structural optimization, and the positioning time could be reduced by more than 40% via asymmetric variable velocity planning. This method provides effective theoretical support for the precision positioning of high-acceleration low-load mechanisms.
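    The following sketch illustrates one simple form of asymmetric velocity planning, in the spirit of the approach described above but not taken from it: an asymmetric trapezoidal profile in which acceleration and deceleration differ, so that the phase that excites residual vibration can be made gentler. All names and parameters are illustrative.

      import numpy as np

      def asymmetric_trapezoid(distance, v_max, a_up, a_down, dt=1e-4):
          """Return time stamps and a velocity profile covering `distance`."""
          t_up, t_down = v_max / a_up, v_max / a_down
          d_ramps = 0.5 * v_max * (t_up + t_down)
          if d_ramps > distance:                                # no room to cruise: triangular profile
              v_max = np.sqrt(2.0 * distance * a_up * a_down / (a_up + a_down))
              t_up, t_down = v_max / a_up, v_max / a_down
              t_cruise = 0.0
          else:
              t_cruise = (distance - d_ramps) / v_max
          t = np.arange(0.0, t_up + t_cruise + t_down + dt, dt)
          v = np.where(t < t_up, a_up * t,
              np.where(t < t_up + t_cruise, v_max,
                       np.maximum(v_max - a_down * (t - t_up - t_cruise), 0.0)))
          return t, v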

  1. Accounting for the measurement error of spectroscopically inferred soil carbon data for improved precision of spatial predictions.

    Somarathna, P D S N; Minasny, Budiman; Malone, Brendan P; Stockmann, Uta; McBratney, Alex B

    2018-08-01

    Spatial modelling of environmental data commonly considers spatial variability as the only source of uncertainty. In reality, however, measurement errors should also be accounted for. In recent years, infrared spectroscopy has been shown to offer low-cost, yet invaluable information needed for digital soil mapping at meaningful spatial scales for land management. However, spectrally inferred soil carbon data are known to be less accurate than laboratory-analysed measurements. This study establishes a methodology to filter out the measurement-error variability by incorporating the measurement-error variance in the spatial covariance structure of the model. The study was carried out in the Lower Hunter Valley, New South Wales, Australia, where a combination of laboratory-measured and vis-NIR- and MIR-inferred topsoil and subsoil carbon data is available. We investigated the applicability of residual maximum likelihood (REML) and Markov chain Monte Carlo (MCMC) simulation methods to generate parameters of the Matérn covariance function directly from the data in the presence of measurement error. The results revealed that the measurement error can be effectively filtered out through the proposed technique. When the measurement error was filtered from the data, the prediction variance almost halved, which ultimately yielded greater certainty in spatial predictions of soil carbon. Further, the MCMC technique was successfully used to define the posterior distribution of the measurement error. This is an important outcome, as the MCMC technique can be used to estimate the measurement error if it is not explicitly quantified. Although this study dealt with soil carbon data, the method is amenable to filtering the measurement error of any kind of continuous spatial environmental data. Copyright © 2018 Elsevier B.V. All rights reserved.
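    A minimal sketch of the central idea, under assumptions of my own (a Matérn covariance with smoothness 3/2, known parameters, and simple kriging): the per-sample measurement-error variance is added only to the data-data covariance, so it is filtered out of the prediction instead of being treated as spatial variability.

      import numpy as np

      def matern32(h, sill, rng):
          a = np.sqrt(3.0) * h / rng
          return sill * (1.0 + a) * np.exp(-a)

      def krige_with_measurement_error(coords, z, err_var, x0, sill, rng, mean=0.0):
          """z: spectrally inferred soil carbon; err_var: per-sample measurement-error variances."""
          h = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          C = matern32(h, sill, rng) + np.diag(err_var)     # error enters the data covariance only
          c0 = matern32(np.linalg.norm(coords - x0, axis=1), sill, rng)
          w = np.linalg.solve(C, c0)
          pred = mean + w @ (z - mean)
          pred_var = sill - w @ c0                          # error-filtered prediction variance
          return pred, pred_var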

  2. Precise and Arbitrary Deposition of Biomolecules onto Biomimetic Fibrous Matrices for Spatially Controlled Cell Distribution and Functions.

    Jia, Chao; Luo, Bowen; Wang, Haoyu; Bian, Yongqian; Li, Xueyong; Li, Shaohua; Wang, Hongjun

    2017-09-01

    Advances in nano-/microfabrication allow the fabrication of biomimetic substrates for various biomedical applications. In particular, it would be beneficial to control the distribution of cells and relevant biomolecules on an extracellular matrix (ECM)-like substrate with arbitrary micropatterns. In this regard, the possibilities of patterning biomolecules and cells on nanofibrous matrices are explored here by combining inkjet printing and electrospinning. Upon investigation of key parameters for patterning accuracy and reproducibility, three independent studies are performed to demonstrate the potential of this platform for: i) transforming growth factor (TGF)-β1-induced spatial differentiation of fibroblasts, ii) spatiotemporal interactions between breast cancer cells and stromal cells, and iii) cancer-regulated angiogenesis. The results show that TGF-β1 induces local fibroblast-to-myofibroblast differentiation in a dose-dependent fashion, and breast cancer clusters recruit activated stromal cells and guide the sprouting of endothelial cells in a spatially resolved manner. The established platform not only provides strategies to fabricate ECM-like interfaces for medical devices, but also offers the capability of spatially controlling cell organization for fundamental studies, and for high-throughput screening of various biomolecules for stem cell differentiation and cancer therapeutics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Precision Cosmology

    Jones, Bernard J. T.

    2017-04-01

    Preface; Notation and conventions; Part I. 100 Years of Cosmology: 1. Emerging cosmology; 2. The cosmic expansion; 3. The cosmic microwave background; 4. Recent cosmology; Part II. Newtonian Cosmology: 5. Newtonian cosmology; 6. Dark energy cosmological models; 7. The early universe; 8. The inhomogeneous universe; 9. The inflationary universe; Part III. Relativistic Cosmology: 10. Minkowski space; 11. The energy momentum tensor; 12. General relativity; 13. Space-time geometry and calculus; 14. The Einstein field equations; 15. Solutions of the Einstein equations; 16. The Robertson-Walker solution; 17. Congruences, curvature and Raychaudhuri; 18. Observing and measuring the universe; Part IV. The Physics of Matter and Radiation: 19. Physics of the CMB radiation; 20. Recombination of the primeval plasma; 21. CMB polarisation; 22. CMB anisotropy; Part V. Precision Tools for Precision Cosmology: 23. Likelihood; 24. Frequentist hypothesis testing; 25. Statistical inference: Bayesian; 26. CMB data processing; 27. Parametrising the universe; 28. Precision cosmology; 29. Epilogue; Appendix A. SI, CGS and Planck units; Appendix B. Magnitudes and distances; Appendix C. Representing vectors and tensors; Appendix D. The electromagnetic field; Appendix E. Statistical distributions; Appendix F. Functions on a sphere; Appendix G. Acknowledgements; References; Index.

  4. Does the stress-gradient hypothesis hold water? Disentangling spatial and temporal variation in plant effects on soil moisture in dryland systems

    Butterfield, Bradley J.; Bradford, John B.; Armas, Cristina; Prieto, Ivan; Pugnaire, Francisco I.

    2016-01-01

    The nature of the relationship between water limitation and facilitation has been one of the most contentious debates surrounding the stress-gradient hypothesis (SGH), which states that plant-plant interactions shift from competition to facilitation with increasing environmental stress.

  5. Spatial characteristics of sediment trace metals in an eastern boundary upwelling retention area (St. Helena Bay, South Africa): A hydrodynamic-biological pump hypothesis

    Monteiro, PMS

    2005-10-01

    Full Text Available fluxes from bottom sediments defined by a high sedimentation rate of organic matter. It is proposed that trace metals may play an important role in alleviating part of the ecological stress by forming sulfide complexes in such systems. A spatially...

  6. The religiosity as social value hypothesis: A multi-method replication and extension across 65 countries and three levels of spatial aggregation.

    Gebauer, Jochen E; Sedikides, Constantine; Schönbrodt, Felix D; Bleidorn, Wiebke; Rentfrow, Peter J; Potter, Jeff; Gosling, Samuel D

    2017-09-01

    Are religious people psychologically better or worse adjusted than their nonreligious counterparts? Hundreds of studies have reported a positive relation between religiosity and psychological adjustment. Recently, however, a comparatively small number of cross-cultural studies has questioned this staple of religiosity research. The latter studies find that religious adjustment benefits are restricted to religious cultures. Gebauer, Sedikides, and Neberich (2012) suggested the religiosity as social value hypothesis (RASV) as one explanation for those cross-cultural differences. RASV states that, in religious cultures, religiosity possesses much social value, and, as such, religious people will feel particularly good about themselves. In secular cultures, however, religiosity possesses limited social value, and, as such, religious people will feel less good about themselves, if at all. Yet, previous evidence has been inconclusive regarding RASV and regarding cross-cultural differences in religious adjustment benefits more generally. To clarify matters, we conducted 3 replication studies. We examined the relation between religiosity and self-esteem (the most direct and appropriate adjustment indicator, according to RASV) in a self-report study across 65 countries (N = 2,195,301), an informant-report study across 36 countries (N = 560,264), and another self-report study across 1,932 urban areas from 243 federal states in 18 countries (N = 1,188,536). Moreover, we scrutinized our results against 7, previously untested, alternative explanations. Our results fully and firmly replicated and extended prior evidence for cross-cultural differences in religious adjustment benefits. These cross-cultural differences were best explained by RASV. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
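    As a rough illustration of the moderation logic (not the authors' multilevel models), the sketch below estimates the within-country religiosity/self-esteem slope and then relates those slopes to mean country-level religiosity; a positive second-stage slope is the pattern RASV predicts. The data structures and names are assumed.

      import numpy as np

      def ols_slope(x, y):
          x, y = np.asarray(x, float), np.asarray(y, float)
          xc, yc = x - x.mean(), y - y.mean()
          return (xc @ yc) / (xc @ xc)

      def rasv_moderation(religiosity_by_country, self_esteem_by_country):
          """Inputs: lists of per-country arrays of individual scores."""
          slopes = np.array([ols_slope(r, s)
                             for r, s in zip(religiosity_by_country, self_esteem_by_country)])
          country_religiosity = np.array([np.mean(r) for r in religiosity_by_country])
          return ols_slope(country_religiosity, slopes)   # cross-level moderation estimate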

  7. IQ as moderator of terminal decline in perceptual and motor speed, spatial, and verbal ability: Testing the cognitive reserve hypothesis in a population-based sample followed from age 70 until death.

    Thorvaldsson, Valgeir; Skoog, Ingmar; Johansson, Boo

    2017-03-01

    Terminal decline (TD) refers to acceleration in within-person cognitive decline prior to death. The cognitive reserve hypothesis postulates that individuals with higher IQ are able to better tolerate age-related increase in brain pathologies. On average, they will exhibit a later onset of TD, but once they start to decline, their trajectory is steeper relative to those with lower IQ. We tested these predictions using data from initially nondemented individuals (n = 179) in the H70-study repeatedly measured at ages 70, 75, 79, 81, 85, 88, 90, 92, 95, 97, 99, and 100, or until death, on cognitive tests of perceptual-and-motor-speed and spatial and verbal ability. We quantified IQ using the Raven's Coloured Progressive Matrices (RCPM) test administrated at age 70. We fitted random change point TD models to the data, within a Bayesian framework, conditioned on IQ, age of death, education, and sex. In line with predictions, we found that 1 additional standard deviation on the IQ scale was associated with a delay in onset of TD by 1.87 (95% highest density interval [HDI; 0.20, 4.08]) years on speed, 1.96 (95% HDI [0.15, 3.54]) years on verbal ability, but only 0.88 (95% HDI [-0.93, 3.49]) year on spatial ability. Higher IQ was associated with steeper rate of decline within the TD phase on measures of speed and verbal ability, whereas results on spatial ability were nonconclusive. Our findings provide partial support for the cognitive reserve hypothesis and demonstrate that IQ can be a significant moderator of cognitive change trajectories in old age. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
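    The sketch below conveys the shape of a terminal-decline change point model in a deliberately simplified, non-Bayesian form: a single trajectory is fit with one slope before and a steeper slope after an onset point, profiling the onset over a grid. The authors' random change point model with person-level effects and IQ moderation is not reproduced here.

      import numpy as np

      def fit_change_point(age, score, cp_grid):
          """Return (rss, change_point, [level, pre_slope, post_slope]) for the best grid value."""
          age, score = np.asarray(age, float), np.asarray(score, float)
          best = None
          for cp in cp_grid:
              X = np.column_stack([np.ones_like(age),
                                   np.minimum(age - cp, 0.0),   # decline before onset of TD
                                   np.maximum(age - cp, 0.0)])  # steeper decline after onset
              beta, *_ = np.linalg.lstsq(X, score, rcond=None)
              rss = float(np.sum((score - X @ beta) ** 2))
              if best is None or rss < best[0]:
                  best = (rss, cp, beta)
          return best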

  8. What is the role of spatial processing in the decline of episodic memory in Alzheimer's disease? The "mental frame syncing" hypothesis

    Silvia eSerino

    2014-03-01

    Full Text Available The current theories on episodic memory suggest a crucial role of spatial processing for effective retrieval. When prompted by a retrieval cue, the full past scene can be retrieved through the process of pattern completion. Thanks to the retrosplenial cortex, this allocentric representation is translated to an egocentric representation in the medial parietal areas via information updating from other cells: place cells inform the viewpoint location, head-direction cells the viewing direction, and grid cells the self-motion signals. Based on several lines of evidence, we argue that a crucial role in episodic retrieval is played by a "mental frame syncing" between the allocentric view-point dependent representation and the allocentric view-point independent representation. If the mental frame syncing stops, even momentarily, it is difficult to reconstruct a coherent representation for an effective episodic recall. This is what apparently happens in Alzheimer's disease: a break in the mental frame syncing between these two kinds of allocentric representations, underpinned by damage to the hippocampus, may contribute significantly to the early deficit in episodic memory.

  9. Precision manufacturing

    Dornfeld, David

    2008-01-01

    Today there is a high demand for high-precision products. The manufacturing processes are now highly sophisticated and derive from a specialized genre called precision engineering. Precision Manufacturing provides an introduction to precision engineering and manufacturing with an emphasis on the design and performance of precision machines and machine tools, metrology, tooling elements, machine structures, sources of error, precision machining processes and precision process planning. It also discusses the critical role that precision machine design for manufacturing has played in technological developments over the last few hundred years. In addition, the influence of sustainable manufacturing requirements on precision processes is introduced. Drawing upon years of practical experience and using numerous examples and illustrative applications, David Dornfeld and Dae-Eun Lee cover precision manufacturing as it applies to: The importance of measurement and metrology in the context of Precision Manufacturing. Th...

  10. Comparing Spatial Predictions

    Hering, Amanda S.; Genton, Marc G.

    2011-01-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis

  11. Dissecting the Gravitational Lens B1608+656. II. Precision Measurements of the Hubble Constant, Spatial Curvature, and the Dark Energy Equation of State

    Suyu, S.H.; /Argelander Inst. Astron.; Marshall, P.J.; /KIPAC, Menlo Park /UC, Santa Barbara; Auger, M.W.; /UC, Santa Barbara /UC, Davis; Hilbert, S.; /Argelander Inst. Astron. /Garching, Max Planck Inst.; Blandford, R.D.; /KIPAC, Menlo Park; Koopmans, L.V.E.; /Kapteyn Astron. Inst., Groningen; Fassnacht, C.D.; /UC, Davis; Treu, T.; /UC, Santa Barbara

    2009-12-11

    between Ω_m and Ω_Λ at w = -1 and constrains the curvature parameter to be -0.031 < Ω_k < 0.009 (95% CL), a level of precision comparable to that afforded by the current Type Ia SNe sample. Asserting a flat spatial geometry, we find that, in combination with WMAP, H_0 = 69.7 +4.9/-5.0 km s^-1 Mpc^-1 and w = -0.94 +0.17/-0.19 (68% CL), suggesting that the observations of B1608+656 constrain w as tightly as do the current Baryon Acoustic Oscillation data.

  12. THE FRACTAL MARKET HYPOTHESIS

    FELICIA RAMONA BIRAU

    2012-01-01

    In this article, the concept of capital market is analysed using the Fractal Market Hypothesis, which is a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis is in sharp opposition to the Efficient Market Hypothesis and explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions. Thus, it is emphasized that investors do not react immediately to the information they receive and...

  13. Variability: A Pernicious Hypothesis.

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  14. THE FRACTAL MARKET HYPOTHESIS

    FELICIA RAMONA BIRAU

    2012-05-01

    Full Text Available In this article, the concept of capital market is analysed using the Fractal Market Hypothesis, which is a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis is in sharp opposition to the Efficient Market Hypothesis and explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions. Thus, it is emphasized that investors do not react immediately to the information they receive and, of course, the manner in which they interpret that information may differ. Also, the Fractal Market Hypothesis refers to the way that liquidity and investment horizons influence the behaviour of financial investors.

  15. Why precision?

    Bluemlein, Johannes

    2012-05-15

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  16. Why precision?

    Bluemlein, Johannes

    2012-05-01

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  17. Physiopathological Hypothesis of Cellulite

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions is asked concerning this condition, including questions about its name, the consensus about the histopathological findings, the physiological hypothesis, and the treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite and the technique employed are fundamental to success. PMID:19756187

  18. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA).

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-01-01

    We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SD(SE)): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SD(SE): 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice.
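    The core comparison step can be sketched as follows, with conventions assumed by me rather than taken from the paper: given the rigid transforms that map stem and bone from the first to the second CT scan, their composition in the stem's frame yields the stem-bone migration, reported as three translations and three rotations.

      import numpy as np

      def relative_stem_bone_motion(T_stem, T_bone):
          """T_*: 4x4 rigid transforms (scan 1 -> scan 2). Returns (translation, rotation in degrees)."""
          T_rel = np.linalg.inv(T_stem) @ T_bone          # bone motion expressed in the stem frame
          R, t = T_rel[:3, :3], T_rel[:3, 3]
          # Euler angles for the R = Rz @ Ry @ Rx convention (gimbal-lock cases ignored)
          ry = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
          rx = np.arctan2(R[2, 1], R[2, 2])
          rz = np.arctan2(R[1, 0], R[0, 0])
          return t, np.degrees([rx, ry, rz])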

  19. Life Origination Hydrate Hypothesis (LOH-Hypothesis

    Victor Ostrovskii

    2012-01-01

    Full Text Available The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs), which are N-bases, riboses, nucleosides, nucleotides, DNA- and RNA-like molecules, amino-acids, and proto-cells, repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction that is determined by a gradual decrease in the Gibbs free energy of reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix filled with LMSEs almost completely in its final state accounts for size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their "thermodynamic front" guide the gross process of living matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  20. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  1. On the Keyhole Hypothesis

    Mikkelsen, Kaare B.; Kidmose, Preben; Hansen, Lars Kai

    2017-01-01

    We propose and test the keyhole hypothesis that measurements from low-dimensional EEG, such as ear-EEG, reflect a broadly distributed set of neural processes. We formulate the keyhole hypothesis in information-theoretical terms. The experimental investigation is based on legacy data consisting of 10... simultaneously recorded scalp EEG. A cross-validation procedure was employed to ensure unbiased estimates. We present several pieces of evidence in support of the keyhole hypothesis: there is a high mutual information between data acquired at scalp electrodes and through the ear-EEG "keyhole"; furthermore we...

  2. Spatial analysis of soybean canopy response to soybean cyst nematodes (Heterodera glycines) in eastern Arkansas: An approach to future precision agriculture technology application

    Kulkarni, Subodh

    2008-10-01

    Heterodera glycines Ichinohe, commonly known as the soybean cyst nematode (SCN), is a serious, widespread pathogen of soybean in the US. The present research primarily investigated the feasibility of detecting SCN infestation in the field using aerial images and ground-level spectrometric sensing. Non-spatial and spatial linear regression analyses were performed to correlate SCN population densities with the Normalized Difference Vegetation Index (NDVI) and Green NDVI (GNDVI) derived from soybean canopy spectra. Field data were obtained from two fields, Field A and Field B, under different nematode control strategies in 2003 and 2004. Analysis of aerial image data from July 18, 2004 for Field A showed a significant relationship between the SCN population at planting and GNDVI (R2=0.17 at p=0.0006). Linear regression analysis revealed that SCN had little effect on yield (R2=0.14 at p=0.0001, RMSEP=1052.42 kg ha-1) and GNDVI (R2=0.17 at p=0.0006, RMSEP=0.087) derived from the aerial imagery on a single date. However, spatial regression analysis based on a spherical semivariogram showed that the RMSEP was 0.037 for GNDVI on July 18, 2004 and 427.32 kg ha-1 for yield on October 14, 2003, indicating better model performance. For the July 18, 2004 data from Field B, the relationship between NDVI and cyst counts at planting was significant (R2=0.5 at p=0.0468). Non-spatial analyses of the ground-level spectrometric data for the first field showed that NDVI and GNDVI were correlated with cyst counts at planting (R2=0.34 and 0.27 at p=0.0015 and 0.0127, respectively), and GNDVI was correlated with egg counts at planting (R2=0.27 at p=0.0118). Both NDVI and GNDVI were correlated with egg counts at flowering (R2=0.34 and 0.27 at p=0.0013 and 0.0018, respectively). However, a paired t-test to validate the above relationships showed that the predicted values of NDVI and GNDVI were significantly different. The statistical evidence suggested that variability in vegetation indices was caused
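    For reference, the two vegetation indices used above are simple band ratios; the sketch below assumes per-pixel reflectance arrays for the red, green, and near-infrared bands (the names are placeholders, not those of the original imagery or spectrometer).

      import numpy as np

      def ndvi(nir, red):
          return (nir - red) / (nir + red + 1e-12)       # small constant avoids division by zero

      def gndvi(nir, green):
          return (nir - green) / (nir + green + 1e-12)

    Either index can then be regressed against SCN population densities, as in the analyses summarised above.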

  3. Precision translator

    Reedy, Robert P.; Crawford, Daniel W.

    1984-01-01

    A precision translator for focusing a beam of light on the end of a glass fiber, which includes two tuning-fork-like members rigidly connected to each other. Each member has two prongs whose separation is adjusted by a screw, thereby adjusting the orthogonal positioning of a glass fiber attached to one of the members. The translator is made of simple parts and is capable of holding its adjustment even under rough handling.

  4. Different effects of the two types of spatial pre-cueing: what precisely is "attention" in Di Lollo's and Enns' substitution masking theory?

    Luiga, I; Bachmann, T

    2007-11-01

    Enns and Di Lollo [Psychological Science, 8 (2), 135-139, 1997] have introduced the object substitution theory of visual masking. Object substitution masking occurs when focusing attention on the target is delayed. However, Posner (Quarterly Journal of Experimental Psychology, 32, 3-25, 1980) has already shown that attention can be directed to a target at least in two ways: intentionally (endogenously) and automatically (exogenously). We conducted two experiments to explore the effects of endogenous and exogenous cues on substitution masking. The results showed that when attention was shifted to the target location automatically (using a local peripheral pre-cue), masking was attenuated. A decrease in target identification dependent on a delay of mask offset, typical to substitution masking, was not observed. However, strong substitution masking occurred when the target location was not pre-cued or when attention was directed to the target location intentionally (using a symbolic pre-cue displayed centrally). The hypothesis of two different mechanisms of attentional control in substitution masking was confirmed.

  5. Study of a particle detector with very high spatial precision (drift chambers), and analysis of the physical phenomena governing the operation of this detector

    Schultz, Guy.

    1976-01-01

    The physical principles of drift chambers are studied and various measurements which can be performed with these chambers are described. The laws governing the passage of particles through matter are first reviewed, and the transport coefficients (velocity, scattering coefficient, characteristic energy, ...) of electrons under the influence of an electric field are studied for different gases (argon, CO2, isobutane, methane, methylal). The theoretical predictions are then compared with the experimental results. The different amplification processes in the gas and the space-charge effect of the positive ions on electron multiplication at large particle fluxes are also studied, as well as the mobility of positive ions in different gases. Following these results, the operating characteristics (efficiency, linearity of the space-time relation, spatial resolution), with and without an external magnetic field, are determined. [fr]

  6. Precision Airdrop (Largage de precision)

    2005-12-01

    ... the point from various compass headings. As the tests are conducted, the resultant ... rate. This approach avoids including a magnetic compass for the heading reference, which has difficulties due to local changes in the magnetic field ...

  7. The Stoichiometric Divisome: A Hypothesis

    Waldemar eVollmer

    2015-05-01

    Full Text Available Dividing Escherichia coli cells simultaneously constrict the inner membrane, peptidoglycan layer and outer membrane to synthesize the new poles of the daughter cells. For this, more than 30 proteins localize to mid-cell where they form a large, ring-like assembly, the divisome, facilitating division. Although the precise function of most divisome proteins is unknown, it became apparent in recent years that dynamic protein-protein interactions are essential for divisome assembly and function. However, little is known about the nature of the interactions involved and the stoichiometry of the proteins within the divisome. A recent study (Li et al., 2014 used ribosome profiling to measure the absolute protein synthesis rates in E. coli. Interestingly, they observed that most proteins which participate in known multiprotein complexes are synthesized proportional to their stoichiometry. Based on this principle we present a hypothesis for the stoichiometry of the core of the divisome, taking into account known protein-protein interactions. From this hypothesis we infer a possible mechanism for PG synthesis during division.

  8. The Qualitative Expectations Hypothesis

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  9. The Qualitative Expectations Hypothesis

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  10. Revisiting the Dutch hypothesis

    Postma, Dirkje S.; Weiss, Scott T.; van den Berge, Maarten; Kerstjens, Huib A. M.; Koppelman, Gerard H.

    The Dutch hypothesis was first articulated in 1961, when many novel and advanced scientific techniques were not available, such as genomics techniques for pinpointing genes, gene expression, lipid and protein profiles, and the microbiome. In addition, computed tomographic scans and advanced analysis

  11. The Lehman Sisters Hypothesis

    I.P. van Staveren (Irene)

    2014-01-01

    This article explores the Lehman Sisters Hypothesis. It reviews empirical literature about gender differences in behavioral, experimental, and neuro-economics as well as in other fields of behavioral research. It discusses gender differences along three dimensions of

  12. Bayesian Hypothesis Testing

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ2 which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
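    As a generic illustration of a Bayes factor of the kind discussed in the slides (the model here is my own choice, not necessarily theirs): for a sample mean xbar of n observations with known variance sigma2, testing H0: mu = 0 against H1: mu ~ N(0, tau2).

      import numpy as np

      def bayes_factor_01(xbar, n, sigma2, tau2):
          """BF01 > 1 favours H0; BF01 < 1 favours H1."""
          v0 = sigma2 / n             # marginal variance of xbar under H0
          v1 = tau2 + sigma2 / n      # marginal variance of xbar under H1
          log_bf = 0.5 * (np.log(v1) - np.log(v0)) + 0.5 * xbar ** 2 * (1.0 / v1 - 1.0 / v0)
          return np.exp(log_bf)

    Scanning tau2 over a grid and taking the minimising value reproduces, in this toy setting, the kind of least-favorable choice mentioned above.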

  13. The Drift Burst Hypothesis

    Christensen, Kim; Oomen, Roel; Renò, Roberto

    2016-01-01

    The Drift Burst Hypothesis postulates the existence of short-lived locally explosive trends in the price paths of financial assets. The recent US equity and Treasury flash crashes can be viewed as two high profile manifestations of such dynamics, but we argue that drift bursts of varying magnitude are an expected and regular occurrence in financial markets that can arise through established mechanisms such as feedback trading. At a theoretical level, we show how to build drift bursts into the...

  14. Hypothesis in research

    Eudaldo Enrique Espinoza Freire

    2018-01-01

    Full Text Available This work aims to provide material covering the fundamental content that enables university professors to formulate a hypothesis for the development of an investigation, taking into account the problem to be solved. For its elaboration, information was gathered from primary documents, such as degree theses and reports of research results, selected on the basis of their relevance to the analyzed subject, currency, and reliability, and from secondary documents, such as scientific articles published in journals of recognized prestige, selected using the same criteria. The work presents an updated conceptualization of the hypothesis, its characterization, and an analysis of the structure of the hypothesis, with particular attention to the determination of variables. The involvement of the university professor in the teaching-research process currently faces some difficulties, which are manifested, among other aspects, in an unstable balance between teaching and research, leading to a separation between the two.

  15. [Dilemma of null hypothesis in ecological hypothesis's experiment test.

    Li, Ji

    2016-06-01

    Experimental testing is one of the major methods for testing ecological hypotheses, though there are many arguments about the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis-deduction model from Platt (1964) and thus stated that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent the statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and the alternative hypothesis H1' (α'=1, β'=0) in ecological processes differ from those in classical physics, the ecological null hypothesis cannot be strictly tested experimentally either. These dilemmas of the null hypothesis could be relieved via reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and two-tailed tests. However, statistical null hypothesis significance testing (NHST) should not be equated with the logical test of causality in an ecological hypothesis. Hence, findings and conclusions from methodological studies and experimental tests based on NHST are not always logically reliable.

  16. The Bergschrund Hypothesis Revisited

    Sanders, J. W.; Cuffey, K. M.; MacGregor, K. R.

    2009-12-01

    After Willard Johnson descended into the Lyell Glacier bergschrund nearly 140 years ago, he proposed that the presence of the bergschrund modulated daily air temperature fluctuations and enhanced freeze-thaw processes. He posited that glaciers, through their ability to birth bergschrunds, are thus able to induce rapid cirque headwall retreat. In subsequent years, many researchers challenged the bergschrund hypothesis on grounds that freeze-thaw events did not occur at depth in bergschrunds. We propose a modified version of Johnson’s original hypothesis: that bergschrunds maintain subfreezing temperatures at values that encourage rock fracture via ice lensing because they act as a cold air trap in areas that would otherwise be held near zero by temperate glacial ice. In support of this claim we investigated three sections of the bergschrund at the West Washmawapta Glacier, British Columbia, Canada, which sits in an east-facing cirque. During our bergschrund reconnaissance we installed temperature sensors at multiple elevations, light sensors at depth in 2 of the 3 locations and painted two 1 m2 sections of the headwall. We first emphasize bergschrunds are not wanting for ice: verglas covers significant fractions of the headwall and icicles dangle from the base of bödens or overhanging rocks. If temperature, rather than water availability, is the limiting factor governing ice-lensing rates, our temperature records demonstrate that the bergschrund provides a suitable environment for considerable rock fracture. At the three sites (north, west, and south walls), the average temperature at depth from 9/3/2006 to 8/6/2007 was -3.6, -3.6, and -2.0 °C, respectively. During spring, when we observed vast amounts of snow melt trickle in to the bergschrund, temperatures averaged -3.7, -3.8, and -2.2 °C, respectively. Winter temperatures are even lower: -8.5, -7.3, and -2.4 °C, respectively. Values during the following year were similar. During the fall, diurnal

  17. Is the Aluminum Hypothesis Dead?

    2014-01-01

    The Aluminum Hypothesis, the idea that aluminum exposure is involved in the etiology of Alzheimer disease, dates back to a 1965 demonstration that aluminum causes neurofibrillary tangles in the brains of rabbits. Initially the focus of intensive research, the Aluminum Hypothesis has gradually been abandoned by most researchers. Yet, despite this current indifference, the Aluminum Hypothesis continues to attract the attention of a small group of scientists and aluminum continues to be viewed with concern by some of the public. This review article discusses reasons that mainstream science has largely abandoned the Aluminum Hypothesis and explores a possible reason for some in the general public continuing to view aluminum with mistrust. PMID:24806729

  18. Multi-agent sequential hypothesis testing

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
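    A single-agent building block of such schemes is Wald's sequential probability ratio test; the sketch below implements it for two simple Gaussian hypotheses. The multi-agent Bayes-risk formulation with measurement, delay, and disagreement costs described above is not reproduced here.

      import numpy as np

      def sprt(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
          """Sequential test of H0: mean mu0 vs H1: mean mu1 for i.i.d. Gaussian data."""
          upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
          llr = 0.0
          for n, x in enumerate(samples, start=1):
              llr += (x * (mu1 - mu0) - 0.5 * (mu1 ** 2 - mu0 ** 2)) / sigma ** 2
              if llr >= upper:
                  return "accept H1", n
              if llr <= lower:
                  return "accept H0", n
          return "undecided", len(samples)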

  19. Hypothesis Designs for Three-Hypothesis Test Problems

    Yan Li; Xiaolong Pu

    2010-01-01

    As a helpful guide for applications, the alternative hypotheses of the three-hypothesis test problems are designed under the required error probabilities and average sample number in this paper. The asymptotic formulas and the proposed numerical quadrature formulas are adopted, respectively, to obtain the hypothesis designs and the corresponding sequential test schemes under the Koopman-Darmois distributions. The example of the normal mean test shows that our methods are qu...

  20. Tests of the lunar hypothesis

    Taylor, S. R.

    1984-01-01

    The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.

  1. Evaluating the Stage Learning Hypothesis.

    Thomas, Hoben

    1980-01-01

    A procedure for evaluating the Genevan stage learning hypothesis is illustrated by analyzing Inhelder, Sinclair, and Bovet's guided learning experiments (in "Learning and the Development of Cognition." Cambridge: Harvard University Press, 1974). (Author/MP)

  2. The Purchasing Power Parity Hypothesis:

    2011-10-02

    Oct 2, 2011 ... reject the unit root hypothesis in real exchange rates may simply be due to the shortness ..... Violations of Purchasing Power Parity and Their Implications for Efficient ... Official Intervention in the Foreign Exchange Market:.

  3. Comparing Spatial Predictions

    Hering, Amanda S.

    2011-11-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis is that of no difference, and a spatial loss differential is created based on the observed data, the two sets of predictions, and the loss function chosen by the researcher. The test assumes only isotropy and short-range spatial dependence of the loss differential but does allow it to be non-Gaussian, non-zero-mean, and spatially correlated. Constant and nonconstant spatial trends in the loss differential are treated in two separate cases. Monte Carlo simulations illustrate the size and power properties of this test, and an example based on daily average wind speeds in Oklahoma is used for illustration. Supplemental results are available online. © 2011 American Statistical Association and the American Society for Quality.
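    The following sketch conveys the flavour of such a test under simplifying assumptions of my own: a squared-error loss differential is formed at each site, sites are grouped into spatial blocks, and the block means are treated as approximately independent when testing whether the mean differential is zero. The published test uses a more careful estimate of the loss-differential covariance.

      import numpy as np
      from scipy import stats

      def spatial_loss_diff_test(coords, obs, pred1, pred2, block_size):
          """coords: (n, 2) site locations; returns (t statistic, two-sided p-value)."""
          obs, pred1, pred2 = map(np.asarray, (obs, pred1, pred2))
          D = (obs - pred1) ** 2 - (obs - pred2) ** 2           # loss differential at each site
          cells = np.floor(np.asarray(coords) / block_size).astype(int)
          _, labels = np.unique(cells, axis=0, return_inverse=True)
          block_means = np.array([D[labels == k].mean() for k in range(labels.max() + 1)])
          m = len(block_means)
          t = block_means.mean() / (block_means.std(ddof=1) / np.sqrt(m))
          p = 2.0 * stats.t.sf(abs(t), df=m - 1)
          return t, p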

  4. A test of the reward-value hypothesis.

    Smith, Alexandra E; Dalecki, Stefan J; Crystal, Jonathon D

    2017-03-01

    Rats retain source memory (memory for the origin of information) over a retention interval of at least 1 week, whereas their spatial working memory (radial maze locations) decays within approximately 1 day. We have argued that different forgetting functions dissociate memory systems. However, the two tasks, in our previous work, used different reward values. The source memory task used multiple pellets of a preferred food flavor (chocolate), whereas the spatial working memory task provided access to a single pellet of standard chow-flavored food at each location. Thus, according to the reward-value hypothesis, enhanced performance in the source memory task stems from enhanced encoding/memory of a preferred reward. We tested the reward-value hypothesis by using a standard 8-arm radial maze task to compare spatial working memory accuracy of rats rewarded with either multiple chocolate or chow pellets at each location using a between-subjects design. The reward-value hypothesis predicts superior accuracy for high-valued rewards. We documented equivalent spatial memory accuracy for high- and low-value rewards. Importantly, a 24-h retention interval produced equivalent spatial working memory accuracy for both flavors. These data are inconsistent with the reward-value hypothesis and suggest that reward value does not explain our earlier findings that source memory survives unusually long retention intervals.

  5. Hypothesis Testing Using Spatially Dependent Heavy Tailed Multisensor Data

    2014-12-01


  6. Large numbers hypothesis. II - Electromagnetic radiation

    Adams, P. J.

    1983-01-01

    This paper develops the theory of electromagnetic radiation in the units covariant formalism incorporating Dirac's large numbers hypothesis (LNH). A direct field-to-particle technique is used to obtain the photon propagation equation which explicitly involves the photon replication rate. This replication rate is fixed uniquely by requiring that the form of a free-photon distribution function be preserved, as required by the 2.7 K cosmic radiation. One finds that with this particular photon replication rate the units covariant formalism developed in Paper I actually predicts that the ratio of photon number to proton number in the universe varies as t to the 1/4, precisely in accord with LNH. The cosmological red-shift law is also derived and it is shown to differ considerably from the standard form of νR = const.

  7. Precision oncology: origins, optimism, and potential.

    Prasad, Vinay; Fojo, Tito; Brada, Michael

    2016-02-01

    Imatinib, the first and arguably the best targeted therapy, became the springboard for developing drugs aimed at molecular targets deemed crucial to tumours. As this development unfolded, a revolution in the speed and cost of genetic sequencing occurred. The result--an armamentarium of drugs and an array of molecular targets--set the stage for precision oncology, a hypothesis that cancer treatment could be markedly improved if therapies were guided by a tumour's genomic alterations. Drawing lessons from the biological basis of cancer and recent empirical investigations, we take a more measured view of precision oncology's promise. Ultimately, the promise is not our concern, but the threshold at which we declare success. We review reports of precision oncology alongside those of precision diagnostics and novel radiotherapy approaches. Although confirmatory evidence is scarce, these interventions have been widely endorsed. We conclude that the current path will probably not be successful or, at a minimum, will have to undergo substantive adjustments before it can be successful. For the sake of patients with cancer, we hope one form of precision oncology will deliver on its promise. However, until confirmatory studies are completed, precision oncology remains unproven, and as such, a hypothesis in need of rigorous testing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. The atomic hypothesis: physical consequences

    Rivas, Martin

    2008-01-01

    The hypothesis that matter is made of some ultimate and indivisible objects, together with the restricted relativity principle, establishes a constraint on the kind of variables we are allowed to use for the variational description of elementary particles. We consider that the atomic hypothesis not only states the indivisibility of elementary particles, but also that these ultimate objects, if not annihilated, cannot be modified by any interaction so that all allowed states of an elementary particle are only kinematical modifications of any one of them. Therefore, an elementary particle cannot have excited states. In this way, the kinematical group of spacetime symmetries not only defines the symmetries of the system, but also the variables in terms of which the mathematical description of the elementary particles can be expressed in either the classical or the quantum mechanical description. When considering the interaction of two Dirac particles, the atomic hypothesis restricts the interaction Lagrangian to a kind of minimal coupling interaction

  9. Multiple sclerosis: a geographical hypothesis.

    Carlyle, I P

    1997-12-01

    Multiple sclerosis remains a rare neurological disease of unknown aetiology, with a unique distribution, both geographically and historically. Rare in equatorial regions, it becomes increasingly common in higher latitudes; historically, it was first clinically recognized in the early nineteenth century. A hypothesis, based on geographical reasoning, is here proposed: that the disease is the result of a specific vitamin deficiency. Different individuals suffer the deficiency in separate and often unique ways. Evidence to support the hypothesis exists in cultural considerations, in the global distribution of the disease, and in its historical prevalence.

  10. Discussion of the Porter hypothesis

    1999-11-01

    In reaction to the long-range vision of RMNO, published in 1996, the Dutch government asked whether a far-reaching and progressive modernization policy would lead to competitive advantages for high-quality products on partly new markets. This question is connected to the so-called Porter hypothesis: 'By stimulating innovation, strict environmental regulations can actually enhance competitiveness', from which it can be concluded that environment and economy can work together quite well. A literature study was carried out to determine under which conditions that hypothesis is endorsed in the scientific literature and policy documents. Recommendations are given for further studies.

  11. The thrifty phenotype hypothesis revisited

    Vaag, A A; Grunnet, L G; Arora, G P

    2012-01-01

    Twenty years ago, Hales and Barker along with their co-workers published some of their pioneering papers proposing the 'thrifty phenotype hypothesis' in Diabetologia (4;35:595-601 and 3;36:62-67). Their postulate that fetal programming could represent an important player in the origin of type 2...... of the underlying molecular mechanisms. Type 2 diabetes is a multiple-organ disease, and developmental programming, with its idea of organ plasticity, is a plausible hypothesis for a common basis for the widespread organ dysfunctions in type 2 diabetes and the metabolic syndrome. Only two among the 45 known type 2...

  12. The newest precision measurement

    Lee, Jing Gu; Lee, Jong Dae

    1974-05-01

    This book introduces the basics of precision measurement, measurement of length, limit gauges, measurement of angles, measurement of surface roughness, measurement of shapes and locations, measurement of outlines, measurement of external and internal threads, gear testing, accuracy inspection of machine tools, three-dimensional coordinate measuring machines, digitalisation of precision measurement, automation of precision measurement, measurement of cutting tools, measurement using lasers, and key points in choosing a length-measuring instrument.

  13. Practical precision measurement

    Kwak, Ho Chan; Lee, Hui Jun

    1999-01-01

    This book introduces basic knowledge of precision measurement, measurement of length, precision measurement of minor diameters, measurement of angles, measurement of surface roughness, three-dimensional measurement, measurement of locations and shapes, measurement of screws, gear testing, cutting tool testing, rolling bearing testing, and digitalisation of measurement. It covers height gauges, how to test surface roughness, measurement of flatness and straightness, external and internal thread testing, gear tooth measurement, milling cutters, taps, rotation precision measurement, and optical transducers.

  14. [Precision and personalized medicine].

    Sipka, Sándor

    2016-10-01

    The author describes the concept of "personalized medicine" and the newly introduced "precision medicine". "Precision medicine" applies the terms of "phenotype", "endotype" and "biomarker" in order to characterize more precisely the various diseases. Using "biomarkers" the homogeneous type of a disease (a "phenotype") can be divided into subgroups called "endotypes" requiring different forms of treatment and financing. The good results of "precision medicine" have become especially apparent in relation with allergic and autoimmune diseases. The application of this new way of thinking is going to be necessary in Hungary, too, in the near future for participants, controllers and financing boards of healthcare. Orv. Hetil., 2016, 157(44), 1739-1741.

  15. Precision Clock Evaluation Facility

    Federal Laboratory Consortium — FUNCTION: Tests and evaluates high-precision atomic clocks for spacecraft, ground, and mobile applications. Supports performance evaluation, environmental testing,...

  16. Questioning the social intelligence hypothesis.

    Holekamp, Kay E

    2007-02-01

    The social intelligence hypothesis posits that complex cognition and enlarged "executive brains" evolved in response to challenges that are associated with social complexity. This hypothesis has been well supported, but some recent data are inconsistent with its predictions. It is becoming increasingly clear that multiple selective agents, and non-selective constraints, must have acted to shape cognitive abilities in humans and other animals. The task now is to develop a larger theoretical framework that takes into account both inter-specific differences and similarities in cognition. This new framework should facilitate consideration of how selection pressures that are associated with sociality interact with those that are imposed by non-social forms of environmental complexity, and how both types of functional demands interact with phylogenetic and developmental constraints.

  17. Whiplash and the compensation hypothesis.

    Spearing, Natalie M; Connelly, Luke B

    2011-12-01

    Review article. To explain why the evidence that compensation-related factors lead to worse health outcomes is not compelling, either in general, or in the specific case of whiplash. There is a common view that compensation-related factors lead to worse health outcomes ("the compensation hypothesis"), despite the presence of important, and unresolved sources of bias. The empirical evidence on this question has ramifications for the design of compensation schemes. Using studies on whiplash, this article outlines the methodological problems that impede attempts to confirm or refute the compensation hypothesis. Compensation studies are prone to measurement bias, reverse causation bias, and selection bias. Errors in measurement are largely due to the latent nature of whiplash injuries and health itself, a lack of clarity over the unit of measurement (specific factors, or "compensation"), and a lack of appreciation for the heterogeneous qualities of compensation-related factors and schemes. There has been a failure to acknowledge and empirically address reverse causation bias, or the likelihood that poor health influences the decision to pursue compensation: it is unclear if compensation is a cause or a consequence of poor health, or both. Finally, unresolved selection bias (and hence, confounding) is evident in longitudinal studies and natural experiments. In both cases, between-group differences have not been addressed convincingly. The nature of the relationship between compensation-related factors and health is unclear. Current approaches to testing the compensation hypothesis are prone to several important sources of bias, which compromise the validity of their results. Methods that explicitly test the hypothesis and establish whether or not a causal relationship exists between compensation factors and prolonged whiplash symptoms are needed in future studies.

  18. Precision machining commercialization

    1978-01-01

    To accelerate precision machining development so as to realize more of the potential savings within the next few years of known Department of Defense (DOD) part procurement, the Air Force Materials Laboratory (AFML) is sponsoring the Precision Machining Commercialization Project (PMC). PMC is part of the Tri-Service Precision Machine Tool Program of the DOD Manufacturing Technology Five-Year Plan. The technical resources supporting PMC are provided under sponsorship of the Department of Energy (DOE). The goal of PMC is to minimize precision machining development time and cost risk for interested vendors. PMC will do this by making available the high precision machining technology as developed in two DOE contractor facilities, the Lawrence Livermore Laboratory of the University of California and the Union Carbide Corporation, Nuclear Division, Y-12 Plant, at Oak Ridge, Tennessee

  19. High-precision positioning of radar scatterers

    Dheenathayalan, P.; Small, D.; Schubert, A.; Hanssen, R.F.

    2016-01-01

    Remote sensing radar satellites cover wide areas and provide spatially dense measurements, with millions of scatterers. Knowledge of the precise position of each radar scatterer is essential to identify the corresponding object and interpret the estimated deformation. The absolute position accuracy

  20. A Molecular–Structure Hypothesis

    Jan C. A. Boeyens

    2010-11-01

    The self-similar symmetry that occurs between atomic nuclei, biological growth structures, the solar system, globular clusters and spiral galaxies suggests that a similar pattern should characterize atomic and molecular structures. This possibility is explored in terms of the current molecular-structure hypothesis and its extension into four-dimensional space-time. It is concluded that a quantum molecule only has structure in four dimensions and that classical (Newtonian) structure, which occurs in three dimensions, cannot be simulated by quantum-chemical computation.

  1. Antiaging therapy: a prospective hypothesis

    Shahidi Bonjar MR

    2015-01-01

    Mohammad Rashid Shahidi Bonjar, School of Dentistry, Kerman University of Medical Sciences, Kerman, Iran; Leyla Shahidi Bonjar, Department of Pharmacology, College of Pharmacy, Kerman University of Medical Sciences, Kerman, Iran. Abstract: This hypothesis proposes a new prospective approach to slow the aging process in older humans. The hypothesis could lead to developing new treatments for age-related illnesses and help humans to live longer. This hypothesis has no previous documentation in scientific media and has no protocol. Scientists have presented evidence that systemic aging is influenced by peculiar molecules in the blood. Researchers at Albert Einstein College of Medicine, New York, and Harvard University in Cambridge discovered an elevated titer of aging-related molecules (ARMs) in blood, which trigger a cascade of aging processes in mice; they also indicated that the process can be reduced or even reversed. By inhibiting the production of ARMs, they could reduce age-related cognitive and physical declines. The present hypothesis offers a new approach to translate these findings into medical treatment: extracorporeal adjustment of ARMs would lead to slower rates of aging. A prospective “antiaging blood filtration column” (AABFC) is a nanotechnological device that would fulfill the central role in this approach. An AABFC would set a near-youth homeostatic titer of ARMs in the blood. In this regard, the AABFC immobilizes ARMs from the blood while blood passes through the column. The AABFC harbors antibodies against ARMs. ARM antibodies would be conjugated irreversibly to ARMs on contact surfaces of the reaction platforms inside the AABFC till near-youth homeostasis is attained. The treatment is performed with the aid of a blood-circulating pump. Similar to a renal dialysis machine, blood would circulate from the body to the AABFC and from there back to the body in a closed circuit until ARMs were sufficiently depleted from the blood. The

  2. A test of the reward-contrast hypothesis.

    Dalecki, Stefan J; Panoz-Brown, Danielle E; Crystal, Jonathon D

    2017-12-01

    Source memory, a facet of episodic memory, is the memory of the origin of information. Whereas source memory in rats is sustained for at least a week, spatial memory degraded after approximately a day. Different forgetting functions may suggest that two memory systems (source memory and spatial memory) are dissociated. However, in previous work, the two tasks used baiting conditions consisting of chocolate and chow flavors; notably, the source memory task used the relatively better flavor. Thus, according to the reward-contrast hypothesis, when chocolate and chow were presented within the same context (i.e., within a single radial maze trial), the chocolate location was more memorable than the chow location because of contrast. We tested the reward-contrast hypothesis using baiting configurations designed to produce reward-contrast. The reward-contrast hypothesis predicts that under these conditions, spatial memory will survive a 24-h retention interval. We documented elimination of spatial memory performance after a 24-h retention interval using a reward-contrast baiting pattern. These data suggest that reward contrast does not explain our earlier findings that source memory survives unusually long retention intervals. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Is PMI the Hypothesis or the Null Hypothesis?

    Tarone, Aaron M; Sanford, Michelle R

    2017-09-01

    Over the past several decades, there have been several strident exchanges regarding whether forensic entomologists estimate the postmortem interval (PMI), minimum PMI, or something else. During that time, there has been a proliferation of terminology reflecting this concern regarding "what we do." This has been a frustrating conversation for some in the community because much of this debate appears to be centered on what assumptions are acknowledged directly and which are embedded within a list of assumptions (or ignored altogether) in the literature and in case reports. An additional component of the conversation centers on a concern that moving away from the use of certain terminology like PMI acknowledges limitations and problems that would make the application of entomology appear less useful in court-a problem for lawyers, but one that should not be problematic for scientists in the forensic entomology community, as uncertainty is part of science that should and can be presented effectively in the courtroom (e.g., population genetic concepts in forensics). Unfortunately, a consequence of the way this conversation is conducted is that even as all involved in the debate acknowledge the concerns of their colleagues, parties continue to talk past one another advocating their preferred terminology. Progress will not be made until the community recognizes that all of the terms under consideration take the form of null hypothesis statements and that thinking about "what we do" as a null hypothesis has useful legal and scientific ramifications that transcend arguments over the usage of preferred terminology. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Precision digital control systems

    Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.

    This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.

  5. LEP precision results

    Kawamoto, T

    2001-01-01

    Precision measurements at LEP are reviewed, with main focus on the electroweak measurements and tests of the Standard Model. Constraints placed by the LEP measurements on possible new physics are also discussed.

  6. Description of precision colorimeter

    Campos Acosta, Joaquín; Pons Aglio, Alicia; Corróns, Antonio

    1987-01-01

    Describes the use of a fully automatic, computer-controlled absolute spectroradiometer as a precision colorimeter. The chromaticity coordinates of several types of light sources have been obtained with this measurement system.

  7. NCI Precision Medicine

    This illustration represents the National Cancer Institute’s support of research to improve precision medicine in cancer treatment, in which unique therapies treat an individual’s cancer based on specific genetic abnormalities of that person’s tumor.

  8. Laser precision microfabrication

    Sugioka, Koji; Pique, Alberto

    2010-01-01

    Miniaturization and high precision are rapidly becoming a requirement for many industrial processes and products. As a result, there is greater interest in the use of laser microfabrication technology to achieve these goals. This book, composed of 16 chapters, covers all the topics of laser precision processing, from fundamental aspects to industrial applications, for both inorganic and biological materials. It reviews the state of the art of research and technological development in the area of laser processing.

  9. The Stem Cell Hypothesis of Aging

    Anna Meiliana

    2010-04-01

    BACKGROUND: There is probably no single way to age. Indeed, so far there is no single accepted explanation or mechanism of aging (although more than 300 theories have been proposed). There is an overall decline in tissue regenerative potential with age, and the question arises as to whether this is due to the intrinsic aging of stem cells or rather to the impairment of stem cell function in the aged tissue environment. CONTENT: Recent data suggest that we age, in part, because our self-renewing stem cells grow old as a result of heritable intrinsic events, such as DNA damage, as well as extrinsic forces, such as changes in their supporting niches. Mechanisms that suppress the development of cancer, such as senescence and apoptosis, which rely on telomere shortening and the activities of p53 and p16INK4a, may also induce an unwanted consequence: a decline in the replicative function of certain stem cell types with advancing age. This decreased regenerative capacity points to the stem cell hypothesis of aging. SUMMARY: Recent evidence suggests that we grow old partly because our stem cells grow old as a result of mechanisms that suppress the development of cancer over a lifetime. We believe that a further, more precise mechanistic understanding of this process will be required before this knowledge can be translated into human anti-aging therapies. KEYWORDS: stem cells, senescence, telomere, DNA damage, epigenetic, aging.

  10. Influence of local topography on precision irrigation management

    Precision irrigation management is currently accomplished using spatial information about soil properties through soil series maps or electrical conductivity (EC) measurements. Crop yield, however, is consistently influenced by local topography, both in rain-fed and irrigated environments. Utilizing ...

  11. The large numbers hypothesis and a relativistic theory of gravitation

    Lau, Y.K.; Prokhovnik, S.J.

    1986-01-01

    A way to reconcile Dirac's large numbers hypothesis and Einstein's theory of gravitation was recently suggested by Lau (1985). It is characterized by the conjecture of a time-dependent cosmological term and gravitational term in Einstein's field equations. Motivated by this conjecture and the large numbers hypothesis, we formulate here a scalar-tensor theory in terms of an action principle. The cosmological term is required to be spatially dependent as well as time dependent in general. The theory developed is applied to a cosmological model compatible with the large numbers hypothesis. The time-dependent form of the cosmological term and the scalar potential are then deduced. A possible explanation of the smallness of the cosmological term is also given and the possible significance of the scalar field is speculated.

  12. On Using Taylor's Hypothesis for Three-Dimensional Mixing Layers

    LeBoeuf, Richard L.; Mehta, Rabindra D.

    1995-01-01

    In the present study, errors in using Taylor's hypothesis to transform measurements obtained in a temporal (or phase) frame onto a spatial one were evaluated. For the first time, phase-averaged ('real') spanwise and streamwise vorticity data measured on a three-dimensional grid were compared directly to those obtained using Taylor's hypothesis. The results show that even the qualitative features of the spanwise and streamwise vorticity distributions given by the two techniques can be very different. This is particularly true in the region of the spanwise roller pairing. The phase-averaged spanwise and streamwise peak vorticity levels given by Taylor's hypothesis are typically lower (by up to 40%) compared to the real measurements.
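
    Taylor's hypothesis maps a time series measured at a fixed probe onto a spatial profile by assuming the flow pattern is "frozen" and convected past the probe at a constant velocity. The sketch below illustrates that mapping only; the convection velocity, sampling rate and signal are invented for illustration and are not data from the study.

```python
# Minimal sketch of Taylor's frozen-turbulence hypothesis: a time series u(t)
# at a fixed point is re-interpreted as a spatial profile u(x), x = x0 - Uc*(t - t0).
# Uc, fs and the synthetic signal are illustrative assumptions.
import numpy as np

fs = 10_000.0                               # probe sampling rate [Hz] (assumed)
Uc = 12.0                                   # assumed convection velocity [m/s]
t = np.arange(0, 0.1, 1.0 / fs)             # 0.1 s record
rng = np.random.default_rng(1)
u_t = np.sin(2 * np.pi * 150 * t) + 0.3 * rng.normal(size=t.size)

x = -Uc * t                                 # Taylor mapping: later samples lie upstream
u_x = u_t                                   # same values, now indexed by position
print("implied spatial wavelength of the 150 Hz component:", Uc / 150.0, "m")
```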

  13. Precision Experiments at LEP

    de Boer, Wim

    2015-01-01

    The Large Electron Positron Collider (LEP) established the Standard Model (SM) of particle physics with unprecedented precision, including all its radiative corrections. These led to predictions for the masses of the top quark and Higgs boson, which were beautifully confirmed later on. After these precision measurements the Nobel Prize in Physics was awarded in 1999 jointly to 't Hooft and Veltman "for elucidating the quantum structure of electroweak interactions in physics". Another hallmark of the LEP results was the precise measurement of the gauge coupling constants, which excluded unification of the forces within the SM, but allowed unification within the supersymmetric extension of the SM. This increased the interest in Supersymmetry (SUSY) and Grand Unified Theories, especially since the SM has no candidate for the elusive dark matter, while Supersymmetry provides an excellent candidate for dark matter. In addition, Supersymmetry removes the quadratic divergencies of the SM and predicts the Hig...

  14. Precision muonium spectroscopy

    Jungmann, Klaus P.

    2016-01-01

    The muonium atom is the purely leptonic bound state of a positive muon and an electron. It has a lifetime of 2.2 µs. The absence of any known internal structure provides for precision experiments to test fundamental physics theories and to determine accurate values of fundamental constants. In particular ground state hyperfine structure transitions can be measured by microwave spectroscopy to deliver the muon magnetic moment. The frequency of the 1s–2s transition in the hydrogen-like atom can be determined with laser spectroscopy to obtain the muon mass. With such measurements fundamental physical interactions, in particular quantum electrodynamics, can also be tested at highest precision. The results are important input parameters for experiments on the muon magnetic anomaly. The simplicity of the atom enables further precise experiments, such as a search for muonium–antimuonium conversion for testing charged lepton number conservation and searches for possible antigravity of muons and dark matter. (author)

  15. Precision genome editing

    Steentoft, Catharina; Bennett, Eric P; Schjoldager, Katrine Ter-Borch Gram

    2014-01-01

    Precise and stable gene editing in mammalian cell lines has until recently been hampered by the lack of efficient targeting methods. While different gene silencing strategies have had tremendous impact on many biological fields, they have generally not been applied with wide success in the field...... of glycobiology, primarily due to their low efficiencies, with resultant failure to impose substantial phenotypic consequences upon the final glycosylation products. Here, we review novel nuclease-based precision genome editing techniques enabling efficient and stable gene editing, including gene disruption...... by introducing single or double-stranded breaks at a defined genomic sequence. We here compare and contrast the different techniques and summarize their current applications, highlighting cases from the field of glycobiology as well as pointing to future opportunities. The emerging potential of precision gene...

  16. Memory in astrocytes: a hypothesis

    Caudle Robert M

    2006-01-01

    Background: Recent work has indicated an increasingly complex role for astrocytes in the central nervous system. Astrocytes are now known to exchange information with neurons at synaptic junctions and to alter the information processing capabilities of the neurons. As an extension of this trend a hypothesis was proposed that astrocytes function to store information. To explore this idea the ion channels in biological membranes were compared to models known as cellular automata. These comparisons were made to test the hypothesis that ion channels in the membranes of astrocytes form a dynamic information storage device. Results: Two-dimensional cellular automata were found to behave similarly to ion channels in a membrane when they function at the boundary between order and chaos. The length of time information is stored in this class of cellular automata is exponentially related to the number of units. Therefore the length of time biological ion channels store information was plotted versus the estimated number of ion channels in the tissue. This analysis indicates that there is an exponential relationship between memory and the number of ion channels. Extrapolation of this relationship to the estimated number of ion channels in the astrocytes of a human brain indicates that memory can be stored in this system for an entire life span. Interestingly, this information is not affixed to any physical structure, but is stored as an organization of the activity of the ion channels. Further analysis of two-dimensional cellular automata also demonstrates that these systems have both associative and temporal memory capabilities. Conclusion: It is concluded that astrocytes may serve as a dynamic information sink for neurons. The memory in the astrocytes is stored by organizing the activity of ion channels and is not associated with a physical location such as a synapse. In order for this form of memory to be of significant duration it is necessary
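
    The comparison drawn in this abstract between membrane ion channels and two-dimensional cellular automata can be illustrated with a toy simulation. The sketch below is only a stand-in: the update rule (a Life-like outer-totalistic rule), the grid sizes and the "persistence" measure are assumptions chosen for demonstration, not the authors' model.

```python
# Toy 2D binary cellular automaton: write one "bit" by flipping a cell, then
# count how many update steps the perturbation remains distinguishable.
import numpy as np

def step(grid):
    """Synchronous update with a Life-like rule: a cell is on next step if it has
    exactly 3 on neighbours, or if it is on and has exactly 2 on neighbours."""
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

def persistence(size, max_steps=500, seed=0):
    """Steps until a single flipped cell no longer changes the trajectory."""
    rng = np.random.default_rng(seed)
    base = rng.integers(0, 2, (size, size), dtype=np.uint8)
    perturbed = base.copy()
    perturbed[size // 2, size // 2] ^= 1      # store one bit of information
    for t in range(1, max_steps + 1):
        base, perturbed = step(base), step(perturbed)
        if np.array_equal(base, perturbed):
            return t                          # the stored bit has been lost
    return max_steps

for size in (8, 16, 32):
    print(size * size, "units ->", persistence(size), "steps")
```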

  17. Precision electron polarimetry

    Chudakov, E.

    2013-01-01

    A new generation of precise Parity-Violating experiments will require a sub-percent accuracy of electron beam polarimetry. Compton polarimetry can provide such accuracy at high energies, but at a few hundred MeV the small analyzing power limits the sensitivity. Møller polarimetry provides a high analyzing power independent of the beam energy, but is limited by the properties of the polarized targets commonly used. Options for precision polarimetry at 300 MeV will be discussed, in particular a proposal to use ultra-cold atomic hydrogen traps to provide a 100%-polarized electron target for Møller polarimetry.

  18. A passion for precision

    CERN. Geneva. Audiovisual Unit

    2006-01-01

    For more than three decades, the quest for ever higher precision in laser spectroscopy of the simple hydrogen atom has inspired many advances in laser, optical, and spectroscopic techniques, culminating in femtosecond laser optical frequency combs as perhaps the most precise measuring tools known to man. Applications range from optical atomic clocks and tests of QED and relativity to searches for time variations of fundamental constants. Recent experiments are extending frequency comb techniques into the extreme ultraviolet. Laser frequency combs can also control the electric field of ultrashort light pulses, creating powerful new tools for the emerging field of attosecond science.

  19. Improving Precision of Types

    Winther, Johnni

    Types in programming languages provide a powerful tool for the programmer to document the code so that a large aspect of the intent can not only be presented to fellow programmers but also be checked automatically by compilers. The precision with which types model the behavior of programs...... is crucial to the quality of these automated checks, and in this thesis we present three different improvements to the precision of types in three different aspects of the Java programming language. First we show how to extend the type system in Java with a new type which enables the detection of unintended...

  20. Precision pharmacology for Alzheimer's disease.

    Hampel, Harald; Vergallo, Andrea; Aguilar, Lisi Flores; Benda, Norbert; Broich, Karl; Cuello, A Claudio; Cummings, Jeffrey; Dubois, Bruno; Federoff, Howard J; Fiandaca, Massimo; Genthon, Remy; Haberkamp, Marion; Karran, Eric; Mapstone, Mark; Perry, George; Schneider, Lon S; Welikovitch, Lindsay A; Woodcock, Janet; Baldacci, Filippo; Lista, Simone

    2018-04-01

    The complex multifactorial nature of polygenic Alzheimer's disease (AD) presents significant challenges for drug development. AD pathophysiology is progressing in a non-linear dynamic fashion across multiple systems levels - from molecules to organ systems - and through adaptation, to compensation, and decompensation to systems failure. Adaptation and compensation maintain homeostasis: a dynamic equilibrium resulting from the dynamic non-linear interaction between genome, epigenome, and environment. An individual vulnerability to stressors exists on the basis of individual triggers, drivers, and thresholds accounting for the initiation and failure of adaptive and compensatory responses. Consequently, the distinct pattern of AD pathophysiology in space and time must be investigated on the basis of the individual biological makeup. This requires the implementation of systems biology and neurophysiology to facilitate Precision Medicine (PM) and Precision Pharmacology (PP). The regulation of several processes at multiple levels of complexity from gene expression to cellular cycle to tissue repair and system-wide network activation has different time delays (temporal scale) according to the affected systems (spatial scale). The initial failure might originate and occur at every level potentially affecting the whole dynamic interrelated systems within an organism. Unraveling the spatial and temporal dynamics of non-linear pathophysiological mechanisms across the continuum of hierarchical self-organized systems levels and from systems homeostasis to systems failure is key to understand AD. Measuring and, possibly, controlling space- and time-scaled adaptive and compensatory responses occurring during AD will represent a crucial step to achieve the capacity to substantially modify the disease course and progression at the best suitable timepoints, thus counteracting disrupting critical pathophysiological inputs. This approach will provide the conceptual basis for effective

  1. Robust and distributed hypothesis testing

    Gül, Gökhan

    2017-01-01

    This book generalizes and extends the available theory in robust and decentralized hypothesis testing. In particular, it presents a robust test for modeling errors which is independent of the assumptions that a sufficiently large number of samples is available, and that the distance is the KL-divergence. Here, the distance can be chosen from a much more general model, which includes the KL-divergence as a very special case. This is then extended by various means. A minimax robust test that is robust against both outliers as well as modeling errors is presented. Minimax robustness properties of the given tests are also explicitly proven for fixed sample size and sequential probability ratio tests. The theory of robust detection is extended to robust estimation and the theory of robust distributed detection is extended to classes of distributions, which are not necessarily stochastically bounded. It is shown that the quantization functions for the decision rules can also be chosen as non-monotone. Finally, the boo...

  2. The venom optimization hypothesis revisited.

    Morgenstern, David; King, Glenn F

    2013-03-01

    Animal venoms are complex chemical mixtures that typically contain hundreds of proteins and non-proteinaceous compounds, resulting in a potent weapon for prey immobilization and predator deterrence. However, because venoms are protein-rich, they come with a high metabolic price tag. The metabolic cost of venom is sufficiently high to result in secondary loss of venom whenever its use becomes non-essential to survival of the animal. The high metabolic cost of venom leads to the prediction that venomous animals may have evolved strategies for minimizing venom expenditure. Indeed, various behaviors have been identified that appear consistent with frugality of venom use. This has led to formulation of the "venom optimization hypothesis" (Wigger et al. (2002) Toxicon 40, 749-752), also known as "venom metering", which postulates that venom is metabolically expensive and therefore used frugally through behavioral control. Here, we review the available data concerning economy of venom use by animals with either ancient or more recently evolved venom systems. We conclude that the convergent nature of the evidence in multiple taxa strongly suggests the existence of evolutionary pressures favoring frugal use of venom. However, there remains an unresolved dichotomy between this economy of venom use and the lavish biochemical complexity of venom, which includes a high degree of functional redundancy. We discuss the evidence for biochemical optimization of venom as a means of resolving this conundrum. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Alien abduction: a medical hypothesis.

    Forrest, David V

    2008-01-01

    In response to a new psychological study of persons who believe they have been abducted by space aliens that found that sleep paralysis, a history of being hypnotized, and preoccupation with the paranormal and extraterrestrial were predisposing experiences, I noted that many of the frequently reported particulars of the abduction experience bear more than a passing resemblance to medical-surgical procedures and propose that experience with these may also be contributory. There is the altered state of consciousness, uniformly colored figures with prominent eyes, in a high-tech room under a round bright saucerlike object; there is nakedness, pain and a loss of control while the body's boundaries are being probed; and yet the figures are thought benevolent. No medical-surgical history was apparently taken in the above mentioned study, but psychological laboratory work evaluated false memory formation. I discuss problems in assessing intraoperative awareness and ways in which the medical hypothesis could be elaborated and tested. If physicians are causing this syndrome in a percentage of patients, we should know about it; and persons who feel they have been abducted should be encouraged to inform their surgeons and anesthesiologists without challenging their beliefs.

  4. The oxidative hypothesis of senescence

    Gilca M

    2007-01-01

    The oxidative hypothesis of senescence, since its origin in 1956, has garnered significant evidence and growing support among scientists for the notion that free radicals play an important role in ageing, either as "damaging" molecules or as signaling molecules. Age-increasing oxidative injuries induced by free radicals, higher susceptibility to oxidative stress in short-lived organisms, genetic manipulations that alter both oxidative resistance and longevity and the anti-ageing effect of caloric restriction and intermittent fasting are a few examples of accepted scientific facts that support the oxidative theory of senescence. Though not completely understood due to the complex "network" of redox regulatory systems, the implication of oxidative stress in the ageing process is now well documented. Moreover, it is compatible with other current ageing theories (e.g., those implicating the mitochondrial damage/mitochondrial-lysosomal axis, stress-induced premature senescence, biological "garbage" accumulation, etc.). This review is intended to summarize and critically discuss the redox mechanisms involved during the ageing process: sources of oxidant agents in ageing (mitochondrial: the electron transport chain and the nitric oxide synthase reaction; non-mitochondrial: the Fenton reaction, microsomal cytochrome P450 enzymes, peroxisomal β-oxidation and the respiratory burst of phagocytic cells), antioxidant changes in ageing (enzymatic: superoxide dismutase, glutathione reductase, glutathione peroxidase, catalase; non-enzymatic: glutathione, ascorbate, urate, bilirubin, melatonin, tocopherols, carotenoids, ubiquinol), alteration of oxidative damage repairing mechanisms, and the role of free radicals as signaling molecules in ageing.

  5. Precision physics at LHC

    Hinchliffe, I.

    1997-05-01

    In this talk the author gives a brief survey of some physics topics that will be addressed by the Large Hadron Collider currently under construction at CERN. Instead of discussing the reach of this machine for new physics, the author gives examples of the types of precision measurements that might be made if new physics is discovered

  6. Precision Muonium Spectroscopy

    Jungmann, Klaus P.

    2016-01-01

    The muonium atom is the purely leptonic bound state of a positive muon and an electron. It has a lifetime of 2.2 µs. The absence of any known internal structure provides for precision experiments to test fundamental physics theories and to determine accurate values of fundamental constants. In

  7. What is precision medicine?

    König, Inke R; Fuchs, Oliver; Hansen, Gesine; von Mutius, Erika; Kopp, Matthias V

    2017-10-01

    The term "precision medicine" has become very popular over recent years, fuelled by scientific as well as political perspectives. Despite its popularity, its exact meaning, and how it is different from other popular terms such as "stratified medicine", "targeted therapy" or "deep phenotyping" remains unclear. Commonly applied definitions focus on the stratification of patients, sometimes referred to as a novel taxonomy, and this is derived using large-scale data including clinical, lifestyle, genetic and further biomarker information, thus going beyond the classical "signs-and-symptoms" approach.While these aspects are relevant, this description leaves open a number of questions. For example, when does precision medicine begin? In which way does the stratification of patients translate into better healthcare? And can precision medicine be viewed as the end-point of a novel stratification of patients, as implied, or is it rather a greater whole?To clarify this, the aim of this paper is to provide a more comprehensive definition that focuses on precision medicine as a process. It will be shown that this proposed framework incorporates the derivation of novel taxonomies and their role in healthcare as part of the cycle, but also covers related terms. Copyright ©ERS 2017.

  8. Isotopic Resonance Hypothesis: Experimental Verification by Escherichia coli Growth Measurements

    Xie, Xueshu; Zubarev, Roman A.

    2015-03-01

    Isotopic composition of reactants affects the rates of chemical and biochemical reactions. As a rule, enrichment of heavy stable isotopes leads to progressively slower reactions. But the recent isotopic resonance hypothesis suggests that the dependence of the reaction rate upon the enrichment degree is not monotonic. Instead, at some "resonance" isotopic compositions the kinetics speeds up, while at "off-resonance" compositions the same reactions progress more slowly. To test the predictions of this hypothesis for the elements C, H, N and O, we designed a precise (standard error +/-0.05%) experiment that measures the parameters of bacterial growth in minimal media with varying isotopic composition. A number of predicted resonance conditions were tested, with significant enhancements in kinetics discovered at these conditions. The combined statistics extremely strongly support the validity of the isotopic resonance phenomenon, with possible applications in biotechnology, medicine, chemistry and other areas.
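
    The abstract reports estimating bacterial growth parameters with very small standard errors. One standard way to obtain such parameters from optical-density time series is to fit a parametric growth model; the sketch below fits a logistic curve with scipy on synthetic data, and is only an illustration of the idea, not the authors' protocol (model form, parameter names and data are assumptions).

```python
# Illustrative fit of a logistic growth model OD(t) = K / (1 + A*exp(-r*t))
# to synthetic optical-density data, reporting parameter estimates and 1-sigma errors.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, A, r):
    return K / (1.0 + A * np.exp(-r * t))

rng = np.random.default_rng(0)
t = np.linspace(0, 12, 25)                                  # hours
od = logistic(t, K=1.2, A=50.0, r=0.9) + rng.normal(0, 0.01, t.size)

popt, pcov = curve_fit(logistic, t, od, p0=(1.0, 10.0, 0.5))
perr = np.sqrt(np.diag(pcov))                               # standard errors
for name, val, err in zip(("K", "A", "r"), popt, perr):
    print(f"{name} = {val:.3f} +/- {err:.3f}")
```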

  9. Precision synchrotron radiation detectors

    Levi, M.; Rouse, F.; Butler, J.

    1989-03-01

    Precision detectors to measure synchrotron radiation beam positions have been designed and installed as part of beam energy spectrometers at the Stanford Linear Collider (SLC). The distance between pairs of synchrotron radiation beams is measured absolutely to better than 28 μm on a pulse-to-pulse basis. This contributes less than 5 MeV to the error in the measurement of SLC beam energies (approximately 50 GeV). A system of high-resolution video cameras viewing precisely-aligned fiducial wire arrays overlaying phosphorescent screens has achieved this accuracy. Also, detectors of synchrotron radiation using the charge developed by the ejection of Compton-recoil electrons from an array of fine wires are being developed.

  10. A passion for precision

    CERN. Geneva

    2006-01-01

    For more than three decades, the quest for ever higher precision in laser spectroscopy of the simple hydrogen atom has inspired many advances in laser, optical, and spectroscopic techniques, culminating in femtosecond laser optical frequency combs as perhaps the most precise measuring tools known to man. Applications range from optical atomic clocks and tests of QED and relativity to searches for time variations of fundamental constants. Recent experiments are extending frequency comb techniques into the extreme ultraviolet. Laser frequency combs can also control the electric field of ultrashort light pulses, creating powerful new tools for the emerging field of attosecond science.

  11. Quad precision delay generator

    Krishnan, Shanti; Gopalakrishnan, K.R.; Marballi, K.R.

    1997-01-01

    A Quad Precision Delay Generator delays a digital edge by a programmed amount of time, varying from nanoseconds to microseconds. The output of this generator has an amplitude of the order of tens of volts and rise time of the order of nanoseconds. This was specifically designed and developed to meet the stringent requirements of the plasma focus experiments. Plasma focus is a laboratory device for producing and studying nuclear fusion reactions in hot deuterium plasma. 3 figs

  12. Precision Mass Measurement of Argon Isotopes

    Lunney, D

    2002-01-01

    IS388: A precision mass measurement of the neutron-deficient isotopes $^{32,33,34}$Ar is proposed. Mass values of these isotopes are of importance for: a) a stringent test of the Isobaric-Multiplet-Mass-Equation, b) a verification of the correctness of calculated charge-dependent corrections as used in super-allowed β-decay studies aiming at a test of the CVC hypothesis, and c) the determination of the kinematics in electron-neutrino correlation experiments searching for scalar currents in weak interaction. The measurements will be carried out with the ISOLTRAP Penning trap mass spectrometer.

  13. Validity of Linder Hypothesis in Bric Countries

    Rana Atabay

    2016-03-01

    In this study, the theory of similarity in preferences (the Linder hypothesis) is introduced, and trade among the BRIC countries is examined to see whether it is consistent with this hypothesis. Using data for the period 1996 – 2010, the study applies panel data analysis in order to provide evidence regarding the empirical validity of the Linder hypothesis for the BRIC countries' international trade. Empirical findings show that trade between the BRIC countries supports the Linder hypothesis.
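
    A common way to operationalize a Linder-type test is to regress bilateral trade on the difference in per-capita incomes, expecting a negative coefficient (more similar incomes, more trade). The sketch below runs such a regression with country-pair fixed effects on synthetic data; the specification, variable names and data are assumptions for illustration and are not the study's model or results.

```python
# Illustrative Linder-style panel regression on synthetic BRIC country-pair data:
# log(trade) ~ log(|income gap|) + country-pair fixed effects.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
pairs = ["BR-RU", "BR-IN", "BR-CN", "RU-IN", "RU-CN", "IN-CN"]
rows = []
for pair in pairs:
    level = rng.normal(10, 1)                        # pair-specific trade level
    for year in range(1996, 2011):
        inc_gap = abs(rng.normal(5000, 2000)) + 100  # per-capita income gap (synthetic)
        log_trade = level - 0.3 * np.log(inc_gap) + rng.normal(0, 0.2)
        rows.append({"pair": pair, "year": year,
                     "log_trade": log_trade, "log_inc_gap": np.log(inc_gap)})
df = pd.DataFrame(rows)

fit = smf.ols("log_trade ~ log_inc_gap + C(pair)", data=df).fit()
print(fit.params["log_inc_gap"], fit.pvalues["log_inc_gap"])  # negative coefficient expected
```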

  14. Precision electroweak measurements

    Demarteau, M.

    1996-11-01

    Recent electroweak precision measurements from e+e- and p anti-p colliders are presented. Some emphasis is placed on the recent developments in the heavy flavor sector. The measurements are compared to predictions from the Standard Model of electroweak interactions. All results are found to be consistent with the Standard Model. The indirect constraint on the top quark mass from all measurements is in excellent agreement with the direct m_t measurements. Using the world's electroweak data in conjunction with the current measurement of the top quark mass, the constraints on the Higgs mass are discussed.

  15. Electroweak precision tests

    Monteil, St.

    2009-12-01

    This document aims at summarizing a dozen years of the author's research in High Energy Physics, in particular dealing with precision tests of the electroweak theory. Parity-violating asymmetry measurements at LEP with the ALEPH detector, together with global consistency checks of the Kobayashi-Maskawa paradigm within the CKM-fitter group, are gathered in the first part of the document. The second part deals with the unpublished instrumental work on the design, tests, production and commissioning of the elements of the Pre-Shower detector of the LHCb spectrometer at LHC. Physics perspectives with LHCb are eventually discussed as a conclusion. (author)

  16. Ultra-precision bearings

    Wardle, F

    2015-01-01

    Ultra-precision bearings can achieve extreme accuracy of rotation, making them ideal for use in numerous applications across a variety of fields, including hard disk drives, roundness measuring machines and optical scanners. Ultraprecision Bearings provides a detailed review of the different types of bearing and their properties, as well as an analysis of the factors that influence motion error, stiffness and damping. Following an introduction to basic principles of motion error, each chapter of the book is then devoted to the basic principles and properties of a specific type of bearin

  17. Seeing via miniature eye movements: A dynamic hypothesis for vision

    Ehud eAhissar

    2012-11-01

    During natural viewing, the eyes are never still. Even during fixation, miniature movements of the eyes move the retinal image across tens of foveal photoreceptors. Most theories of vision implicitly assume that the visual system ignores these movements and somehow overcomes the resulting smearing. However, evidence has accumulated to indicate that fixational eye movements cannot be ignored by the visual system if fine spatial details are to be resolved. We argue that the only way the visual system can achieve its high resolution given its fixational movements is by seeing via these movements. Seeing via eye movements also eliminates the instability of the image, which would be induced by them otherwise. Here we present a hypothesis for vision, in which coarse details are spatially-encoded in gaze-related coordinates, and fine spatial details are temporally-encoded in relative retinal coordinates. The temporal encoding presented here achieves its highest resolution by encoding along the elongated axes of simple cell receptive fields and not across these axes as suggested by spatial models of vision. According to our hypothesis, fine details of shape are encoded by inter-receptor temporal phases, texture by instantaneous intra-burst rates of individual receptors, and motion by inter-burst temporal frequencies. We further describe the ability of the visual system to read out the encoded information and recode it internally. We show how reading out of retinal signals can be facilitated by neuronal phase-locked loops (NPLLs), which lock to the retinal jitter; this locking enables recoding of motion information and temporal framing of shape and texture processing. A possible implementation of this locking-and-recoding process by specific thalamocortical loops is suggested. Overall it is suggested that high-acuity vision is based primarily on temporal mechanisms of the sort presented here and low-acuity vision is based primarily on spatial mechanisms.

  18. Precision lifetime measurements

    Tanner, C.E.

    1994-01-01

    Precision measurements of atomic lifetimes provide important information necessary for testing atomic theory. The authors employ resonant laser excitation of a fast atomic beam to measure excited state lifetimes by observing the decay-in-flight of the emitted fluorescence. A similar technique was used by Gaupp, et al., who reported measurements with precisions of less than 0.2%. Their program includes lifetime measurements of the low-lying p states in alkali and alkali-like systems. Motivation for this work comes from a need to test the atomic many-body-perturbation theory (MBPT) that is necessary for interpretation of parity nonconservation experiments in atomic cesium. The authors have measured the cesium 6p 2P1/2 and 6p 2P3/2 state lifetimes to be 34.934±0.094 ns and 30.499±0.070 ns respectively. With minor changes to the apparatus, they have extended their measurements to include the lithium 2p 2P1/2 and 2p 2P3/2 states

  19. Fundamentals of precision medicine

    Divaris, Kimon

    2018-01-01

    Imagine a world where clinicians make accurate diagnoses and provide targeted therapies to their patients according to well-defined, biologically-informed disease subtypes, accounting for individual differences in genetic make-up, behaviors, cultures, lifestyles and the environment. This is not as utopic as it may seem. Relatively recent advances in science and technology have led to an explosion of new information on what underlies health and what constitutes disease. These novel insights emanate from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, as well as epigenomics and exposomics—such ‘omics data can now be generated at unprecedented depth and scale, and at rapidly decreasing cost. Making sense and integrating these fundamental information domains to transform health care and improve health remains a challenge—an ambitious, laudable and high-yield goal. Precision dentistry is no longer a distant vision; it is becoming part of the rapidly evolving present. Insights from studies of the human genome and microbiome, their associated transcriptomes, proteomes and metabolomes, and epigenomics and exposomics have reached an unprecedented depth and scale. Much more needs to be done, however, for the realization of precision medicine in the oral health domain. PMID:29227115

  20. Precision measurements in supersymmetry

    Feng, Johnathan Lee [Stanford Univ., CA (United States)]

    1995-05-01

    Supersymmetry is a promising framework in which to explore extensions of the standard model. If candidates for supersymmetric particles are found, precision measurements of their properties will then be of paramount importance. The prospects for such measurements and their implications are the subject of this thesis. If charginos are produced at the LEP II collider, they are likely to be one of the few available supersymmetric signals for many years. The author considers the possibility of determining fundamental supersymmetry parameters in such a scenario. The study is complicated by the dependence of observables on a large number of these parameters. He proposes a straightforward procedure for disentangling these dependences and demonstrates its effectiveness by presenting a number of case studies at representative points in parameter space. In addition to determining the properties of supersymmetric particles, precision measurements may also be used to establish that newly-discovered particles are, in fact, supersymmetric. Supersymmetry predicts quantitative relations among the couplings and masses of superparticles. The author discusses tests of such relations at a future e+e- linear collider, using measurements that exploit the availability of polarizable beams. Stringent tests of supersymmetry from chargino production are demonstrated in two representative cases, and fermion and neutralino processes are also discussed.

  1. Precision muon physics

    Gorringe, T. P.; Hertzog, D. W.

    2015-09-01

    The muon is playing a unique role in sub-atomic physics. Studies of muon decay both determine the overall strength and establish the chiral structure of weak interactions, as well as setting extraordinary limits on charged-lepton-flavor-violating processes. Measurements of the muon's anomalous magnetic moment offer singular sensitivity to the completeness of the standard model and the predictions of many speculative theories. Spectroscopy of muonium and muonic atoms gives unmatched determinations of fundamental quantities including the magnetic moment ratio μμ /μp, lepton mass ratio mμ /me, and proton charge radius rp. Also, muon capture experiments are exploring elusive features of weak interactions involving nucleons and nuclei. We will review the experimental landscape of contemporary high-precision and high-sensitivity experiments with muons. One focus is the novel methods and ingenious techniques that achieve such precision and sensitivity in recent, present, and planned experiments. Another focus is the uncommonly broad and topical range of questions in atomic, nuclear and particle physics that such experiments explore.

  2. Precision Joining Center

    Powell, J. W.; Westphal, D. A.

    1991-08-01

    A workshop to obtain input from industry on the establishment of the Precision Joining Center (PJC) was held on July 10-12, 1991. The PJC is a center for training Joining Technologists in advanced joining techniques and concepts in order to promote the competitiveness of U.S. industry. The center will be established as part of the DOE Defense Programs Technology Commercialization Initiative, and operated by EG&G Rocky Flats in cooperation with the American Welding Society and the Colorado School of Mines Center for Welding and Joining Research. The overall objectives of the workshop were to validate the need for a Joining Technologists to fill the gap between the welding operator and the welding engineer, and to assure that the PJC will train individuals to satisfy that need. The consensus of the workshop participants was that the Joining Technologist is a necessary position in industry, and is currently used, with some variation, by many companies. It was agreed that the PJC core curriculum, as presented, would produce a Joining Technologist of value to industries that use precision joining techniques. The advantage of the PJC would be to train the Joining Technologist much more quickly and more completely. The proposed emphasis of the PJC curriculum on equipment intensive and hands-on training was judged to be essential.

  3. Hypothesis Testing in the Real World

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  4. Error probabilities in default Bayesian hypothesis testing

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
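
    As a rough illustration of the question studied here, classical error probabilities of a Bayes-factor decision rule can be estimated by simulation. The sketch below Monte-Carlo-estimates the type I error of the rule "prefer H1 when BF10 > 3" for a one-sample t test, using the BIC approximation to the Bayes factor rather than the paper's default prior; the threshold, sample size and approximation are assumptions for illustration only.

```python
# Monte Carlo sketch: type I error of a Bayes-factor rule for a one-sample t test,
# using the BIC approximation BF10 ~ exp((BIC0 - BIC1) / 2).  This is not the
# default Bayes factor analysed in the paper; it only illustrates the procedure.
import numpy as np

def bf10_bic(x):
    n = x.size
    rss0 = np.sum(x ** 2)                      # H0: mean fixed at 0
    rss1 = np.sum((x - x.mean()) ** 2)         # H1: mean estimated (1 extra parameter)
    bic0 = n * np.log(rss0 / n)
    bic1 = n * np.log(rss1 / n) + np.log(n)
    return np.exp((bic0 - bic1) / 2.0)

rng = np.random.default_rng(7)
n, reps, threshold = 30, 20_000, 3.0
false_alarms = sum(bf10_bic(rng.normal(0.0, 1.0, n)) > threshold for _ in range(reps))
print("estimated type I error:", false_alarms / reps)
```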

  5. Reassessing the Trade-off Hypothesis

    Rosas, Guillermo; Manzetti, Luigi

    2015-01-01

    Do economic conditions drive voters to punish politicians that tolerate corruption? Previous scholarly work contends that citizens in young democracies support corrupt governments that are capable of promoting good economic outcomes, the so-called trade-off hypothesis. We test this hypothesis based...

  6. Mastery Learning and the Decreasing Variability Hypothesis.

    Livingston, Jennifer A.; Gentile, J. Ronald

    1996-01-01

    This report results from studies that tested two variations of Bloom's decreasing variability hypothesis using performance on successive units of achievement in four graduate classrooms that used mastery learning procedures. Data do not support the decreasing variability hypothesis; rather, they show no change over time. (SM)

  7. Precision Medicine in Cancer Treatment

    Precision medicine helps doctors select cancer treatments that are most likely to help patients based on a genetic understanding of their disease. Learn about the promise of precision medicine and the role it plays in cancer treatment.

  8. Simple Syllabic Calls Accompany Discrete Behavior Patterns in Captive Pteronotus parnellii: An Illustration of the Motivation-Structure Hypothesis

    Matthew J. Clement

    2012-01-01

    Full Text Available Mustached bats, Pteronotus parnellii, are highly social and vocal. Individuals of this species roost in tight clusters, and emit an acoustically rich repertoire of calls whose behavioral significance is largely unknown. We recorded their social and vocal behaviors within a colony housed under semi-natural conditions. We also quantified the spatial spread of each bat’s roosting location and discovered that this was relatively fixed and roughly confined to an individual’s body width. The spatial precision in roosting was accompanied by an equally remarkable match between specific vocalizations and well-timed, discrete, identifiable postures/behaviors, as revealed by logistic regression analysis. The bodily behaviors included crouching, marking, yawning, nipping, flicking, fighting, kissing, inspecting, and fly-bys. Two echolocation-like calls were used to maintain spacing in the colony, two noisy broadband calls were emitted during fights, two tonal calls conveyed fear, and another tonal call signaled appeasement. Overall, the results establish that mustached bats exhibit complex social interactions common to other social mammals. The correspondence of relatively low frequency and noisy, broadband calls with aggression, and of tonal, high frequency calls with fear supports Morton’s Motivation-Structure hypothesis, and establishes a link between motivation and the acoustic structure of social calls emitted by mustached bats.

  9. Usefulness of Models in Precision Nutrient Management

    Plauborg, Finn; Manevski, Kiril; Zhenjiang, Zhou

    Modern agriculture increasingly applies new methods and technologies to increase production and nutrient use efficiencies and at the same time reduce leaching of nutrients and greenhouse gas emissions. GPS based ECa-measurement equipment (ER or EM instrumentations) is used to spatially characterise … and mineral composition. Mapping of crop status and the spatial-temporal variability within fields with red-infrared reflection is used to support decisions on split fertilisation and more precise dosing. The interpretation and use of these various data in precise nutrient management is not straightforward … of mineralisation. However, whether the crop would benefit from this depended to a large extent on soil hydraulic conductivity within the range of natural variation when testing the model. In addition, the initialisation of the distribution of soil total carbon and nitrogen into conceptual model compartments …

  10. Correction of Spatial Bias in Oligonucleotide Array Data

    Philippe Serhal

    2013-01-01

    Full Text Available Background. Oligonucleotide microarrays allow for high-throughput gene expression profiling assays. The technology relies on the fundamental assumption that observed hybridization signal intensities (HSIs for each intended target, on average, correlate with their target’s true concentration in the sample. However, systematic, nonbiological variation from several sources undermines this hypothesis. Background hybridization signal has been previously identified as one such important source, one manifestation of which appears in the form of spatial autocorrelation. Results. We propose an algorithm, pyn, for the elimination of spatial autocorrelation in HSIs, exploiting the duality of desirable mutual information shared by probes in a common probe set and undesirable mutual information shared by spatially proximate probes. We show that this correction procedure reduces spatial autocorrelation in HSIs; increases HSI reproducibility across replicate arrays; increases differentially expressed gene detection power; and performs better than previously published methods. Conclusions. The proposed algorithm increases both precision and accuracy, while requiring virtually no changes to users’ current analysis pipelines: the correction consists merely of a transformation of raw HSIs (e.g., CEL files for Affymetrix arrays. A free, open-source implementation is provided as an R package, compatible with standard Bioconductor tools. The approach may also be tailored to other platform types and other sources of bias.
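
    The pyn algorithm itself is not reproduced in the abstract; as a generic illustration of the kind of correction involved, the sketch below removes a smooth spatial trend from probe intensities laid out on the physical chip grid using a median filter. The function name, window size, and toy bias are assumptions made for illustration, not the published method.

        import numpy as np
        from scipy.ndimage import median_filter

        def correct_spatial_trend(intensity_grid, window=15):
            """Remove a smooth spatial trend from probe intensities arranged on the
            (row, col) grid of the array; generic sketch, not the pyn algorithm."""
            log_i = np.log2(intensity_grid)
            trend = median_filter(log_i, size=window)      # local spatial background estimate
            corrected = log_i - trend + trend.mean()       # re-centre on the global level
            return 2.0 ** corrected

        # toy example: a 100 x 100 array with a synthetic row-wise intensity gradient
        rng = np.random.default_rng(1)
        true_signal = rng.lognormal(mean=6.0, sigma=1.0, size=(100, 100))
        observed = true_signal * np.linspace(0.5, 2.0, 100)[:, None]
        cleaned = correct_spatial_trend(observed)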

  11. Correction of Spatial Bias in Oligonucleotide Array Data

    Lemieux, Sébastien

    2013-01-01

    Background. Oligonucleotide microarrays allow for high-throughput gene expression profiling assays. The technology relies on the fundamental assumption that observed hybridization signal intensities (HSIs) for each intended target, on average, correlate with their target's true concentration in the sample. However, systematic, nonbiological variation from several sources undermines this hypothesis. Background hybridization signal has been previously identified as one such important source, one manifestation of which appears in the form of spatial autocorrelation. Results. We propose an algorithm, pyn, for the elimination of spatial autocorrelation in HSIs, exploiting the duality of desirable mutual information shared by probes in a common probe set and undesirable mutual information shared by spatially proximate probes. We show that this correction procedure reduces spatial autocorrelation in HSIs; increases HSI reproducibility across replicate arrays; increases differentially expressed gene detection power; and performs better than previously published methods. Conclusions. The proposed algorithm increases both precision and accuracy, while requiring virtually no changes to users' current analysis pipelines: the correction consists merely of a transformation of raw HSIs (e.g., CEL files for Affymetrix arrays). A free, open-source implementation is provided as an R package, compatible with standard Bioconductor tools. The approach may also be tailored to other platform types and other sources of bias. PMID:23573083

  12. French Meteor Network for High Precision Orbits of Meteoroids

    Atreya, P.; Vaubaillon, J.; Colas, F.; Bouley, S.; Gaillard, B.; Sauli, I.; Kwon, M. K.

    2011-01-01

    Precise meteoroid orbits from video observations are scarce, as most meteor stations use off-the-shelf CCD cameras; only a few orbits with precise semi-major axes are available from the film photographic method. Precise orbits are necessary to compute the dust flux in the Earth's vicinity, and to estimate the ejection time of meteoroids accurately by comparing them with theoretical evolution models. We investigate the use of large CCD sensors to observe multi-station meteors and to compute precise orbits for these meteoroids. The spatial and temporal resolution needed to reach an accuracy similar to that of photographic plates is discussed. Various problems arising from the use of large CCDs, such as increasing the spatial and temporal resolution at the same time and the computational cost of finding meteor positions, are illustrated.

  13. Precision Medicine in Cardiovascular Diseases

    Yan Liu

    2017-02-01

    Full Text Available Since President Obama announced the Precision Medicine Initiative in the United States, more and more attention has been paid to precision medicine. However, clinicians have already used it to treat conditions such as cancer. Many cardiovascular diseases have a familial presentation, and genetic variants are associated with the prevention, diagnosis, and treatment of cardiovascular diseases, which are the basis for providing precise care to patients with cardiovascular diseases. Large-scale cohorts and multiomics are critical components of precision medicine. Here we summarize the application of precision medicine to cardiovascular diseases based on cohort and omic studies, and hope to elicit discussion about future health care.

  14. The Literal Translation Hypothesis in ESP Teaching/Learning Environments

    Pedro A. Fuertes-Olivera

    2015-11-01

    Full Text Available Research on the characteristics of specialized vocabulary usually replicates studies that deal with general words, e.g. they typically describe frequent terms and focus on their linguistic characteristics to aid in the learning and acquisition of the terms. We dispute this practice, as we believe that the basic characteristic of terms is that they are coined to restrict meaning, i.e. to be as precise and as specific as possible in a particular context. For instance, around 70% of English and Spanish accounting terms are multi-word terms, most of which contain more than three orthographic words that syntactically behave in a way that is very different from the syntactic behaviour of the node on which they are formed (Fuertes-Olivera and Tarp, forthcoming). This has prompted us to propose a research framework that investigates whether or not the literal translation hypothesis, which has been addressed in several areas of translation studies, can also be applied in ESP teaching/learning environments. If plausible, the assumptions on which this hypothesis is based can shed light on how learners disambiguate terms they encounter. Within this framework, this paper presents evidence that the literal translation hypothesis is possible in ESP; it offers the results of a pilot study that sheds light on how this hypothesis may work, and also discusses its usability in the context of ESP learning. In particular, this paper presents strategies for teaching multi-word terms that are different from those currently based on corpus data. We believe that exercises such as “cloze”, “fill in” and similar “guessing” exercises must be abandoned in ESP teaching/learning environments. Instead, we propose exercises that reproduce L1 teaching and learning activities, i.e., exercises that are typically used when acquiring specialised knowledge and skills in any domain, e.g. taking part in meetings and giving presentations in a business context.

  15. Precisely predictable Dirac observables

    Cordes, Heinz Otto

    2006-01-01

    This work presents a "Clean Quantum Theory of the Electron", based on Dirac’s equation. "Clean" in the sense of a complete mathematical explanation of the well known paradoxes of Dirac’s theory, and a connection to classical theory, including the motion of a magnetic moment (spin) in the given field, all for a charged particle (of spin ½) moving in a given electromagnetic field. This theory is relativistically covariant, and it may be regarded as a mathematically consistent quantum-mechanical generalization of the classical motion of such a particle, à la Newton and Einstein. Normally, our fields are time-independent, but also discussed is the time-dependent case, where slightly different features prevail. A "Schroedinger particle", such as a light quantum, experiences a very different (time-dependent) "Precise Predictability of Observables". An attempt is made to compare both cases. There is not the Heisenberg uncertainty of location and momentum; rather, location alone possesses a built-in uncertainty ...

  16. Prompt and Precise Prototyping

    2003-01-01

    For Sanders Design International, Inc., of Wilton, New Hampshire, every passing second between the concept and realization of a product is essential to succeed in the rapid prototyping industry where amongst heavy competition, faster time-to-market means more business. To separate itself from its rivals, Sanders Design aligned with NASA's Marshall Space Flight Center to develop what it considers to be the most accurate rapid prototyping machine for fabrication of extremely precise tooling prototypes. The company's Rapid ToolMaker System has revolutionized production of high quality, small-to-medium sized prototype patterns and tooling molds with an exactness that surpasses that of computer numerically-controlled (CNC) machining devices. Created with funding and support from Marshall under a Small Business Innovation Research (SBIR) contract, the Rapid ToolMaker is a dual-use technology with applications in both commercial and military aerospace fields. The advanced technology provides cost savings in the design and manufacturing of automotive, electronic, and medical parts, as well as in other areas of consumer interest, such as jewelry and toys. For aerospace applications, the Rapid ToolMaker enables fabrication of high-quality turbine and compressor blades for jet engines on unmanned air vehicles, aircraft, and missiles.

  17. Precisely Tracking Childhood Death.

    Farag, Tamer H; Koplan, Jeffrey P; Breiman, Robert F; Madhi, Shabir A; Heaton, Penny M; Mundel, Trevor; Ordi, Jaume; Bassat, Quique; Menendez, Clara; Dowell, Scott F

    2017-07-01

    Little is known about the specific causes of neonatal and under-five childhood death in high-mortality geographic regions due to a lack of primary data and dependence on inaccurate tools, such as verbal autopsy. To meet the ambitious new Sustainable Development Goal 3.2 to eliminate preventable child mortality in every country, better approaches are needed to precisely determine specific causes of death so that prevention and treatment interventions can be strengthened and focused. Minimally invasive tissue sampling (MITS) is a technique that uses needle-based postmortem sampling, followed by advanced histopathology and microbiology to definitely determine cause of death. The Bill & Melinda Gates Foundation is supporting a new surveillance system called the Child Health and Mortality Prevention Surveillance network, which will determine cause of death using MITS in combination with other information, and yield cause-specific population-based mortality rates, eventually in up to 12-15 sites in sub-Saharan Africa and south Asia. However, the Gates Foundation funding alone is not enough. We call on governments, other funders, and international stakeholders to expand the use of pathology-based cause of death determination to provide the information needed to end preventable childhood mortality.

  18. Problems in implementation of the spatial plan of the Republic of Srpska until 2015: Quantitative analysis

    Bijelić Branislav

    2017-01-01

    Full Text Available The implementation of spatial plans in the Republic of Srpska is certainly the weakest phase of the process of spatial planning in this entity. It is particularly evident in the case of the Spatial Plan of the Republic of Srpska until 2015 which is the highest strategic spatial planning document in the Republic of Srpska. More precisely, the implementation of spatial plans has been defined as the carrying out of spatial planning documents, i.e. planning propositions as defined in the spatial plans. For the purpose of this paper, a quantitative analysis of the implementation of the planning propositions envisioned by this document has been carried out. The difference between what was planned and what was implemented at the end of the planning period (ex-post evaluation of planning decisions) is presented in this paper. The weighting factor is defined for each thematic field and planning proposition, where the main criterion for determining the weighting factor is the share of the planning proposition and thematic field in the estimated total costs of the plan (financial criterion). The paper has also tackled the issue of the implementation of the Spatial Plan of Bosnia and Herzegovina for the period 1981 - 2000, as well as of the Spatial Plan of the Republic of Srpska 1996 - 2001 - Phased Plan for the period 1996 - 2001, as the previous strategic spatial planning documents of the highest rank covering the area of the Republic of Srpska. The research results have proven the primary hypothesis of the paper that the level of the implementation of Spatial Plan of the Republic of Srpska until 2015 is less than 10%.
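
    The weighting scheme described above amounts to a cost-share-weighted average of completion levels across thematic fields. A minimal sketch follows; the field names, cost shares, and completion fractions are hypothetical placeholders rather than figures taken from the plan.

        # Hypothetical figures for illustration only.
        thematic_fields = {
            # name: (share of estimated total plan costs, fraction implemented)
            "transport infrastructure": (0.40, 0.08),
            "energy":                   (0.30, 0.12),
            "water management":         (0.20, 0.05),
            "tourism and services":     (0.10, 0.10),
        }

        total_share = sum(share for share, _ in thematic_fields.values())
        implementation = sum(share / total_share * done
                             for share, done in thematic_fields.values())
        print(f"weighted implementation level: {implementation:.1%}")   # 8.8% with these placeholders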

  19. Implications of the Bohm-Aharonov hypothesis

    Ghirardi, G.C.; Rimini, A.; Weber, T.

    1976-01-01

    It is proved that the Bohm-Aharonov hypothesis concerning largely separated subsystems of composite quantum systems implies that it is impossible to express the dynamical evolution in terms of the density operator

  20. Multi-agent sequential hypothesis testing

    Kim, Kwang-Ki K.; Shamma, Jeff S.

    2014-01-01

    incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well

  1. The (not so) Immortal Strand Hypothesis

    Tomasetti, Cristian; Bozic, Ivana

    2015-01-01

    Background: Non-random segregation of DNA strands during stem cell replication has been proposed as a mechanism to minimize accumulated genetic errors in stem cells of rapidly dividing tissues. According to this hypothesis, an “immortal” DNA strand is passed to the stem cell daughter and not the more differentiated cell, keeping the stem cell lineage replication error-free. After it was introduced, experimental evidence both in favor and against the hypothesis has been presented. Principal...

  2. A precise extragalactic test of General Relativity.

    Collett, Thomas E; Oldham, Lindsay J; Smith, Russell J; Auger, Matthew W; Westfall, Kyle B; Bacon, David; Nichol, Robert C; Masters, Karen L; Koyama, Kazuya; van den Bosch, Remco

    2018-06-22

    Einstein's theory of gravity, General Relativity, has been precisely tested on Solar System scales, but the long-range nature of gravity is still poorly constrained. The nearby strong gravitational lens ESO 325-G004 provides a laboratory to probe the weak-field regime of gravity and measure the spatial curvature generated per unit mass, γ. By reconstructing the observed light profile of the lensed arcs and the observed spatially resolved stellar kinematics with a single self-consistent model, we conclude that γ = 0.97 ± 0.09 at 68% confidence. Our result is consistent with the prediction of 1 from General Relativity and provides a strong extragalactic constraint on the weak-field metric of gravity. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
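
    For orientation, tests of this kind rest on the parametrized post-Newtonian relation in which light deflection scales with (1 + γ)/2 while non-relativistic stellar dynamics trace the Newtonian mass, so the lensing and dynamical masses can be compared directly. A schematic version (in LaTeX notation; not the paper's full self-consistent lens-plus-dynamics model) is:

        \hat{\alpha} \;=\; \frac{1+\gamma}{2}\,\frac{4 G M_{\mathrm{dyn}}}{c^{2} b},
        \qquad
        M_{\mathrm{lens}} \;=\; \frac{1+\gamma}{2}\, M_{\mathrm{dyn}}
        \quad\Longrightarrow\quad
        \gamma \;=\; \frac{2\, M_{\mathrm{lens}}}{M_{\mathrm{dyn}}} - 1 .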

  3. Precise Truss Assembly Using Commodity Parts and Low Precision Welding

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, W. R.; Correll, Nikolaus

    2014-01-01

    Hardware and software design and system integration for an intelligent precision jigging robot (IPJR), which allows high precision assembly using commodity parts and low-precision bonding, is described. Preliminary 2D experiments that are motivated by the problem of assembling space telescope optical benches and very large manipulators on orbit using inexpensive, stock hardware and low-precision welding are also described. An IPJR is a robot that acts as the precise "jigging", holding parts of a local structure assembly site in place, while an external low precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (for this prototype, a human using only low precision tools), to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. The analysis of the assembly error and the results of building a square structure and a ring structure are discussed. Options for future work, to extend the IPJR paradigm to building in 3D structures at micron precision are also summarized.

  4. [Precision nutrition in the era of precision medicine].

    Chen, P Z; Wang, H

    2016-12-06

    Precision medicine has been increasingly incorporated into clinical practice and is enabling a new era for disease prevention and treatment. As an important constituent of precision medicine, precision nutrition has also been drawing more attention during physical examinations. The main aim of precision nutrition is to provide safe and efficient intervention methods for disease treatment and management, through fully considering the genetics, lifestyle (dietary, exercise and lifestyle choices), metabolic status, gut microbiota and physiological status (nutrient level and disease status) of individuals. Three major components should be considered in precision nutrition, including individual criteria for sufficient nutritional status, biomarker monitoring or techniques for nutrient detection and the applicable therapeutic or intervention methods. It was suggested that, in clinical practice, many inherited and chronic metabolic diseases might be prevented or managed through precision nutritional intervention. For generally healthy populations, because lifestyles, dietary factors, genetic factors and environmental exposures vary among individuals, precision nutrition is warranted to improve their physical activity and reduce disease risks. In summary, research and practice is leading toward precision nutrition becoming an integral constituent of clinical nutrition and disease prevention in the era of precision medicine.

  5. Precision medicine in myasthenia gravis: begin from the data precision

    Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng

    2016-01-01

    Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. The data of MG is far from individually precise now, partially due to the rarity and heterogeneity of this disease. In this review, we provide the basic insights of MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity and response to treatment based on references and our previous studies. Subgroups and quantitative traits of MG are discussed in the sense of data precision. The role of disease registries and scientific bases of precise analysis are also discussed to ensure better collection and analysis of MG data. PMID:27127759

  6. Multiple hypothesis tracking for the cyber domain

    Schwoegler, Stefan; Blackman, Sam; Holsopple, Jared; Hirsch, Michael J.

    2011-09-01

    This paper discusses how methods used for conventional multiple hypothesis tracking (MHT) can be extended to domain-agnostic tracking of entities from non-kinematic constraints such as those imposed by cyber attacks in a potentially dense false alarm background. MHT is widely recognized as the premier method to avoid corrupting tracks with spurious data in the kinematic domain but it has not been extensively applied to other problem domains. The traditional approach is to tightly couple track maintenance (prediction, gating, filtering, probabilistic pruning, and target confirmation) with hypothesis management (clustering, incompatibility maintenance, hypothesis formation, and N-association pruning). However, by separating the domain specific track maintenance portion from the domain agnostic hypothesis management piece, we can begin to apply the wealth of knowledge gained from ground and air tracking solutions to the cyber (and other) domains. These realizations led to the creation of Raytheon's Multiple Hypothesis Extensible Tracking Architecture (MHETA). In this paper, we showcase MHETA for the cyber domain, plugging in a well-established method, CUBRC's INFormation Engine for Real-time Decision making (INFERD), for the association portion of the MHT. The result is a CyberMHT. We demonstrate the power of MHETA-INFERD using simulated data. Using metrics from both the tracking and cyber domains, we show that while no tracker is perfect, by applying MHETA-INFERD, advanced nonkinematic tracks can be captured in an automated way, perform better than non-MHT approaches, and decrease analyst response time to cyber threats.
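
    As a domain-agnostic illustration of the hypothesis-management half of MHT (branching over association choices, gating, scoring, and pruning), here is a toy Python sketch. The scoring constants, gate, and beam width are arbitrary, and the code is in no way the MHETA/INFERD implementation described in the paper.

        import heapq
        import math

        def branch_hypotheses(hypotheses, observation, gate=1.0, beam=10):
            """One update step of a toy hypothesis manager: every hypothesis is branched
            over the possible explanations of a new scalar observation, scored, and the
            set is pruned back to the `beam` best."""
            expanded = []
            for log_lik, tracks in hypotheses:
                # explanation 1: the observation extends one of the existing tracks
                for i, last in enumerate(tracks):
                    if abs(observation - last) <= gate:          # gating
                        new_tracks = list(tracks)
                        new_tracks[i] = observation
                        expanded.append((log_lik - 0.5 * (observation - last) ** 2,
                                         tuple(new_tracks)))
                # explanation 2: the observation starts a new track
                expanded.append((log_lik + math.log(0.05), tuple(tracks) + (observation,)))
                # explanation 3: the observation is a false alarm
                expanded.append((log_lik + math.log(0.01), tuple(tracks)))
            return heapq.nlargest(beam, expanded)                # probabilistic pruning

        hyps = [(0.0, ())]                                       # start with one empty hypothesis
        for obs in [1.0, 1.2, 5.0, 1.3, 5.2]:
            hyps = branch_hypotheses(hyps, obs)
        print("best hypothesis (score, last point of each track):", max(hyps))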

  7. Aminoglycoside antibiotics and autism: a speculative hypothesis

    Manev Hari

    2001-10-01

    Full Text Available Abstract Background: Recently, it has been suspected that there is a relationship between therapy with some antibiotics and the onset of autism; but even more curious, some children benefited transiently from a subsequent treatment with a different antibiotic. Here, we speculate how aminoglycoside antibiotics might be associated with autism. Presentation: We hypothesize that aminoglycoside antibiotics could (a) trigger the autism syndrome in susceptible infants by causing the stop codon readthrough, i.e., a misreading of the genetic code of a hypothetical critical gene, and/or (b) improve autism symptoms by correcting the premature stop codon mutation in a hypothetical polymorphic gene linked to autism. Testing: Investigate, retrospectively, whether a link exists between aminoglycoside use (which is not extensive in children) and the onset of autism symptoms (hypothesis "a"), or between aminoglycoside use and improvement of these symptoms (hypothesis "b"). Whereas a prospective study to test hypothesis "a" is not ethically justifiable, a study could be designed to test hypothesis "b". Implications: It should be stressed that at this stage no direct evidence supports our speculative hypothesis and that its main purpose is to initiate development of new ideas that, eventually, would improve our understanding of the pathobiology of autism.

  8. Multineuronal Spike Sequences Repeat with Millisecond Precision

    Koki eMatsumoto

    2013-06-01

    Full Text Available Cortical microcircuits are nonrandomly wired by neurons. As a natural consequence, spikes emitted by microcircuits are also nonrandomly patterned in time and space. One of the prominent spike organizations is a repetition of fixed patterns of spike series across multiple neurons. However, several questions remain unsolved, including how precisely spike sequences repeat, how the sequences are spatially organized, how many neurons participate in sequences, and how different sequences are functionally linked. To address these questions, we monitored spontaneous spikes of hippocampal CA3 neurons ex vivo using a high-speed functional multineuron calcium imaging technique that allowed us to monitor spikes with millisecond resolution and to record the location of spiking and nonspiking neurons. Multineuronal spike sequences were overrepresented in spontaneous activity compared to the statistical chance level. Approximately 75% of neurons participated in at least one sequence during our observation period. The participants were sparsely dispersed and did not show specific spatial organization. The number of sequences relative to the chance level decreased when larger time frames were used to detect sequences. Thus, sequences were precise at the millisecond level. Sequences often shared common spikes with other sequences; parts of sequences were subsequently relayed by following sequences, generating complex chains of multiple sequences.
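
    A common way to ask whether such repeats exceed chance is to count repeated ordered patterns of neuron identities and compare the count against identity-shuffled surrogates. The sketch below does this on toy data (which, being random, should show no excess over its surrogates); the sequence length, time window, and surrogate scheme are illustrative assumptions, not the analysis pipeline used in the study.

        import numpy as np
        from collections import Counter

        def repeated_sequences(spikes, length=3, window=0.05):
            """Count repeats of ordered neuron-identity triplets that complete within
            `window` seconds; `spikes` is a list of (time_s, neuron_id) pairs."""
            spikes = sorted(spikes)
            counts = Counter()
            for i in range(len(spikes) - length + 1):
                chunk = spikes[i:i + length]
                if chunk[-1][0] - chunk[0][0] <= window:
                    counts[tuple(nid for _, nid in chunk)] += 1
            return sum(c - 1 for c in counts.values() if c > 1)

        def surrogate(spikes, rng):
            # shuffle neuron identities while keeping every spike time fixed
            times = [t for t, _ in spikes]
            ids = [n for _, n in spikes]
            rng.shuffle(ids)
            return list(zip(times, ids))

        rng = np.random.default_rng(0)
        toy = list(zip(rng.uniform(0.0, 60.0, 5000).tolist(),
                       rng.integers(0, 30, 5000).tolist()))
        n_real = repeated_sequences(toy)
        n_surr = np.mean([repeated_sequences(surrogate(toy, rng)) for _ in range(20)])
        print("repeats in data vs. mean over surrogates:", n_real, n_surr)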

  9. Testing competing forms of the Milankovitch hypothesis

    Kaufmann, Robert K.; Juselius, Katarina

    2016-01-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical...... ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship...... that the latter is consistent with a weak form of the Milankovitch hypothesis and that it should be restated as follows: Internal climate dynamics impose perturbations on glacial cycles that are driven by solar insolation. Our results show that these perturbations are likely caused by slow adjustment between land...

  10. Rejecting the equilibrium-point hypothesis.

    Gottlieb, G L

    1998-01-01

    The lambda version of the equilibrium-point (EP) hypothesis as developed by Feldman and colleagues has been widely used and cited with insufficient critical understanding. This article offers a small antidote to that lack. First, the hypothesis implicitly, unrealistically assumes identical transformations of lambda into muscle tension for antagonist muscles. Without that assumption, its definitions of command variables R, C, and lambda are incompatible and an EP is not defined exclusively by R nor is it unaffected by C. Second, the model assumes unrealistic and unphysiological parameters for the damping properties of the muscles and reflexes. Finally, the theory lacks rules for two of its three command variables. A theory of movement should offer insight into why we make movements the way we do and why we activate muscles in particular patterns. The EP hypothesis offers no unique ideas that are helpful in addressing either of these questions.

  11. The linear hypothesis and radiation carcinogenesis

    Roberts, P.B.

    1981-10-01

    An assumption central to most estimations of the carcinogenic potential of low levels of ionising radiation is that the risk always increases in direct proportion to the dose received. This assumption (the linear hypothesis) has been both strongly defended and attacked on several counts. It appears unlikely that conclusive, direct evidence on the validity of the hypothesis will be forthcoming. We review the major indirect arguments used in the debate. All of them are subject to objections that can seriously weaken their case. In the present situation, retention of the linear hypothesis as the basis of extrapolations from high to low dose levels can lead to excessive fears, over-regulation and unnecessarily expensive protection measures. To offset these possibilities, support is given to suggestions urging a cut-off dose, probably some fraction of natural background, below which risks can be deemed acceptable

  12. Rayleigh's hypothesis and the geometrical optics limit.

    Elfouhaily, Tanos; Hahn, Thomas

    2006-09-22

    The Rayleigh hypothesis (RH) is often invoked in the theoretical and numerical treatment of rough surface scattering in order to decouple the analytical form of the scattered field. The hypothesis stipulates that the scattered field away from the surface can be extended down onto the rough surface even though it is formed by solely up-going waves. Traditionally this hypothesis is systematically used to derive the Volterra series under the small perturbation method which is equivalent to the low-frequency limit. In this Letter we demonstrate that the RH also carries the high-frequency or the geometrical optics limit, at least to first order. This finding has never been explicitly derived in the literature. Our result comforts the idea that the RH might be an exact solution under some constraints in the general case of random rough surfaces and not only in the case of small-slope deterministic periodic gratings.
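
    For reference, in the standard setting of a grating of period d illuminated by a plane wave, the scattered field above the corrugations is expanded in up-going waves only, and the Rayleigh hypothesis extends that expansion down to the surface itself. A schematic form of the expansion (in LaTeX notation, with polarization and normalization details omitted) is:

        \psi_{\mathrm{sc}}(x,z) \;=\; \sum_{n=-\infty}^{\infty} A_n \, e^{\,i(\alpha_n x + \beta_n z)},
        \qquad
        \alpha_n = \alpha_0 + \frac{2\pi n}{d},
        \qquad
        \beta_n = \sqrt{k^2 - \alpha_n^2}, \quad \operatorname{Im}\beta_n \ge 0,

    which is rigorously valid above the highest point of the surface; the hypothesis consists in assuming it remains valid down to the surface profile itself.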

  13. Why would musical training benefit the neural encoding of speech? The OPERA hypothesis.

    Aniruddh D. Patel

    2011-06-01

    Full Text Available Mounting evidence suggests that musical training benefits the neural encoding of speech. This paper offers a hypothesis specifying why such benefits occur. The OPERA hypothesis proposes that such benefits are driven by adaptive plasticity in speech-processing networks, and that this plasticity occurs when five conditions are met. These are: (1) Overlap: there is anatomical overlap in the brain networks that process an acoustic feature used in both music and speech (e.g., waveform periodicity, amplitude envelope), (2) Precision: music places higher demands on these shared networks than does speech, in terms of the precision of processing, (3) Emotion: the musical activities that engage this network elicit strong positive emotion, (4) Repetition: the musical activities that engage this network are frequently repeated, and (5) Attention: the musical activities that engage this network are associated with focused attention. According to the OPERA hypothesis, when these conditions are met neural plasticity drives the networks in question to function with higher precision than needed for ordinary speech communication. Yet since speech shares these networks with music, speech processing benefits. The OPERA hypothesis is used to account for the observed superior subcortical encoding of speech in musically trained individuals, and to suggest mechanisms by which musical training might improve linguistic reading abilities.

  14. Why would Musical Training Benefit the Neural Encoding of Speech? The OPERA Hypothesis.

    Patel, Aniruddh D

    2011-01-01

    Mounting evidence suggests that musical training benefits the neural encoding of speech. This paper offers a hypothesis specifying why such benefits occur. The "OPERA" hypothesis proposes that such benefits are driven by adaptive plasticity in speech-processing networks, and that this plasticity occurs when five conditions are met. These are: (1) Overlap: there is anatomical overlap in the brain networks that process an acoustic feature used in both music and speech (e.g., waveform periodicity, amplitude envelope), (2) Precision: music places higher demands on these shared networks than does speech, in terms of the precision of processing, (3) Emotion: the musical activities that engage this network elicit strong positive emotion, (4) Repetition: the musical activities that engage this network are frequently repeated, and (5) Attention: the musical activities that engage this network are associated with focused attention. According to the OPERA hypothesis, when these conditions are met neural plasticity drives the networks in question to function with higher precision than needed for ordinary speech communication. Yet since speech shares these networks with music, speech processing benefits. The OPERA hypothesis is used to account for the observed superior subcortical encoding of speech in musically trained individuals, and to suggest mechanisms by which musical training might improve linguistic reading abilities.

  15. MEASUREMENT AND PRECISION, EXPERIMENTAL VERSION.

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    This document is an experimental version of a programed text on measurement and precision. Part I contains 24 frames dealing with precision and significant figures encountered in various mathematical computations and measurements. Part II begins with a brief section on experimental data, covering such points as (1) establishing the zero point, (2)…

  16. Precision medicine for nurses: 101.

    Lemoine, Colleen

    2014-05-01

    To introduce the key concepts and terms associated with precision medicine and support understanding of future developments in the field by providing an overview and history of precision medicine, related ethical considerations, and nursing implications. Current nursing, medical and basic science literature. Rapid progress in understanding the oncogenic drivers associated with cancer is leading to a shift toward precision medicine, where treatment is based on targeting specific genetic and epigenetic alterations associated with a particular cancer. Nurses will need to embrace the paradigm shift to precision medicine, expend the effort necessary to learn the essential terminology, concepts and principles, and work collaboratively with physician colleagues to best position our patients to maximize the potential that precision medicine can offer. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. On the generalized gravi-magnetic hypothesis

    Massa, C.

    1989-01-01

    According to a generalization of the gravi-magnetic hypothesis (GMH) any neutral mass moving in a curvilinear path with respect to an inertial frame creates a magnetic field, dependent on the curvature radius of the path. A simple astrophysical consequence of the generalized GMH is suggested considering the special cases of binary pulsars and binary neutron stars

  18. Remarks about the hypothesis of limiting fragmentation

    Chou, T.T.; Yang, C.N.

    1987-01-01

    Remarks are made about the hypothesis of limiting fragmentation. In particular, the concept of favored and disfavored fragment distribution is introduced. Also, a sum rule is proved leading to a useful quantity called energy-fragmentation fraction. (author). 11 refs, 1 fig., 2 tabs

  19. Multiple hypothesis clustering in radar plot extraction

    Huizing, A.G.; Theil, A.; Dorp, Ph. van; Ligthart, L.P.

    1995-01-01

    False plots and plots with inaccurate range and Doppler estimates may severely degrade the performance of tracking algorithms in radar systems. This paper describes how a multiple hypothesis clustering technique can be applied to mitigate the problems involved in plot extraction. The measures of

  20. The (not so) immortal strand hypothesis

    Cristian Tomasetti

    2015-03-01

    Significance: Utilizing an approach that is fundamentally different from previous efforts to confirm or refute the immortal strand hypothesis, we provide evidence against non-random segregation of DNA during stem cell replication. Our results strongly suggest that parental DNA is passed randomly to stem cell daughters and provides new insight into the mechanism of DNA replication in stem cells.

  1. A Developmental Study of the Infrahumanization Hypothesis

    Martin, John; Bennett, Mark; Murray, Wayne S.

    2008-01-01

    Intergroup attitudes in children were examined based on Leyens's "infrahumanization hypothesis". This suggests that some uniquely human emotions, such as shame and guilt (secondary emotions), are reserved for the in-group, whilst other emotions that are not uniquely human and shared with animals, such as anger and pleasure (primary…

  2. Morbidity and Infant Development: A Hypothesis.

    Pollitt, Ernesto

    1983-01-01

    Results of a study conducted in 14 villages of Sui Lin Township, Taiwan, suggest the hypothesis that, under conditions of extreme economic impoverishment and among children within populations where energy protein malnutrition is endemic, there is an inverse relationship between incidence of morbidity in infancy and measures of motor and mental…

  3. Diagnostic Hypothesis Generation and Human Judgment

    Thomas, Rick P.; Dougherty, Michael R.; Sprenger, Amber M.; Harbison, J. Isaiah

    2008-01-01

    Diagnostic hypothesis-generation processes are ubiquitous in human reasoning. For example, clinicians generate disease hypotheses to explain symptoms and help guide treatment, auditors generate hypotheses for identifying sources of accounting errors, and laypeople generate hypotheses to explain patterns of information (i.e., data) in the…

  4. Multi-hypothesis distributed stereo video coding

    Salmistraro, Matteo; Zamarin, Marco; Forchhammer, Søren

    2013-01-01

    for stereo sequences, exploiting an interpolated intra-view SI and two inter-view SIs. The quality of the SI has a major impact on the DVC Rate-Distortion (RD) performance. As the inter-view SIs individually present lower RD performance compared with the intra-view SI, we propose multi-hypothesis decoding...

  5. [Resonance hypothesis of heart rate variability origin].

    Sheĭkh-Zade, Iu R; Mukhambetaliev, G Kh; Cherednik, I L

    2009-09-01

    A hypothesis is advanced that heart rate variability arises from beat-to-beat regulation of cardiac cycle duration, which ensures resonance interaction between respiratory fluctuations and the intrinsic volume fluctuations of the arterial system, thereby minimizing the energy expenditure of the cardiorespiratory system. Myogenic, parasympathetic, and sympathetic mechanisms of heart rate variability are described.

  6. In Defense of Chi's Ontological Incompatibility Hypothesis

    Slotta, James D.

    2011-01-01

    This article responds to an article by A. Gupta, D. Hammer, and E. F. Redish (2010) that asserts that M. T. H. Chi's (1992, 2005) hypothesis of an "ontological commitment" in conceptual development is fundamentally flawed. In this article, I argue that Chi's theoretical perspective is still very much intact and that the critique offered by Gupta…

  7. Vacuum counterexamples to the cosmic censorship hypothesis

    Miller, B.D.

    1981-01-01

    In cylindrically symmetric vacuum spacetimes it is possible to specify nonsingular initial conditions such that timelike singularities will (necessarily) evolve from these conditions. Examples are given; the spacetimes are somewhat analogous to one of the spherically symmetric counterexamples to the cosmic censorship hypothesis

  8. A novel hypothesis splitting method implementation for multi-hypothesis filters

    Bayramoglu, Enis; Ravn, Ole; Andersen, Nils Axel

    2013-01-01

    The paper presents a multi-hypothesis filter library featuring a novel method for splitting Gaussians into ones with smaller variances. The library is written in C++ for high performance and the source code is open and free. The multi-hypothesis filters commonly approximate the distribution tran...

  9. The Income Inequality Hypothesis Revisited : Assessing the Hypothesis Using Four Methodological Approaches

    Kragten, N.; Rözer, J.

    The income inequality hypothesis states that income inequality has a negative effect on individuals' health, partially because it reduces social trust. This article aims to critically assess the income inequality hypothesis by comparing several analytical strategies, namely OLS regression,

  10. Einstein's Revolutionary Light-Quantum Hypothesis

    Stuewer, Roger H.

    2005-05-01

    The paper in which Albert Einstein proposed his light-quantum hypothesis was the only one of his great papers of 1905 that he himself termed ``revolutionary.'' Contrary to widespread belief, Einstein did not propose his light-quantum hypothesis ``to explain the photoelectric effect.'' Instead, he based his argument for light quanta on the statistical interpretation of the second law of thermodynamics, with the photoelectric effect being only one of three phenomena that he offered as possible experimental support for it. I will discuss Einstein's light-quantum hypothesis of 1905 and his introduction of the wave-particle duality in 1909 and then turn to the reception of his work on light quanta by his contemporaries. We will examine the reasons that prominent physicists advanced to reject Einstein's light-quantum hypothesis in succeeding years. Those physicists included Robert A. Millikan, even though he provided convincing experimental proof of the validity of Einstein's equation of the photoelectric effect in 1915. The turning point came after Arthur Holly Compton discovered the Compton effect in late 1922, but even then Compton's discovery was contested both on experimental and on theoretical grounds. Niels Bohr, in particular, had never accepted the reality of light quanta and now, in 1924, proposed a theory, the Bohr-Kramers-Slater theory, which assumed that energy and momentum were conserved only statistically in microscopic interactions. Only after that theory was disproved experimentally in 1925 was Einstein's revolutionary light-quantum hypothesis generally accepted by physicists---a full two decades after Einstein had proposed it.

  11. A Dopamine Hypothesis of Autism Spectrum Disorder.

    Pavăl, Denis

    2017-01-01

    Autism spectrum disorder (ASD) comprises a group of neurodevelopmental disorders characterized by social deficits and stereotyped behaviors. While several theories have emerged, the pathogenesis of ASD remains unknown. Although studies report dopamine signaling abnormalities in autistic patients, a coherent dopamine hypothesis which could link neurobiology to behavior in ASD is currently lacking. In this paper, we present such a hypothesis by proposing that autistic behavior arises from dysfunctions in the midbrain dopaminergic system. We hypothesize that a dysfunction of the mesocorticolimbic circuit leads to social deficits, while a dysfunction of the nigrostriatal circuit leads to stereotyped behaviors. Furthermore, we discuss 2 key predictions of our hypothesis, with emphasis on clinical and therapeutic aspects. First, we argue that dopaminergic dysfunctions in the same circuits should associate with autistic-like behavior in nonautistic subjects. Concerning this, we discuss the case of PANDAS (pediatric autoimmune neuropsychiatric disorder associated with streptococcal infections) which displays behaviors similar to those of ASD, presumed to arise from dopaminergic dysfunctions. Second, we argue that providing dopamine modulators to autistic subjects should lead to a behavioral improvement. Regarding this, we present clinical studies of dopamine antagonists which seem to have improving effects on autistic behavior. Furthermore, we explore the means of testing our hypothesis by using neuroreceptor imaging, which could provide comprehensive evidence for dopamine signaling dysfunctions in autistic subjects. Lastly, we discuss the limitations of our hypothesis. Along these lines, we aim to provide a dopaminergic model of ASD which might lead to a better understanding of the ASD pathogenesis. © 2017 S. Karger AG, Basel.

  12. Evaluation of 7Be fallout spatial variability

    Pinto, Victor Meriguetti

    2011-01-01

    The cosmogenic radionuclide beryllium-7 (7Be) is produced in the atmosphere by cosmic particle reactions and is used as a tracer in soil erosion and climatic process research. After production, 7Be bonds to aerosol particles in the atmosphere and is deposited on the soil surface with other radionuclide species by rainfall. Because of its high adsorption on soil particles and its short half-life of 53.2 days, this radionuclide follows the erosion process and can be used as a tracer to evaluate the sediment transport that occurs during a single rain event or a short series of rain events. A key assumption of erosion evaluation with this radiotracer is the uniformity of the spatial distribution of the 7Be fallout. The 7Be method was developed recently and, because of its few applications, some of its assumptions have not yet been properly investigated; the hypothesis of 7Be fallout uniformity therefore needs to be evaluated. The aim of this study was to evaluate the spatial distribution of 7Be fallout through analysis of the 7Be activity in the first five millimeters of rain water from single rain events. The rain water was sampled using twelve collectors distributed over an experimental area of about 300 m2, located on the campus of Sao Paulo University, Piracicaba. The 7Be activities were measured using a 53% efficiency gamma-ray spectrometer at the Radioisotope laboratory of CENA. The 7Be activities in rain water varied from 0.26 to 1.81 Bq.L-1, with the highest values in summer and the lowest in spring. In each of the 5 single events, the spatial variability of the 7Be activity in rain water was high, showing the high randomness of the fallout spatial distribution. A simulation using the 7Be spatial variability values obtained here and average 7Be reference inventories taken from the literature was performed to determine the lowest erosion rate detectable by the 7Be model. The importance of taking a representative number of samples to
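
    The 53.2-day half-life quoted above is what restricts the tracer to single events or short series of events; as a minimal illustration (not part of the study's data processing), the standard decay correction applied when comparing activities measured at different times is sketched below.

        import math

        HALF_LIFE_DAYS = 53.2    # 7Be half-life quoted in the abstract

        def decay_corrected(measured_activity, days_since_deposition):
            """Back-correct a measured 7Be activity to the time of deposition
            using the exponential decay law; generic illustration only."""
            lam = math.log(2) / HALF_LIFE_DAYS
            return measured_activity * math.exp(lam * days_since_deposition)

        # a sample counted 30 days after deposition had about 1.48x the measured
        # activity at the time of deposition
        print(decay_corrected(1.0, 30.0))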

  13. Advanced bioanalytics for precision medicine.

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics, and new imaging technique applications suitable for implementing precision medicine. Graphical abstract Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  14. Precision Oncology: Between Vaguely Right and Precisely Wrong.

    Brock, Amy; Huang, Sui

    2017-12-01

    Precision Oncology seeks to identify and target the mutation that drives a tumor. Despite its straightforward rationale, concerns about its effectiveness are mounting. What is the biological explanation for the "imprecision?" First, Precision Oncology relies on indiscriminate sequencing of genomes in biopsies that barely represent the heterogeneous mix of tumor cells. Second, findings that defy the orthodoxy of oncogenic "driver mutations" are now accumulating: the ubiquitous presence of oncogenic mutations in silent premalignancies or the dynamic switching without mutations between various cell phenotypes that promote progression. Most troublesome is the observation that cancer cells that survive treatment still will have suffered cytotoxic stress and thereby enter a stem cell-like state, the seeds for recurrence. The benefit of "precision targeting" of mutations is inherently limited by this counterproductive effect. These findings confirm that there is no precise linear causal relationship between tumor genotype and phenotype, a reminder of logician Carveth Read's caution that being vaguely right may be preferable to being precisely wrong. An open-minded embrace of the latest inconvenient findings indicating nongenetic and "imprecise" phenotype dynamics of tumors as summarized in this review will be paramount if Precision Oncology is ultimately to lead to clinical benefits. Cancer Res; 77(23); 6473-9. ©2017 AACR . ©2017 American Association for Cancer Research.

  15. Numerical precision control and GRACE

    Fujimoto, J.; Hamaguchi, N.; Ishikawa, T.; Kaneko, T.; Morita, H.; Perret-Gallix, D.; Tokura, A.; Shimizu, Y.

    2006-01-01

    The control of the numerical precision of large-scale computations like those generated by the GRACE system for automatic Feynman diagram calculations has become an intrinsic part of those packages. Recently, Hitachi Ltd. has developed in FORTRAN a new library HMLIB for quadruple and octuple precision arithmetic where the number of lost-bits is made available. This library has been tested with success on the 1-loop radiative correction to e+e- → e+e-τ+τ-. It is shown that the approach followed by HMLIB provides an efficient way to track down the source of numerical significance losses and to deliver high-precision results while minimizing computing time.
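
    The HMLIB library itself is a Fortran package, but the motivation for extended precision and lost-bit accounting can be illustrated with any multiprecision tool. The Python sketch below uses mpmath as a stand-in and shows a catastrophic cancellation in which ordinary double precision loses every significant digit; the test function and working precision are arbitrary choices.

        import math
        from mpmath import mp, mpf, cos as mpcos

        def f_double(x):
            # 1 - cos(x) for tiny x cancels almost all significant bits in double precision
            return (1.0 - math.cos(x)) / (x * x)

        def f_extended(x, digits=50):
            mp.dps = digits                       # work with 50 significant digits
            xm = mpf(x)
            return (1 - mpcos(xm)) / (xm * xm)

        x = 1e-8
        print("double precision  :", f_double(x))     # 0.0 -> all significant digits lost
        print("50-digit precision:", f_extended(x))   # ~0.5, the correct limiting value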

  16. Spatial Operations

    Anda VELICANU

    2010-09-01

    Full Text Available This paper contains a brief description of the most important operations that can be performed on spatial data such as spatial queries, create, update, insert, delete operations, conversions, operations on the map or analysis on grid cells. Each operation has a graphical example and some of them have code examples in Oracle and PostgreSQL.
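
    The article's code examples are in Oracle and PostgreSQL; as a rough, language-neutral illustration of the same kinds of spatial operations (queries, overlays, buffers), here is a short Python sketch using the shapely library with made-up geometries.

        from shapely.geometry import Point, Polygon

        # two hypothetical parcels and a point of interest
        parcel_a = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
        parcel_b = Polygon([(2, 2), (6, 2), (6, 6), (2, 6)])
        well = Point(3, 3)

        print(parcel_a.contains(well))              # spatial query: True
        overlap = parcel_a.intersection(parcel_b)   # overlay producing a new geometry
        print(overlap.area)                         # 4.0
        buffer_zone = well.buffer(1.5)              # buffer (protection zone) around the point
        print(buffer_zone.intersects(parcel_b))     # True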

  17. Spatializing Time

    Thomsen, Bodil Marie Stavning

    2011-01-01

    The article analyses some of artist Søren Lose's photographic installations in which time, history and narration are reflected in the creation of allegoric, spatial relations.

  18. Spatial Computation

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  19. Contacting nanowires and nanotubes with atomic precision for electronic transport

    Qin, Shengyong; Hellstrom, Sondra; Bao, Zhenan; Boyanov, Boyan; Li, An-Ping

    2012-01-01

    Making contacts to nanostructures with atomic precision is an important process in the bottom-up fabrication and characterization of electronic nanodevices. Existing contacting techniques use top-down lithography and chemical etching, but lack atomic precision and introduce the possibility of contamination. Here, we report that a field-induced emission process can be used to make local contacts onto individual nanowires and nanotubes with atomic spatial precision. The gold nano-islands are deposited onto nanostructures precisely by using a scanning tunneling microscope tip, which provides a clean and controllable method to ensure both electrically conductive and mechanically reliable contacts. To demonstrate the wide applicability of the technique, nano-contacts are fabricated on silicide atomic wires, carbon nanotubes, and copper nanowires. The electrical transport measurements are performed in situ by utilizing the nanocontacts to bridge the nanostructures to the transport probes. © 2012 American Institute of Physics.

  20. The demand for environmental quality and the environmental Kuznets Curve hypothesis

    Khanna, Neha; Plassmann, Florenz

    2004-01-01

    Household demand for better environmental quality is the key factor in the long-term global applicability of the Environmental Kuznets Curve (EKC) hypothesis. We argue that, for given consumer preferences, the threshold income level at which the EKC turns downwards or the equilibrium income elasticity changes sign from positive to negative depends on the ability to spatially separate production and consumption. We test our hypothesis by estimating the equilibrium income elasticities of five pollutants, using 1990 data for the United States. We find that the change in sign occurs at lower income levels for pollutants for which spatial separation is relatively easy as compared to pollutants for which spatial separation is difficult. Our results suggest that even high-income households in the United States have not yet reached the income level at which their demand for better environmental quality is high enough to cause the income-pollution relationship to turn downwards for all the pollutants that we analyzed
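
    For reference, EKC studies typically estimate a reduced-form relation in which the income elasticity of emissions changes sign at a threshold income. A generic version of that specification (in LaTeX notation; the abstract does not state the authors' exact estimating equation) is:

        \ln E \;=\; \beta_0 + \beta_1 \ln y + \beta_2 (\ln y)^2 + \varepsilon,
        \qquad
        \eta(y) \;\equiv\; \frac{\partial \ln E}{\partial \ln y} \;=\; \beta_1 + 2\beta_2 \ln y,
        \qquad
        y^{*} \;=\; \exp\!\left(-\frac{\beta_1}{2\beta_2}\right),

    so that with \beta_2 < 0 the elasticity is positive below the turning-point income y^{*} and negative above it.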

  1. The demand for environmental quality and the environmental Kuznets Curve hypothesis

    Khanna, Neha [Department of Economics and Environmental Studies Program, Binghamton, University (LT 1004), P.O. Box 6000, Binghamton, NY 13902-6000 (United States); Plassmann, Florenz [Department of Economics, Binghamton University (LT 904), P.O. Box 6000, Binghamton, NY 13902-6000 (United States)

    2004-12-01

    Household demand for better environmental quality is the key factor in the long-term global applicability of the Environmental Kuznets Curve (EKC) hypothesis. We argue that, for given consumer preferences, the threshold income level at which the EKC turns downwards or the equilibrium income elasticity changes sign from positive to negative depends on the ability to spatially separate production and consumption. We test our hypothesis by estimating the equilibrium income elasticities of five pollutants, using 1990 data for the United States. We find that the change in sign occurs at lower income levels for pollutants for which spatial separation is relatively easy as compared to pollutants for which spatial separation is difficult. Our results suggest that even high-income households in the United States have not yet reached the income level at which their demand for better environmental quality is high enough to cause the income-pollution relationship to turn downwards for all the pollutants that we analyzed.

  2. Lung Cancer Precision Medicine Trials

    Patients with lung cancer are benefiting from the boom in targeted and immune-based therapies. With a series of precision medicine trials, NCI is keeping pace with the rapidly changing treatment landscape for lung cancer.

  3. Precision engineering: an evolutionary perspective.

    Evans, Chris J

    2012-08-28

    Precision engineering is a relatively new name for a technology with roots going back over a thousand years; those roots span astronomy, metrology, fundamental standards, manufacturing and money-making (literally). Throughout that history, precision engineers have created links across disparate disciplines to generate innovative responses to society's needs and wants. This review combines historical and technological perspectives to illuminate precision engineering's current character and directions. It first provides a working definition of precision engineering and then reviews the subject's roots. Examples are given showing the contributions of the technology to society, while simultaneously showing the creative tension between the technological convergence that spurs new directions and the vertical disintegration that optimizes manufacturing economics.

  4. How GNSS Enables Precision Farming

    2014-12-01

    Precision farming: feeding a growing population by enabling those who feed the world. Immediate and ongoing needs: population growth (more to feed) and urbanization (decrease in arable land). Food production must double by 2050 to meet world demand. To meet thi...

  5. Using Big Data Analytics to Advance Precision Radiation Oncology.

    McNutt, Todd R; Benedict, Stanley H; Low, Daniel A; Moore, Kevin; Shpitser, Ilya; Jiang, Wei; Lakshminarayanan, Pranav; Cheng, Zhi; Han, Peijin; Hui, Xuan; Nakatsugawa, Minoru; Lee, Junghoon; Moore, Joseph A; Robertson, Scott P; Shah, Veeraj; Taylor, Russ; Quon, Harry; Wong, John; DeWeese, Theodore

    2018-06-01

    Big clinical data analytics as a primary component of precision medicine is discussed, identifying where these emerging tools fit in the spectrum of genomics and radiomics research. A learning health system (LHS) is conceptualized that uses clinically acquired data with machine learning to advance the initiatives of precision medicine. The LHS is comprehensive and can be used for clinical decision support, discovery, and hypothesis derivation. These developing uses can positively impact the ultimate management and therapeutic course for patients. The conceptual model for each use of clinical data, however, is different, and an overview of the implications is discussed. With advancements in technologies and culture to improve the efficiency, accuracy, and breadth of measurements of the patient condition, the concept of an LHS may be realized in precision radiation therapy. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Spatial manipulation with microfluidics

    Benjamin eLin

    2015-04-01

    Full Text Available Biochemical gradients convey information through space, time, and concentration, and are ultimately capable of spatially resolving distinct cellular phenotypes, such as differentiation, proliferation, and migration. How these gradients develop, evolve, and function during development, homeostasis, and various disease states is a subject of intense interest across a variety of disciplines. Microfluidic technologies have become essential tools for investigating gradient sensing in vitro due to their ability to precisely manipulate fluids on demand in well controlled environments at cellular length scales. This minireview will highlight their utility for studying gradient sensing along with relevant applications to biology.

  7. Mapping spatial patterns with morphological image processing

    Peter Vogt; Kurt H. Riitters; Christine Estreguil; Jacek Kozak; Timothy G. Wade; James D. Wickham

    2006-01-01

    We use morphological image processing for classifying spatial patterns at the pixel level on binary land-cover maps. Land-cover pattern is classified as 'perforated,' 'edge,' 'patch,' and 'core' with higher spatial precision and thematic accuracy compared to a previous approach based on image convolution, while retaining the...
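
    As a rough illustration of the idea (not the authors' exact morphological method), the hypothetical Python sketch below classifies pixels of a binary land-cover map into core, perforated, edge, and patch classes using binary erosion, hole filling, and connected components from scipy.ndimage; the function name, class codes, and edge width are illustrative assumptions.

        import numpy as np
        from scipy import ndimage

        def classify_pattern(forest, edge_width=1):
            # Toy pixel-level pattern map for a boolean land-cover array `forest`:
            # 0 = background, 1 = core, 2 = perforated, 3 = edge, 4 = patch.
            s = np.ones((3, 3), dtype=bool)                      # 8-connected neighbourhood
            core = ndimage.binary_erosion(forest, s, iterations=edge_width)
            holes = ndimage.binary_fill_holes(forest) & ~forest  # interior background
            boundary = forest & ~core                            # foreground near background

            labels, _ = ndimage.label(forest, structure=s)
            with_core = np.isin(labels, np.unique(labels[core])) & forest

            out = np.zeros(forest.shape, dtype=np.uint8)
            out[core] = 1
            out[boundary & with_core & ndimage.binary_dilation(holes, s)] = 2  # perforated
            out[boundary & with_core & (out == 0)] = 3                         # edge
            out[forest & ~with_core] = 4                                       # patch too small for core
            return out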

  8. FROM PERSONALIZED TO PRECISION MEDICINE

    K. V. Raskina

    2017-01-01

    Full Text Available The need to maintain a high quality of life against a backdrop of its inevitably increasing duration is one of the main problems of modern health care. The concept of the "right drug to the right patient at the right time", initially known as "personalized" medicine, is now unanimously endorsed by the international scientific community as "precision medicine". Precision medicine takes all individual characteristics into account: genetic diversity, environment, lifestyle, and even the bacterial microflora, and it also involves the use of the latest technological developments, which serves to ensure that each patient gets the assistance that fits his or her state best. In the United States, Canada and France, national precision medicine programs have already been presented and implemented. The aim of this review is to describe the dynamic integration of precision medicine methods into routine medical practice and the life of modern society. The description of the new paradigm's prospects is complemented by figures demonstrating the success already achieved in the application of precision methods, for example in the targeted therapy of cancer. All in all, the presence of real-life examples proving the regularity of the transition to a new paradigm, and the wide range of available and constantly evolving technical and diagnostic capabilities, make the all-round transition to precision medicine almost inevitable.

  9. Tests of the Giant Impact Hypothesis

    Jones, J. H.

    1998-01-01

    The giant impact hypothesis has gained popularity as a means of explaining a volatile-depleted Moon that still has a chemical affinity to the Earth. As Taylor's Axiom decrees, the best models of lunar origin are testable, but this is difficult with the giant impact model. The energy associated with the impact would be sufficient to totally melt and partially vaporize the Earth, which means that there should be no geological vestige of earlier times. Accordingly, it is important to devise tests that may be used to evaluate the giant impact hypothesis. Three such tests are discussed here. None of these is supportive of the giant impact model, but neither do they disprove it.

  10. The discovered preference hypothesis - an empirical test

    Lundhede, Thomas; Ladenburg, Jacob; Olsen, Søren Bøye

    Using stated preference methods for valuation of non-market goods is known to be vulnerable to a range of biases. Some authors claim that these so-called anomalies in effect render the methods useless for the purpose. However, the Discovered Preference Hypothesis, as put forth by Plott [31], offers...... an interpretation and explanation of biases which entails that the stated preference methods need not be completely written off. In this paper we conduct a test of the validity and relevance of the DPH interpretation of biases. In a choice experiment concerning preferences for protection of Danish nature areas...... as respondents evaluate more and more choice sets. This finding supports the Discovered Preference Hypothesis interpretation and explanation of starting point bias....

  11. The Hypothesis-Driven Physical Examination.

    Garibaldi, Brian T; Olson, Andrew P J

    2018-05-01

    The physical examination remains a vital part of the clinical encounter. However, physical examination skills have declined in recent years, in part because of decreased time at the bedside. Many clinicians question the relevance of physical examinations in the age of technology. A hypothesis-driven approach to teaching and practicing the physical examination emphasizes the performance of maneuvers that can alter the likelihood of disease. Likelihood ratios are diagnostic weights that allow clinicians to estimate the post-test probability of disease. This hypothesis-driven approach to the physical examination increases its value and efficiency, while preserving its cultural role in the patient-physician relationship. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. MOLIERE: Automatic Biomedical Hypothesis Generation System.

    Sybrandt, Justin; Shtutman, Michael; Safro, Ilya

    2017-08-01

    Hypothesis generation is becoming a crucial time-saving technique which allows biomedical researchers to quickly discover implicit connections between important concepts. Typically, these systems operate on domain-specific fractions of public medical data. MOLIERE, in contrast, utilizes information from over 24.5 million documents. At the heart of our approach lies a multi-modal and multi-relational network of biomedical objects extracted from several heterogeneous datasets from the National Center for Biotechnology Information (NCBI). These objects include but are not limited to scientific papers, keywords, genes, proteins, diseases, and diagnoses. We model hypotheses using Latent Dirichlet Allocation applied on abstracts found near shortest paths discovered within this network, and demonstrate the effectiveness of MOLIERE by performing hypothesis generation on historical data. Our network, implementation, and resulting data are all publicly available for the broad scientific community.

  13. The Method of Hypothesis in Plato's Philosophy

    Malihe Aboie Mehrizi

    2016-09-01

    Full Text Available The article deals with the examination of the method of hypothesis in Plato's philosophy. This method will be examined in three dialogues, Meno, Phaedon and Republic, in which it is explicitly indicated. It will be shown how Plato's attitude towards the position and usage of the method of hypothesis changed over the course of his philosophy. In Meno, drawing on geometry, Plato attempts to introduce a method that can be used in the realm of philosophy. But ultimately, in Republic, Plato's special attention to the method and its importance in philosophical investigations leads him to revise it. Here, finally, Plato introduces the particular method of philosophy, i.e., the dialectic.

  14. Debates—Hypothesis testing in hydrology: Introduction

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  15. Hypothesis testing of scientific Monte Carlo calculations

    Wallerberger, Markus; Gull, Emanuel

    2017-11-01

    The steadily increasing size of scientific Monte Carlo simulations and the desire for robust, correct, and reproducible results necessitates rigorous testing procedures for scientific simulations in order to detect numerical problems and programming bugs. However, the testing paradigms developed for deterministic algorithms have proven to be ill suited for stochastic algorithms. In this paper we demonstrate explicitly how the technique of statistical hypothesis testing, which is in wide use in other fields of science, can be used to devise automatic and reliable tests for Monte Carlo methods, and we show that these tests are able to detect some of the common problems encountered in stochastic scientific simulations. We argue that hypothesis testing should become part of the standard testing toolkit for scientific simulations.
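
    The idea can be illustrated with a minimal sketch (my own, not the authors' framework): when a simulation estimates a quantity whose exact value is known, a two-sided z-test on the sample mean flags output that is statistically incompatible with the truth. Function names and the significance level are assumptions.

        import numpy as np
        from scipy import stats

        def mc_mean_test(samples, true_value, alpha=0.01):
            # Two-sided z-test: is the Monte Carlo estimate consistent with the
            # known expectation?  Relies on the CLT, so `samples` should be large.
            n = samples.size
            estimate = samples.mean()
            stderr = samples.std(ddof=1) / np.sqrt(n)
            z = (estimate - true_value) / stderr
            p = 2.0 * stats.norm.sf(abs(z))          # two-sided tail probability
            return estimate, p, p >= alpha           # True = test passed

        # Example: estimate E[X] = 0.5 for X ~ Uniform(0, 1).
        rng = np.random.default_rng(42)
        est, p, ok = mc_mean_test(rng.uniform(0.0, 1.0, size=100_000), true_value=0.5)
        print(f"estimate={est:.5f}  p={p:.3f}  passed={ok}")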

  16. Reverse hypothesis machine learning a practitioner's perspective

    Kulkarni, Parag

    2017-01-01

    This book introduces a paradigm of reverse hypothesis machines (RHM), focusing on knowledge innovation and machine learning. Knowledge-acquisition-based learning is constrained by large volumes of data and is time consuming. Hence, knowledge-innovation-based learning is the need of the hour. Since under-learning results in cognitive inabilities and over-learning compromises freedom, there is a need for optimal machine learning. All existing learning techniques rely on mapping input and output and establishing mathematical relationships between them. Though methods change, the paradigm remains the same: the forward hypothesis machine paradigm, which tries to minimize uncertainty. The RHM, on the other hand, makes use of uncertainty for creative learning. The approach uses limited data to help identify new and surprising solutions. It focuses on improving learnability, unlike traditional approaches, which focus on accuracy. The book is useful as a reference book for machine learning researchers and professionals as ...

  17. Exploring heterogeneous market hypothesis using realized volatility

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

    This study investigates the heterogeneous market hypothesis using high frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicated the presence of long memory behaviour and predictability elements in the financial time series, which supports the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power variation realized volatilities for forecast evaluation and risk measurement, in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used in determining the market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies and risk management.
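
    As a hedged illustration of the ingredients named in the abstract (not the authors' specification), the sketch below computes the sum-of-squares realized variance from intraday returns and fits a standard HAR-style regression with daily, weekly (5-day) and monthly (22-day) components by ordinary least squares; variable names and window lengths are assumptions.

        import numpy as np

        def realized_variance(intraday_returns):
            # Daily realized variance: sum of squared intraday returns (one row per day).
            return np.sum(np.asarray(intraday_returns) ** 2, axis=1)

        def har_fit(rv):
            # HAR-style OLS regression:
            # RV[t+1] = b0 + bd*RV[t] + bw*mean(RV[t-4:t+1]) + bm*mean(RV[t-21:t+1]) + e
            rv = np.asarray(rv, dtype=float)
            daily = rv[21:-1]
            weekly = np.array([rv[t - 4:t + 1].mean() for t in range(21, len(rv) - 1)])
            monthly = np.array([rv[t - 21:t + 1].mean() for t in range(21, len(rv) - 1)])
            X = np.column_stack([np.ones_like(daily), daily, weekly, monthly])
            y = rv[22:]
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return beta  # [b0, b_daily, b_weekly, b_monthly]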

  18. Water Taxation and the Double Dividend Hypothesis

    Nicholas Kilimani

    2014-01-01

    The double dividend hypothesis contends that environmental taxes have the potential to yield multiple benefits for the economy. However, empirical evidence of the potential impacts of environmental taxation in developing countries is still limited. This paper seeks to contribute to the literature by exploring the impact of a water tax in a developing country context, with Uganda as a case study. Policy makers in Uganda are exploring ways of raising revenue by taxing environmental goods such a...

  19. [Working memory, phonological awareness and spelling hypothesis].

    Gindri, Gigiane; Keske-Soares, Márcia; Mota, Helena Bolli

    2007-01-01

    Working memory, phonological awareness and spelling hypothesis. To verify the relationship between working memory, phonological awareness and spelling hypothesis in pre-school children and first graders. Participants of this study were 90 students, belonging to state schools, who presented typical linguistic development. Forty students were preschoolers, with the average age of six and 50 students were first graders, with the average age of seven. Participants were submitted to an evaluation of the working memory abilities based on the Working Memory Model (Baddeley, 2000), involving phonological loop. Phonological loop was evaluated using the Auditory Sequential Test, subtest 5 of Illinois Test of Psycholinguistic Abilities (ITPA), Brazilian version (Bogossian & Santos, 1977), and the Meaningless Words Memory Test (Kessler, 1997). Phonological awareness abilities were investigated using the Phonological Awareness: Instrument of Sequential Assessment (CONFIAS - Moojen et al., 2003), involving syllabic and phonemic awareness tasks. Writing was characterized according to Ferreiro & Teberosky (1999). Preschoolers presented the ability of repeating sequences of 4.80 digits and 4.30 syllables. Regarding phonological awareness, the performance in the syllabic level was of 19.68 and in the phonemic level was of 8.58. Most of the preschoolers demonstrated to have a pre-syllabic writing hypothesis. First graders repeated, in average, sequences of 5.06 digits and 4.56 syllables. These children presented a phonological awareness of 31.12 in the syllabic level and of 16.18 in the phonemic level, and demonstrated to have an alphabetic writing hypothesis. The performance of working memory, phonological awareness and spelling level are inter-related, as well as being related to chronological age, development and scholarity.

  20. Privacy on Hypothesis Testing in Smart Grids

    Li, Zuxing; Oechtering, Tobias

    2015-01-01

    In this paper, we study the problem of privacy information leakage in a smart grid. The privacy risk is assumed to be caused by an unauthorized binary hypothesis test of the consumer's behaviour based on the smart meter readings of the energy supplied by the energy provider. Additional energy is produced by an alternative energy source. A controller equipped with an energy storage device manages the energy inflows to satisfy the energy demand of the consumer. We study the optimal ener...

  1. Box-particle probability hypothesis density filtering

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  2. Quantum effects and hypothesis of cosmic censorship

    Parnovskij, S.L.

    1989-01-01

    It is shown that filamentary characteristics with a linear mass of less than 10^25 g/cm distort space-time only slightly at distances exceeding the Planck length. Their formation doesn't change the vacuum energy and doesn't lead to strong quantum radiation. Therefore, the problem of their occurrence can be considered within the framework of classical collapse. Quantum effects can be ignored when considering the problem of the validity of the cosmic censorship hypothesis.

  3. Rumlig kultur / Spatial Culture

    RUMLIG KULTUR / SPATIAL CULTURE presents a humanities engagement with the experiential world of the big city. The topics of its 21 chapters range from the sculptor Bjørn Nørgaard and the Bispebjerg Bakke housing development to the sense of place in modern guidebooks. Along the way, thinkers of urban culture such as Steen...... articles a research field for spatial culture in which all manner of sensory and reflective forms come together. Based in humanistic urban studies as practiced in the Department of Arts and Cultural Studies, University of Copenhagen, SPATIAL CULTURE outlines a novel framework for understanding the social...... and cultural environments of the modern and contemporary metropolis. The contributions focus on urban and suburban cultures of Copenhagen, New York, Hong Kong, Berlin and elsewhere, demonstrating how the precise analysis of cultural and artistic phenomena informs a multilayered understanding

  4. Spatial assimilation?

    Andersen, Hans Skifter

    of housing and neighbourhoods. In this paper the difference over time between the residential careers of immigrants and of Danes during the years after the immigrants' arrival is examined. The hypothesis tested is that immigrants' residential situation gets closer to that of comparable Danes in the course of time...

  5. The (not so) immortal strand hypothesis.

    Tomasetti, Cristian; Bozic, Ivana

    2015-03-01

    Non-random segregation of DNA strands during stem cell replication has been proposed as a mechanism to minimize accumulated genetic errors in stem cells of rapidly dividing tissues. According to this hypothesis, an "immortal" DNA strand is passed to the stem cell daughter and not the more differentiated cell, keeping the stem cell lineage replication error-free. After it was introduced, experimental evidence both in favor and against the hypothesis has been presented. Using a novel methodology that utilizes cancer sequencing data we are able to estimate the rate of accumulation of mutations in healthy stem cells of the colon, blood and head and neck tissues. We find that in these tissues mutations in stem cells accumulate at rates strikingly similar to those expected without the protection from the immortal strand mechanism. Utilizing an approach that is fundamentally different from previous efforts to confirm or refute the immortal strand hypothesis, we provide evidence against non-random segregation of DNA during stem cell replication. Our results strongly suggest that parental DNA is passed randomly to stem cell daughters and provides new insight into the mechanism of DNA replication in stem cells. Copyright © 2015. Published by Elsevier B.V.

  6. A test of the orthographic recoding hypothesis

    Gaygen, Daniel E.

    2003-04-01

    The Orthographic Recoding Hypothesis [D. E. Gaygen and P. A. Luce, Percept. Psychophys. 60, 465-483 (1998)] was tested. According to this hypothesis, listeners recognize spoken words heard for the first time by mapping them onto stored representations of the orthographic forms of the words. Listeners have a stable orthographic representation of words, but no phonological representation, when those words have been read frequently but never heard or spoken. Such may be the case for low frequency words such as jargon. Three experiments using visually and auditorily presented nonword stimuli tested this hypothesis. The first two experiments were explicit tests of memory (old-new tests) for words presented visually. In the first experiment, the recognition of auditorily presented nonwords was facilitated when they previously appeared on a visually presented list. The second experiment was similar, but included a concurrent articulation task during a visual word list presentation, thus preventing covert rehearsal of the nonwords. The results were similar to the first experiment. The third experiment was an indirect test of memory (auditory lexical decision task) for visually presented nonwords. Auditorily presented nonwords were identified as nonwords significantly more slowly if they had previously appeared on the visually presented list accompanied by a concurrent articulation task.

  7. Consumer health information seeking as hypothesis testing.

    Keselman, Alla; Browne, Allen C; Kaufman, David R

    2008-01-01

    Despite the proliferation of consumer health sites, lay individuals often experience difficulty finding health information online. The present study attempts to understand users' information seeking difficulties by drawing on a hypothesis testing explanatory framework. It also addresses the role of user competencies and their interaction with internet resources. Twenty participants were interviewed about their understanding of a hypothetical scenario about a family member suffering from stable angina and then searched MedlinePlus consumer health information portal for information on the problem presented in the scenario. Participants' understanding of heart disease was analyzed via semantic analysis. Thematic coding was used to describe information seeking trajectories in terms of three key strategies: verification of the primary hypothesis, narrowing search within the general hypothesis area and bottom-up search. Compared to an expert model, participants' understanding of heart disease involved different key concepts, which were also differently grouped and defined. This understanding provided the framework for search-guiding hypotheses and results interpretation. Incorrect or imprecise domain knowledge led individuals to search for information on irrelevant sites, often seeking out data to confirm their incorrect initial hypotheses. Online search skills enhanced search efficiency, but did not eliminate these difficulties. Regardless of their web experience and general search skills, lay individuals may experience difficulty with health information searches. These difficulties may be related to formulating and evaluating hypotheses that are rooted in their domain knowledge. Informatics can provide support at the levels of health information portals, individual websites, and consumer education tools.

  8. Precision half-life measurement of 11C: The most precise mirror transition Ft value

    Valverde, A. A.; Brodeur, M.; Ahn, T.; Allen, J.; Bardayan, D. W.; Becchetti, F. D.; Blankstein, D.; Brown, G.; Burdette, D. P.; Frentz, B.; Gilardy, G.; Hall, M. R.; King, S.; Kolata, J. J.; Long, J.; Macon, K. T.; Nelson, A.; O'Malley, P. D.; Skulski, M.; Strauss, S. Y.; Vande Kolk, B.

    2018-03-01

    Background: The precise determination of the Ft value in T = 1/2 mixed mirror decays is an important avenue for testing the standard model of the electroweak interaction through the determination of V_ud in nuclear β decays. 11C is an interesting case, as its low mass and small Q_EC value make it particularly sensitive to violations of the conserved vector current hypothesis. The present dominant source of uncertainty in the 11C Ft value is the half-life. Purpose: A high-precision measurement of the 11C half-life was performed, and a new world average half-life was calculated. Method: 11C was created by transfer reactions and separated using the TwinSol facility at the Nuclear Science Laboratory at the University of Notre Dame. It was then implanted into a tantalum foil, and β counting was used to determine the half-life. Results: The new half-life, t_1/2 = 1220.27(26) s, is consistent with the previous values but significantly more precise. A new world average was calculated, t_1/2(world) = 1220.41(32) s, and a new estimate for the Gamow-Teller to Fermi mixing ratio ρ is presented along with standard model correlation parameters. Conclusions: The new 11C world average half-life allows the calculation of an Ft(mirror) value that is now the most precise value for all superallowed mixed mirror transitions. This gives a strong impetus for an experimental determination of ρ, to allow for the determination of V_ud from this decay.
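
    For orientation only (generic relations, not details taken from the article): β counting determines the half-life through the exponential decay law, and a world-average half-life is conventionally an inverse-variance weighted mean of the individual measurements.

        % Exponential decay fitted to the beta-counting data:
        N(t) = N_0 \, e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda}.

        % Inverse-variance weighted mean of k independent half-life measurements
        % t_i with uncertainties \sigma_i:
        \bar{t}_{1/2} = \frac{\sum_{i=1}^{k} t_i / \sigma_i^2}{\sum_{i=1}^{k} 1 / \sigma_i^2},
        \qquad
        \sigma_{\bar{t}} = \Bigl( \sum_{i=1}^{k} 1/\sigma_i^2 \Bigr)^{-1/2}.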

  9. Drift chambers for a large-area, high-precision muon spectrometer

    Alberini, C.; Bari, G.; Cara Romeo, G.; Cifarelli, L.; Del Papa, C.; Iacobucci, G.; Laurenti, G.; Maccarrone, G.; Massam, T.; Motta, F.; Nania, R.; Perotto, E.; Prisco, G.; Willutsky, M.; Basile, M.; Contin, A.; Palmonari, F.; Sartorelli, G.

    1987-01-01

    We have tested two prototypes of high-precision drift chamber for a magnetic muon spectrometer. Results of the tests are presented, with special emphasis on their efficiency and spatial resolution as a function of particle rate. (orig.)

  10. Spatial Theography

    van Noppen, Jean Pierre

    1995-01-01

    Descriptive theology («theography») frequently resorts to metaphorical modes of meaning. Among these metaphors, the spatial language of localization and orientation plays an important role to delineate tentative insights into the relationship between the human and the divine. These spatial metaphors are presumably based on the universal human experience of interaction between the body and its environment. It is dangerous, however, to postulate universal agreement on meanings associated with s...

  11. Nanomaterials for Cancer Precision Medicine.

    Wang, Yilong; Sun, Shuyang; Zhang, Zhiyuan; Shi, Donglu

    2018-04-01

    Medical science has recently advanced to the point where diagnosis and therapeutics can be carried out with high precision, even at the molecular level. A new field of "precision medicine" has consequently emerged with specific clinical implications and challenges that can be well-addressed by newly developed nanomaterials. Here, a nanoscience approach to precision medicine is provided, with a focus on cancer therapy, based on a new concept of "molecularly-defined cancers." "Next-generation sequencing" is introduced to identify the oncogene that is responsible for a class of cancers. This new approach is fundamentally different from all conventional cancer therapies that rely on diagnosis of the anatomic origins where the tumors are found. To treat cancers at molecular level, a recently developed "microRNA replacement therapy" is applied, utilizing nanocarriers, in order to regulate the driver oncogene, which is the core of cancer precision therapeutics. Furthermore, the outcome of the nanomediated oncogenic regulation has to be accurately assessed by the genetically characterized, patient-derived xenograft models. Cancer therapy in this fashion is a quintessential example of precision medicine, presenting many challenges to the materials communities with new issues in structural design, surface functionalization, gene/drug storage and delivery, cell targeting, and medical imaging. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Precision Medicine and Men's Health.

    Mata, Douglas A; Katchi, Farhan M; Ramasamy, Ranjith

    2017-07-01

    Precision medicine can greatly benefit men's health by helping to prevent, diagnose, and treat prostate cancer, benign prostatic hyperplasia, infertility, hypogonadism, and erectile dysfunction. For example, precision medicine can facilitate the selection of men at high risk for prostate cancer for targeted prostate-specific antigen screening and chemoprevention administration, as well as assist in identifying men who are resistant to medical therapy for prostatic hyperplasia, who may instead require surgery. Precision medicine-trained clinicians can also let couples know whether their specific cause of infertility should be bypassed by sperm extraction and in vitro fertilization to prevent abnormalities in their offspring. Though precision medicine's role in the management of hypogonadism has yet to be defined, it could be used to identify biomarkers associated with individual patients' responses to treatment so that appropriate therapy can be prescribed. Last, precision medicine can improve erectile dysfunction treatment by identifying genetic polymorphisms that regulate response to medical therapies and by aiding in the selection of patients for further cardiovascular disease screening.

  13. Precision Medicine in Gastrointestinal Pathology.

    Wang, David H; Park, Jason Y

    2016-05-01

    Context: Precision medicine is the promise of individualized therapy and management of patients based on their personal biology. There are now multiple global initiatives to perform whole-genome sequencing on millions of individuals. In the United States, an early program was the Million Veteran Program, and a more recent proposal in 2015 by the president of the United States is the Precision Medicine Initiative. To implement precision medicine in routine oncology care, genetic variants present in tumors need to be matched with effective clinical therapeutics. When we focus on the current state of precision medicine for gastrointestinal malignancies, it becomes apparent that there is a mixed history of success and failure. Objective: To present the current state of precision medicine using gastrointestinal oncology as a model. We will present currently available targeted therapeutics, promising new findings in clinical genomic oncology, remaining quality issues in genomic testing, and emerging oncology clinical trial designs. Data Sources: Review of the literature including clinical genomic studies on gastrointestinal malignancies, clinical oncology trials on therapeutics targeted to molecular alterations, and emerging clinical oncology study designs. Conclusions: Translating our ability to sequence thousands of genes into meaningful improvements in patient survival will be the challenge for the next decade.

  14. Updating the lamellar hypothesis of hippocampal organization

    Robert S Sloviter

    2012-12-01

    Full Text Available In 1971, Andersen and colleagues proposed that excitatory activity in the entorhinal cortex propagates topographically to the dentate gyrus, and on through a trisynaptic circuit lying within transverse hippocampal slices or lamellae [Andersen, Bliss, and Skrede. 1971. Lamellar organization of hippocampal pathways. Exp Brain Res 13, 222-238]. In this way, a relatively simple structure might mediate complex functions in a manner analogous to the way independent piano keys can produce a nearly infinite variety of unique outputs. The lamellar hypothesis derives primary support from the lamellar distribution of dentate granule cell axons (the mossy fibers, which innervate dentate hilar neurons and area CA3 pyramidal cells and interneurons within the confines of a thin transverse hippocampal segment. Following the initial formulation of the lamellar hypothesis, anatomical studies revealed that unlike granule cells, hilar mossy cells, CA3 pyramidal cells, and Layer II entorhinal cells all form axonal projections that are more divergent along the longitudinal axis than the clearly lamellar mossy fiber pathway. The existence of pathways with translamellar distribution patterns has been interpreted, incorrectly in our view, as justifying outright rejection of the lamellar hypothesis [Amaral and Witter. 1989. The three-dimensional organization of the hippocampal formation: a review of anatomical data. Neuroscience 31, 571-591]. We suggest that the functional implications of longitudinally-projecting axons depend not on whether they exist, but on what they do. The observation that focal granule cell layer discharges normally inhibit, rather than excite, distant granule cells suggests that longitudinal axons in the dentate gyrus may mediate "lateral" inhibition and define lamellar function, rather than undermine it. In this review, we attempt a reconsideration of the evidence that most directly impacts the physiological concept of hippocampal lamellar

  15. Hypothesis Testing as an Act of Rationality

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is because we have historically failed to account for a fundamental component of basic logic - namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox' (1946) calculitic logic is called information theory (Knuth, 2005), and adding information theory our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006). Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.

  16. The conscious access hypothesis: Explaining the consciousness.

    Prakash, Ravi

    2008-01-01

    The phenomenon of conscious awareness or consciousness is complicated but fascinating. Although this concept has intrigued mankind since antiquity, the exploration of consciousness from scientific perspectives is not very old. Among the myriad theories regarding the nature, functions and mechanisms of consciousness, cognitive theories have of late received wider acceptance. One of the most exciting hypotheses in recent times has been the "conscious access hypothesis", based on the "global workspace model of consciousness". It underscores an important property of consciousness, the global access of information in the cerebral cortex. The present article reviews the "conscious access hypothesis" in terms of its theoretical underpinnings as well as the experimental support it has received.

  17. Interstellar colonization and the zoo hypothesis

    Jones, E.M.

    1978-01-01

    Michael Hart and others have pointed out that current estimates of the number of technological civilizations that have arisen in the Galaxy since its formation are in fundamental conflict with the expectation that such a civilization could colonize and utilize the entire Galaxy in 10 to 20 million years. This dilemma can be called Hart's paradox. Resolution of the paradox requires that one or more of the following are true: we are the Galaxy's first technical civilization; interstellar travel is immensely impractical or simply impossible; technological civilizations are very short-lived; or we inhabit a wilderness preserve. The latter is the zoo hypothesis.

  18. RANDOM WALK HYPOTHESIS IN FINANCIAL MARKETS

    Nicolae-Marius JULA

    2017-05-01

    Full Text Available The random walk hypothesis states that stock market prices do not follow a predictable trajectory, but are simply random. When trying to predict a possibly random set of data, one should first test for randomness, because, despite the power and complexity of the models used, the results cannot otherwise be trustworthy. There are several methods for testing these hypotheses, and the use of the computational power provided by the R environment makes the work of the researcher easier and more cost-effective. The increasing power of computing and the continuous development of econometric tests should give potential investors new tools for selecting commodities and investing in efficient markets.
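
    Although the article works in the R environment, the flavour of such a randomness check can be sketched in Python (an illustrative assumption, not the authors' procedure): the classical Wald-Wolfowitz runs test applied to the signs of a return series.

        import numpy as np
        from scipy import stats

        def runs_test(returns):
            # Wald-Wolfowitz runs test on the signs of a return series.
            # A small p-value suggests the sign sequence is not random;
            # a large one is consistent with randomness.
            signs = np.sign(np.asarray(returns))
            signs = signs[signs != 0]                      # drop exact zeros
            n1 = np.sum(signs > 0)
            n2 = np.sum(signs < 0)
            runs = 1 + np.sum(signs[1:] != signs[:-1])     # number of sign runs
            n = n1 + n2
            mu = 2.0 * n1 * n2 / n + 1.0
            var = 2.0 * n1 * n2 * (2.0 * n1 * n2 - n) / (n ** 2 * (n - 1))
            z = (runs - mu) / np.sqrt(var)
            return z, 2.0 * stats.norm.sf(abs(z))

        # Example on simulated i.i.d. returns (should usually not reject randomness):
        rng = np.random.default_rng(0)
        print(runs_test(rng.normal(0.0, 0.01, size=1_000)))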

  19. Confluence Model or Resource Dilution Hypothesis?

    Jæger, Mads

    have a negative effect on educational attainment most studies cannot distinguish empirically between the CM and the RDH. In this paper, I use the different theoretical predictions in the CM and the RDH on the role of cognitive ability as a partial or complete mediator of the sibship size effect......Studies on family background often explain the negative effect of sibship size on educational attainment by one of two theories: the Confluence Model (CM) or the Resource Dilution Hypothesis (RDH). However, as both theories – for substantively different reasons – predict that sibship size should...

  20. Set theory and the continuum hypothesis

    Cohen, Paul J

    2008-01-01

    This exploration of a notorious mathematical problem is the work of the man who discovered the solution. The independence of the continuum hypothesis is the focus of this study by Paul J. Cohen. It presents not only an accessible technical explanation of the author's landmark proof but also a fine introduction to mathematical logic. An emeritus professor of mathematics at Stanford University, Dr. Cohen won two of the most prestigious awards in mathematics: in 1964, he was awarded the American Mathematical Society's Bôcher Prize for analysis; and in 1966, he received the Fields Medal for Logic.

  1. Statistical hypothesis testing with SAS and R

    Taeger, Dirk

    2014-01-01

    A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the

  2. Sensing technologies for precision irrigation

    Ćulibrk, Dubravko; Minic, Vladan; Alonso Fernandez, Marta; Alvarez Osuna, Javier; Crnojevic, Vladimir

    2014-01-01

    This brief provides an overview of state-of-the-art sensing technologies relevant to the problem of precision irrigation, an emerging field within the domain of precision agriculture. Applications of wireless sensor networks, satellite data and geographic information systems in the domain are covered. This brief presents the basic concepts of the technologies and emphasizes the practical aspects that enable the implementation of intelligent irrigation systems. The authors target a broad audience interested in this theme and organize the content in five chapters, each concerned with a specific technology needed to address the problem of optimal crop irrigation. Professionals and researchers will find the text a thorough survey with practical applications.

  3. Precision measurement with atom interferometry

    Wang Jin

    2015-01-01

    Development of atom interferometry and its application in precision measurement are reviewed in this paper. The principle, features and the implementation of atom interferometers are introduced, the recent progress of precision measurement with atom interferometry, including determination of gravitational constant and fine structure constant, measurement of gravity, gravity gradient and rotation, test of weak equivalence principle, proposal of gravitational wave detection, and measurement of quadratic Zeeman shift are reviewed in detail. Determination of gravitational redshift, new definition of kilogram, and measurement of weak force with atom interferometry are also briefly introduced. (topical review)
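
    As background (a textbook relation, not a result quoted from the review): in a light-pulse atom gravimeter with a π/2-π-π/2 pulse sequence, the leading-order interferometer phase is

        \Delta\phi = k_{\mathrm{eff}} \, g \, T^{2},

    where k_eff is the effective two-photon wave vector and T is the time between pulses, so a precise phase measurement translates directly into a precise value of the local gravitational acceleration g.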

  4. ELECTROWEAK PHYSICS AND PRECISION STUDIES

    MARCIANO, W.

    2005-01-01

    The utility of precision electroweak measurements for predicting the Standard Model Higgs mass via quantum loop effects is discussed. Current values of m_W, sin²θ_W(m_Z) in the MS-bar scheme, and m_t imply a relatively light Higgs which is below the direct experimental bound but possibly consistent with Supersymmetry expectations. The existence of Supersymmetry is further suggested by a 2σ discrepancy between experiment and theory for the muon anomalous magnetic moment. Constraints from precision studies on other types of "New Physics" are also briefly described.

  5. Universal precision sine bar attachment

    Mann, Franklin D. (Inventor)

    1989-01-01

    This invention relates to an attachment for a sine bar which can be used to perform measurements during lathe operations or other types of machining operations. The attachment can be used for setting precision angles on vises, dividing heads, rotary tables and angle plates. It can also be used in the inspection of machined parts, when close tolerances are required, and in the layout of precision hardware. The novelty of the invention is believed to reside in a specific versatile sine bar attachment for measuring a variety of angles on a number of different types of equipment.
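
    For reference (the standard sine-bar relation, given as background rather than taken from the patent text): an angle θ is set by placing a gauge-block stack of height H under one roll of a sine bar whose rolls are spaced a distance L apart,

        \sin\theta = \frac{H}{L}
        \quad\Longrightarrow\quad
        H = L \sin\theta .

        % Example: a 5-inch sine bar set to 30 degrees needs
        % H = 5 in x sin(30 deg) = 2.500 in of gauge blocks.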

  6. Introduction to precise numerical methods

    Aberth, Oliver

    2007-01-01

    Precise numerical analysis may be defined as the study of computer methods for solving mathematical problems either exactly or to prescribed accuracy. This book explains how precise numerical analysis is constructed. The book also provides exercises which illustrate points from the text and references for the methods presented. All disc-based content for this title is now available on the Web. Clearer, simpler descriptions and explanations of the various numerical methods are provided, along with two new types of numerical problems: accurately solving partial differential equations with the included software, and computing line integrals in the complex plane.

  7. Transformations and representations supporting spatial perspective taking

    Yu, Alfred B.; Zacks, Jeffrey M.

    2018-01-01

    Spatial perspective taking is the ability to reason about spatial relations relative to another’s viewpoint. Here, we propose a mechanistic hypothesis that relates mental representations of one’s viewpoint to the transformations used for spatial perspective taking. We test this hypothesis using a novel behavioral paradigm that assays patterns of response time and variation in those patterns across people. The results support the hypothesis that people maintain a schematic representation of the space around their body, update that representation to take another’s perspective, and thereby to reason about the space around their body. This is a powerful computational mechanism that can support imitation, coordination of behavior, and observational learning. PMID:29545731

  8. Hypothesis-driven physical examination curriculum.

    Allen, Sharon; Olson, Andrew; Menk, Jeremiah; Nixon, James

    2017-12-01

    Medical students traditionally learn physical examination skills as a rote list of manoeuvres. Alternatives like hypothesis-driven physical examination (HDPE) may promote students' understanding of the contribution of the physical examination to diagnostic reasoning. We sought to determine whether first-year medical students can effectively learn to perform a physical examination using an HDPE approach, and then tailor the examination to specific clinical scenarios. Context: First-year medical students at the University of Minnesota were taught both traditional and HDPE approaches during a required 17-week clinical skills course in their first semester. The end-of-course evaluation assessed HDPE skills: students were assigned one of two cardiopulmonary cases. Each case included two diagnostic hypotheses. During an interaction with a standardised patient, students were asked to select physical examination manoeuvres in order to make a final diagnosis. Items were weighted and selection order was recorded. First-year students with minimal pathophysiology training performed well. All students selected the correct diagnosis. Importantly, students varied the order when selecting examination manoeuvres depending on the diagnoses under consideration, demonstrating early clinical decision-making skills. An early introduction to HDPE may reinforce physical examination skills for hypothesis generation and testing, and can foster early clinical decision-making skills. This has important implications for further research in physical examination instruction. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  9. A default Bayesian hypothesis test for mediation.

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
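
    The mediation model itself (independent variable -> mediator -> dependent variable) can be sketched with a plain frequentist product-of-coefficients estimate; this is only an illustration of the model structure, not the default Bayesian (JZS) test or the BayesMed implementation described in the abstract, and all names and simulated effect sizes are assumptions.

        import numpy as np

        def ols(X, y):
            # Least-squares coefficients with an intercept column prepended.
            X = np.column_stack([np.ones(len(y)), X])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return beta

        rng = np.random.default_rng(1)
        n = 500
        x = rng.normal(size=n)                        # independent variable (e.g., instruction)
        m = 0.5 * x + rng.normal(size=n)              # mediator (e.g., knowledge), a = 0.5
        y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # outcome, b = 0.4, direct effect 0.1

        a = ols(x, m)[1]                              # path a: x -> m
        b = ols(np.column_stack([m, x]), y)[1]        # path b: m -> y, controlling for x
        print(f"indirect (mediated) effect a*b = {a * b:.3f}")   # roughly 0.20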

  10. Gaussian Hypothesis Testing and Quantum Illumination.

    Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario

    2017-09-22

    Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.
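
    For orientation, the asymmetric-error setting the abstract refers to can be written in the standard (quantum Stein) form; this generic statement is background material, not the paper's Gaussian-state formula:

        % Asymmetric hypothesis testing between states \rho (null) and \sigma (alternative)
        % on n copies: keep the type-I error below a fixed \varepsilon and ask how fast
        % the minimal type-II error can decay.
        \beta_n^{*}(\varepsilon) = \min_{0 \le \Lambda_n \le I}
          \bigl\{ \operatorname{Tr}[\sigma^{\otimes n} \Lambda_n] :
                  \operatorname{Tr}[\rho^{\otimes n} (I - \Lambda_n)] \le \varepsilon \bigr\},
        \qquad
        \lim_{n\to\infty} -\tfrac{1}{n}\log \beta_n^{*}(\varepsilon) = D(\rho\|\sigma),

    where D is the quantum relative entropy.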

  11. Inoculation stress hypothesis of environmental enrichment.

    Crofton, Elizabeth J; Zhang, Yafang; Green, Thomas A

    2015-02-01

    One hallmark of psychiatric conditions is the vast continuum of individual differences in susceptibility vs. resilience resulting from the interaction of genetic and environmental factors. The environmental enrichment paradigm is an animal model that is useful for studying a range of psychiatric conditions, including protective phenotypes in addiction and depression models. The major question is how environmental enrichment, a non-drug and non-surgical manipulation, can produce such robust individual differences in such a wide range of behaviors. This paper draws from a variety of published sources to outline a coherent hypothesis of inoculation stress as a factor producing the protective enrichment phenotypes. The basic tenet suggests that chronic mild stress from living in a complex environment and interacting non-aggressively with conspecifics can inoculate enriched rats against subsequent stressors and/or drugs of abuse. This paper reviews the enrichment phenotypes, mulls the fundamental nature of environmental enrichment vs. isolation, discusses the most appropriate control for environmental enrichment, and challenges the idea that cortisol/corticosterone equals stress. The intent of the inoculation stress hypothesis of environmental enrichment is to provide a scaffold with which to build testable hypotheses for the elucidation of the molecular mechanisms underlying these protective phenotypes and thus provide new therapeutic targets to treat psychiatric/neurological conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Athlete's Heart: Is the Morganroth Hypothesis Obsolete?

    Haykowsky, Mark J; Samuel, T Jake; Nelson, Michael D; La Gerche, Andre

    2018-05-01

    In 1975, Morganroth and colleagues reported that the increased left ventricular (LV) mass in highly trained endurance athletes versus nonathletes was primarily due to increased end-diastolic volume while the increased LV mass in resistance trained athletes was solely due to an increased LV wall thickness. Based on the divergent remodelling patterns observed, Morganroth and colleagues hypothesised that the increased "volume" load during endurance exercise may be similar to that which occurs in patients with mitral or aortic regurgitation while the "pressure" load associated with performing a Valsalva manoeuvre (VM) during resistance exercise may mimic the stress imposed on the heart by systemic hypertension or aortic stenosis. Despite widespread acceptance of the four-decade old Morganroth hypothesis in sports cardiology, some investigators have questioned whether such a divergent "athlete's heart" phenotype exists. Given this uncertainty, the purpose of this brief review is to re-evaluate the Morganroth hypothesis regarding: i) the acute effects of resistance exercise performed with a brief VM on LV wall stress, and the patterns of LV remodelling in resistance-trained athletes; ii) the acute effects of endurance exercise on biventricular wall stress, and the time course and pattern of LV and right ventricular (RV) remodelling with endurance training; and iii) the value of comparing "loading" conditions between athletes and patients with cardiac pathology. Copyright © 2018. Published by Elsevier B.V.

  13. The Debt Overhang Hypothesis: Evidence from Pakistan

    Shah Muhammad Imran

    2016-04-01

    Full Text Available This study investigates the debt overhang hypothesis for Pakistan in the period 1960-2007. The study examines empirically the dynamic behaviour of GDP, debt services, the employed labour force and investment using the time series concepts of unit roots, cointegration, error correction and causality. Our findings suggest that debt servicing has a negative impact on the productivity of both labour and capital, and that this in turn has adversely affected economic growth. By severely constraining the country's ability to service debt, this lends support to the debt-overhang hypothesis for Pakistan. The long-run relation between debt services and economic growth implies that future increases in output will drain away in the form of high debt service payments to lender countries, as external debt acts like a tax on output. More specifically, foreign creditors will benefit more from the rise in productivity than will domestic producers and labour. This suggests that domestic labour and capital are the ultimate losers from this heavy debt burden.

  14. Roots and Route of the Artification Hypothesis

    Ellen Dissanayake

    2017-08-01

    Full Text Available Over four decades, my ideas about the arts in human evolution have themselves evolved, from an original notion of art as a human behaviour of "making special" to a full-fledged hypothesis of artification. A summary of the gradual developmental path (or route) of the hypothesis, based on ethological principles and concepts, is given, and an argument is presented in which artification is described as an exaptation whose roots lie in adaptive features of ancestral mother-infant interaction that contributed to infant survival and maternal reproductive success. I show how the interaction displays features of a ritualised behaviour whose operations (formalization, repetition, exaggeration, and elaboration) can be regarded as characteristic elements of human ritual ceremonies as well as of art (including song, dance, performance, literary language, altered surroundings, and other examples of making ordinary sounds, movement, language, environments, objects, and bodies extraordinary). Participation in these behaviours in ritual practices served adaptive ends in early Homo by coordinating brain and body states, and thereby emotionally bonding members of a group in common cause as well as reducing existential anxiety in individuals. A final section situates artification within contemporary philosophical and popular ideas of art, claiming that artifying is not a synonym for or definition of art but is foundational to any evolutionary discussion of artistic/aesthetic behaviour.

  15. Hypothesis: does ochratoxin A cause testicular cancer?

    Schwartz, Gary G

    2002-02-01

    Little is known about the etiology of testicular cancer, which is the most common cancer among young men. Epidemiologic data point to a carcinogenic exposure in early life or in utero, but the nature of the exposure is unknown. We hypothesize that the mycotoxin, ochratoxin A, is a cause of testicular cancer. Ochratoxin A is a naturally occurring contaminant of cereals, pigmeat, and other foods and is a known genotoxic carcinogen in animals. The major features of the descriptive epidemiology of testicular cancer (a high incidence in northern Europe, increasing incidence over time, and associations with high socioeconomic status, and with poor semen quality) are all associated with exposure to ochratoxin A. Exposure of animals to ochratoxin A via the diet or via in utero transfer induces adducts in testicular DNA. We hypothesize that consumption of foods contaminated with ochratoxin A during pregnancy and/or childhood induces lesions in testicular DNA and that puberty promotes these lesions to testicular cancer. We tested the ochratoxin A hypothesis using ecologic data on the per-capita consumption of cereals, coffee, and pigmeat, the principal dietary sources of ochratoxin A. Incidence rates for testicular cancer in 20 countries were significantly correlated with the per-capita consumption of coffee and pigmeat (r = 0.49 and 0.54, p = 0.03 and 0.01). The ochratoxin A hypothesis offers a coherent explanation for much of the descriptive epidemiology of testicular cancer and suggests new avenues for analytic research.
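
    As a quick plausibility check of the reported ecologic correlations (a back-of-the-envelope sketch, assuming a standard two-sided t-test for Pearson's r across the 20 countries, which may differ from the authors' exact method):

        import numpy as np
        from scipy import stats

        def pearson_r_pvalue(r, n):
            # Two-sided p-value for a Pearson correlation r from n paired observations.
            t = r * np.sqrt(n - 2) / np.sqrt(1.0 - r ** 2)
            return 2.0 * stats.t.sf(abs(t), df=n - 2)

        for r in (0.49, 0.54):
            print(f"r = {r:.2f}, n = 20  ->  p = {pearson_r_pvalue(r, 20):.3f}")
        # Yields p-values of roughly 0.03 and 0.01, consistent with the reported figures.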

  16. Urbanization and the more-individuals hypothesis.

    Chiari, Claudia; Dinetti, Marco; Licciardello, Cinzia; Licitra, Gaetano; Pautasso, Marco

    2010-03-01

    1. Urbanization is a landscape process affecting biodiversity world-wide. Despite many urban-rural studies of bird assemblages, it is still unclear whether more species-rich communities have more individuals, regardless of the level of urbanization. The more-individuals hypothesis assumes that species-rich communities have larger populations, thus reducing the chance of local extinctions. 2. Using newly collated avian distribution data for 1 km² grid cells across Florence, Italy, we show a significantly positive relationship between species richness and assemblage abundance for the whole urban area. This richness-abundance relationship persists for the 1 km² grid cells with less than 50% of urbanized territory, as well as for the remaining grid cells, with no significant difference in the slope of the relationship. These results support the more-individuals hypothesis as an explanation of patterns in species richness, even in human-modified and fragmented habitats. 3. However, the intercept of the species richness-abundance relationship is significantly lower for highly urbanized grid cells. Our study confirms that urban communities have lower species richness but counters the common notion that assemblages in densely urbanized ecosystems have more individuals. In Florence, highly inhabited areas show fewer species and lower assemblage abundance. 4. Urbanized ecosystems are an ongoing large-scale natural experiment which can be used to test ecological theories empirically.
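
    The richness-abundance comparison in points 2-3 amounts to regressing species richness on assemblage abundance while allowing the intercept (and, if desired, the slope) to differ between cells below and above 50% urbanized cover. A minimal sketch on invented data, assuming Python with statsmodels; it is not the study's dataset or exact model.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      abundance = rng.uniform(50, 500, 200)                  # individuals per grid cell (invented)
      urban = (rng.uniform(0, 1, 200) > 0.5).astype(float)   # 1 = highly urbanized cell
      richness = 5 + 0.05 * abundance - 4 * urban + rng.normal(0, 2, 200)

      # Common-slope model with a group-specific intercept; adding an
      # abundance*urban interaction term would test for a difference in slope.
      X = sm.add_constant(np.column_stack([abundance, urban]))
      fit = sm.OLS(richness, X).fit()
      print(fit.params)   # intercept, slope on abundance, shift for urbanized cells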

  17. The Younger Dryas impact hypothesis: A requiem

    Pinter, Nicholas; Scott, Andrew C.; Daulton, Tyrone L.; Podoll, Andrew; Koeberl, Christian; Anderson, R. Scott; Ishman, Scott E.

    2011-06-01

    The Younger Dryas (YD) impact hypothesis is a recent theory that suggests that a cometary or meteoritic body or bodies hit and/or exploded over North America 12,900 years ago, causing the YD climate episode, extinction of Pleistocene megafauna, demise of the Clovis archeological culture, and a range of other effects. Since gaining widespread attention in 2007, substantial research has focused on testing the 12 main signatures presented as evidence of a catastrophic extraterrestrial event 12,900 years ago. Here we present a review of the impact hypothesis, including its evolution and current variants, and of efforts to test and corroborate the hypothesis. The physical evidence interpreted as signatures of an impact event can be separated into two groups. The first group consists of evidence that has been largely rejected by the scientific community and is no longer in widespread discussion, including: particle tracks in archeological chert; magnetic nodules in Pleistocene bones; impact origin of the Carolina Bays; and elevated concentrations of radioactivity, iridium, and fullerenes enriched in 3He. The second group consists of evidence that has been active in recent research and discussions: carbon spheres and elongates, magnetic grains and magnetic spherules, byproducts of catastrophic wildfire, and nanodiamonds. Over time, however, these signatures have also seen contrary evidence rather than support. Recent studies have shown that carbon spheres and elongates do not represent extraterrestrial carbon nor impact-induced megafires, but are indistinguishable from fungal sclerotia and arthropod fecal material that are a small but common component of many terrestrial deposits. Magnetic grains and spherules are heterogeneously distributed in sediments, but reported measurements of unique peaks in concentrations at the YD onset have yet to be reproduced. The magnetic grains are certainly just iron-rich detrital grains, whereas reported YD magnetic spherules are

  18. Approaches to informed consent for hypothesis-testing and hypothesis-generating clinical genomics research.

    Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G

    2012-10-10

    Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.

  19. Learning-Related Changes in Adolescents' Neural Networks during Hypothesis-Generating and Hypothesis-Understanding Training

    Lee, Jun-Ki; Kwon, Yongju

    2012-01-01

    Fourteen science high school students participated in this study, which investigated neural-network plasticity associated with hypothesis-generating and hypothesis-understanding in learning. The students were divided into two groups and participated in either hypothesis-generating or hypothesis-understanding type learning programs, which were…

  20. STANFORD (SLAC): Precision electroweak result

    Anon.

    1994-01-01

    Precision testing of the electroweak sector of the Standard Model has intensified with the recent publication* of results from the SLD collaboration's 1993 run on the Stanford Linear Collider, SLC. Using a highly polarized electron beam colliding with an unpolarized positron beam, SLD physicists measured the left-right asymmetry at the Z boson resonance with dramatically improved accuracy over 1992
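
    The quantity behind such a measurement is the left-right asymmetry, estimated from the Z counts recorded with left- and right-handed electron beam polarization and corrected for the average polarization: A_LR = (N_L - N_R) / ((N_L + N_R) * P_e). The sketch below uses invented counts and polarization, not SLD's 1993 values.

      # Illustrative left-right asymmetry estimate (all numbers hypothetical).
      n_left, n_right = 27_000, 25_000   # Z events with left/right beam helicity
      polarization = 0.63                # luminosity-weighted electron polarization

      a_meas = (n_left - n_right) / (n_left + n_right)
      a_lr = a_meas / polarization
      print(f"A_LR = {a_lr:.4f}")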

  1. Spin and precision electroweak physics

    Marciano, W.J. [Brookhaven National Lab., Upton, NY (United States)]

    1994-12-01

    A perspective on fundamental parameters and precision tests of the Standard Model is given. Weak neutral current reactions are discussed with emphasis on those processes involving (polarized) electrons. The role of electroweak radiative corrections in determining the top quark mass and probing for "new physics" is described.

  2. Spin and precision electroweak physics

    Marciano, W.J.

    1993-01-01

    A perspective on fundamental parameters and precision tests of the Standard Model is given. Weak neutral current reactions are discussed with emphasis on those processes involving (polarized) electrons. The role of electroweak radiative corrections in determining the top quark mass and probing for ''new physics'' is described

  3. Precision surveying system for PEP

    Gunn, J.; Lauritzen, T.; Sah, R.; Pellisier, P.F.

    1977-01-01

    A semi-automatic precision surveying system is being developed for PEP. Reference elevations for vertical alignment will be provided by a liquid level. The short range surveying will be accomplished using a Laser Surveying System featuring automatic data acquisition and analysis

  4. Precision medicine at the crossroads.

    Olson, Maynard V

    2017-10-11

    There are bioethical, institutional, economic, legal, and cultural obstacles to creating the robust, precompetitive data resource that will be required to advance the vision of "precision medicine," the ability to use molecular data to target therapies to patients for whom they offer the most benefit at the least risk. Creation of such an "information commons" was the central recommendation of the 2011 report Toward Precision Medicine issued by a committee of the National Research Council of the USA (Committee on a Framework for Development of a New Taxonomy of Disease; National Research Council. Toward precision medicine: building a knowledge network for biomedical research and a new taxonomy of disease. 2011). In this commentary, I review the rationale for creating an information commons and the obstacles to doing so; then, I endorse a path forward based on the dynamic consent of research subjects interacting with researchers through trusted mediators. I assert that the advantages of the proposed system overwhelm those of alternative ways of handling data on the phenotypes, genotypes, and environmental exposures of individual humans; hence, I argue that its creation should be the central policy objective of early efforts to make precision medicine a reality.

  5. Proton gyromagnetic precision measurement system

    Zhu Deming

    1991-01-01

    A computerized control and measurement system used in the proton gyromagnetic precision measurement is described. It adopts CAMAC data acquisition equipment, using on-line control and analysis with the HP85 and PDP-11/60 computer systems. It also adopts the RSX11M operating system, and the control software is written in the FORTRAN language.

  6. Accuracy and precision of oscillometric blood pressure in standing conscious horses

    Olsen, Emil; Pedersen, Tilde Louise Skovgaard; Robinson, Rebecca

    2016-01-01

    from a teaching and research herd. HYPOTHESIS/OBJECTIVE: To evaluate the accuracy and precision of systolic arterial pressure (SAP), diastolic arterial pressure (DAP), and mean arterial pressure (MAP) in conscious horses obtained with an oscillometric NIBP device when compared to invasively measured ... administration. Agreement analysis with replicate measures was utilized to calculate bias (accuracy) and standard deviation (SD) of bias (precision). RESULTS: A total of 252 pairs of invasive arterial BP and NIBP measurements were analyzed. Compared to the direct BP measures, the NIBP MAP had an accuracy of -4 mm Hg and a precision of 10 mm Hg. SAP had an accuracy of -8 mm Hg and a precision of 17 mm Hg, and DAP had an accuracy of -7 mm Hg and a precision of 14 mm Hg. CONCLUSIONS AND CLINICAL RELEVANCE: MAP from the evaluated NIBP monitor is accurate and precise in the adult horse across a range of BP ...
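
    The agreement analysis described above reduces, at its simplest, to the mean of the paired device-minus-reference differences (bias, i.e. accuracy) and their standard deviation (precision). A minimal sketch on simulated values, assuming Python with numpy; the replicate-measure correction used in the study is omitted.

      import numpy as np

      rng = np.random.default_rng(2)
      invasive_map = rng.normal(90, 15, 252)              # mm Hg, simulated reference values
      nibp_map = invasive_map + rng.normal(-4, 10, 252)   # simulated device readings

      diff = nibp_map - invasive_map
      bias = diff.mean()             # accuracy
      precision = diff.std(ddof=1)   # precision (SD of the differences)
      print(f"bias = {bias:.1f} mm Hg, precision = {precision:.1f} mm Hg")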

  7. Behavioral Variability and Somatic Mosaicism: A Cytogenomic Hypothesis.

    Vorsanova, Svetlana G; Zelenova, Maria A; Yurov, Yuri B; Iourov, Ivan Y

    2018-04-01

    Behavioral sciences are inseparably related to genetics. A variety of neurobehavioral phenotypes are suggested to result from genomic variations. However, the contribution of genetic factors to common behavioral disorders (i.e. autism, schizophrenia, intellectual disability) remains to be understood when an attempt to link behavioral variability to a specific genomic change is made. Probably the least appreciated genetic mechanism of debilitating neurobehavioral disorders is somatic mosaicism, or the occurrence of genetically diverse (neuronal) cells in an individual's brain. Somatic mosaicism is assumed to affect the brain directly and to be associated with specific behavioral patterns. As shown in studies of chromosome abnormalities (syndromes), genetic mosaicism is able to change the phenotype dynamically owing to inconsistency of abnormal cell proportions. Here, we hypothesize that brain-specific postzygotic changes in mosaicism levels are able to modulate the variability of behavioral phenotypes. More precisely, behavioral phenotype variability in individuals exhibiting somatic mosaicism might correlate with changes in the amount of genetically abnormal cells throughout the lifespan. If proven, the hypothesis can be used as a basis for therapeutic interventions through regulating levels of somatic mosaicism to increase functioning and to improve the overall condition of individuals with behavioral problems.

  8. Dissimilarities of reduced density matrices and eigenstate thermalization hypothesis

    He, Song; Lin, Feng-Li; Zhang, Jia-ju

    2017-12-01

    We calculate various quantities that characterize the dissimilarity of reduced density matrices for a short interval of length ℓ in a two-dimensional (2D) large central charge conformal field theory (CFT). These quantities include the Rényi entropy, entanglement entropy, relative entropy, Jensen-Shannon divergence, as well as the Schatten 2-norm and 4-norm. We adopt the method of operator product expansion of twist operators, and calculate the short interval expansion of these quantities up to order ℓ^9 for the contributions from the vacuum conformal family. The formal forms of these dissimilarity measures and the derived Fisher information metric from contributions of general operators are also given. As an application of the results, we use these dissimilarity measures to compare the excited and thermal states, and examine the eigenstate thermalization hypothesis (ETH) by showing how they behave in the high temperature limit. This would help to understand how ETH in 2D CFT can be defined more precisely. We discuss the possibility that all the dissimilarity measures considered here vanish when comparing the reduced density matrices of an excited state and a generalized Gibbs ensemble thermal state. We also discuss ETH for a microcanonical ensemble thermal state in a 2D large central charge CFT, and find that it is approximately satisfied for a small subsystem and violated for a large subsystem.
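
    For reference, the standard textbook forms of the dissimilarity measures named above can be written as follows, with S(ρ) = -Tr(ρ ln ρ) the von Neumann (entanglement) entropy; the paper's own normalization conventions may differ.

      % Rényi entropy, relative entropy, Jensen-Shannon divergence, Schatten n-norm
      \begin{align}
        S_n(\rho) &= \frac{1}{1-n}\,\ln \operatorname{Tr}\rho^{\,n}, \\
        S(\rho\|\sigma) &= \operatorname{Tr}(\rho\ln\rho) - \operatorname{Tr}(\rho\ln\sigma), \\
        \mathrm{JS}(\rho,\sigma) &= S\!\left(\tfrac{\rho+\sigma}{2}\right) - \tfrac{1}{2}S(\rho) - \tfrac{1}{2}S(\sigma), \\
        \|\rho-\sigma\|_n &= \left(\operatorname{Tr}\,|\rho-\sigma|^{\,n}\right)^{1/n}.
      \end{align}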

  9. Advanced methods and algorithm for high precision astronomical imaging

    Ngole-Mboula, Fred-Maurice

    2016-01-01

    One of the biggest challenges of modern cosmology is to gain a more precise knowledge of the nature of dark energy and dark matter. Fortunately, dark matter can be traced directly through its gravitational effect on galaxy shapes. The European Space Agency Euclid mission will provide data precisely for this purpose. A critical step in analyzing these data will be to accurately model the instrument Point Spread Function (PSF), which is the focus of this thesis. We developed non-parametric methods to reliably estimate the PSFs across an instrument's field-of-view, based on images of unresolved stars and accounting for noise, undersampling and the PSFs' spatial variability. At the core of these contributions are modern mathematical tools and concepts such as sparsity. An important extension of this work will be to account for the PSFs' wavelength dependency. (author) [fr

  10. Spatial networks

    Barthélemy, Marc

    2011-02-01

    Complex systems are very often organized under the form of networks where nodes and edges are embedded in space. Transportation and mobility networks, Internet, mobile phone networks, power grids, social and contact networks, and neural networks, are all examples where space is relevant and where topology alone does not contain all the information. Characterizing and understanding the structure and the evolution of spatial networks is thus crucial for many different fields, ranging from urbanism to epidemiology. An important consequence of space on networks is that there is a cost associated with the length of edges which in turn has dramatic effects on the topological structure of these networks. We will thoroughly explain the current state of our understanding of how the spatial constraints affect the structure and properties of these networks. We will review the most recent empirical observations and the most important models of spatial networks. We will also discuss various processes which take place on these spatial networks, such as phase transitions, random walks, synchronization, navigation, resilience, and disease spread.

  11. Spatial interpolation

    Stein, A.

    1991-01-01

    The theory and practical application of techniques of statistical interpolation are studied in this thesis, and new developments in multivariate spatial interpolation and the design of sampling plans are discussed. Several applications to studies in soil science are

  12. Spatial analysis of NDVI readings with difference sampling density

    Advanced remote sensing technologies provide researchers with an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  13. Alternatives to the linear risk hypothesis

    Craig, A.G.

    1976-01-01

    A theoretical argument is presented which suggests that in using the linear hypothesis for all values of LET the low dose risk is overestimated for low LET but that it is underestimated for very high LET. The argument is based upon the idea that cell lesions which do not lead to cell death may in fact lead to a malignant cell. Expressions for the Surviving Fraction and the Cancer Risk based on this argument are given. An advantage of this very general approach is that it expresses cell survival and cancer risk entirely in terms of the cell lesions and avoids the rather contentious argument as to how the average number of lesions should be related to the dose. (U.K.)

  14. Artistic talent in dyslexia--a hypothesis.

    Chakravarty, Ambar

    2009-10-01

    The present article hints at a curious neurocognitive phenomenon of the development of artistic talents in some children with dyslexia. The article also takes note of the phenomenon of creativity in the midst of language disability, as observed in the lives of such creative people as Leonardo da Vinci and Albert Einstein, who were most probably affected by developmental learning disorders. It has been hypothesised that a developmental delay in the dominant hemisphere most likely 'disinhibits' the non-dominant parietal lobe to unmask talents, artistic or otherwise, in some such individuals. The present hypothesis follows the phenomenon of paradoxical functional facilitation described earlier. It has been suggested that children with learning disorders be encouraged to develop such hidden talents to full capacity, rather than be subjected to an overemphasis on the correction of the disturbed coded symbol operations in remedial training.

  15. Tissue misrepair hypothesis for radiation carcinogenesis

    Kondo, Sohei

    1991-01-01

    Dose-response curves for chronic leukemia in A-bomb survivors and liver tumors in patients given Thorotrast (colloidal thorium dioxide) show large threshold effects. The existence of these threshold effects can be explained by the following hypothesis. A high dose of radiation causes a persistent wound in a cell-renewable tissue. Disorder of the injured cell society partly frees the component cells from territorial restraints on their proliferation, enabling them to continue development of their cellular functions toward advanced autonomy. This progression might be achieved by continued epigenetic and genetic changes as a result of occasional errors in the otherwise concerted healing action of various endogenous factors recruited for tissue repair. Carcinogenesis is not simply a single-cell problem but a cell-society problem. Therefore, it is not warranted to estimate risk at low doses by linear extrapolation from cancer data at high doses without knowledge of the mechanism of radiation carcinogenesis. (author) 57 refs

  16. Statistical hypothesis tests of some micrometeorological observations

    SethuRaman, S.; Tichler, J.

    1977-01-01

    Chi-square goodness-of-fit is used to test the hypothesis that the medium scale of turbulence in the atmospheric surface layer is normally distributed. Coefficients of skewness and excess are computed from the data. If the data are not normal, these coefficients are used in Edgeworth's asymptotic expansion of the Gram-Charlier series to determine an alternate probability density function. The observed data are then compared with the modified probability densities and the new chi-square values computed. Seventy percent of the data analyzed was either normal or approximately normal. The coefficient of skewness g1 has a good correlation with the chi-square values. Events with |g1| < 0.43 were approximately normal. Intermittency associated with the formation and breaking of internal gravity waves in surface-based inversions over water is thought to be the reason for the non-normality
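
    The testing procedure described above can be sketched as follows: compute the coefficients of skewness (g1) and excess (g2), then compare binned observations with the counts expected under a fitted normal via a chi-square statistic. The sketch assumes Python with scipy and uses synthetic data; the bin choice and the Gram-Charlier refitting step are simplified.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      w = rng.normal(0.0, 0.3, 1000)                  # synthetic turbulence samples

      g1 = stats.skew(w)                              # coefficient of skewness
      g2 = stats.kurtosis(w)                          # coefficient of excess (0 for a normal)

      edges = np.quantile(w, np.linspace(0, 1, 11))   # ten roughly equiprobable bins
      obs, _ = np.histogram(w, bins=edges)
      cdf = stats.norm.cdf(edges, loc=w.mean(), scale=w.std(ddof=1))
      exp = len(w) * np.diff(cdf)
      chi2 = ((obs - exp) ** 2 / exp).sum()
      dof = len(obs) - 1 - 2                          # bins - 1 - fitted parameters
      print(f"g1={g1:.2f}, g2={g2:.2f}, chi2={chi2:.1f}, p={1 - stats.chi2.cdf(chi2, dof):.3f}")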

  17. The hexagon hypothesis: Six disruptive scenarios.

    Burtles, Jim

    2015-01-01

    This paper aims to bring a simple but effective and comprehensive approach to the development, delivery and monitoring of business continuity solutions. To ensure that the arguments and principles apply across the board, the paper sticks to basic underlying concepts rather than sophisticated interpretations. First, the paper explores what exactly people are defending themselves against. Secondly, the paper looks at how defences should be set up. Disruptive events tend to unfold in phases, each of which invites a particular style of protection, ranging from risk management through to business continuity to insurance cover. Their impact upon any business operation will fall into one of six basic scenarios. The hexagon hypothesis suggests that everyone should be prepared to deal with each of these six disruptive scenarios and it provides them with a useful benchmark for business continuity.

  18. Novae, supernovae, and the island universe hypothesis

    Van Den Bergh, S.

    1988-01-01

    Arguments in Curtis's (1917) paper related to the island universe hypothesis and the existence of novae in spiral nebulae are considered. It is noted that the maximum magnitude versus rate-of-decline relation for novae may be the best tool presently available for the calibration of the extragalactic distance scale. Light curve observations of six novae are used to determine a distance of 18.6 ± 3.5 Mpc to the Virgo cluster. Results suggest that Type Ia supernovae cannot easily be used as standard candles, and that Type II supernovae are unsuitable as distance indicators. Factors other than precursor mass are probably responsible for determining the ultimate fate of evolving stars. 83 references

  19. Extra dimensions hypothesis in high energy physics

    Volobuev Igor

    2017-01-01

    Full Text Available We discuss the history of the extra dimensions hypothesis and the physics and phenomenology of models with large extra dimensions, with an emphasis on the Randall-Sundrum (RS) model with two branes. We argue that the Standard Model extension based on the RS model with two branes is phenomenologically acceptable only if the inter-brane distance is stabilized. Within such an extension of the Standard Model, we study the influence of the infinite Kaluza-Klein (KK) towers of the bulk fields on collider processes. In particular, we discuss the modification of the scalar sector of the theory, the Higgs-radion mixing due to the coupling of the Higgs boson to the radion and its KK tower, and the experimental restrictions on the mass of the radion-dominated states.

  20. Multiple model cardinalized probability hypothesis density filter

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  1. On the immunostimulatory hypothesis of cancer

    Juan Bruzzo

    2011-12-01

    Full Text Available There is a rather generalized belief that the worst possible outcome for the application of immunological therapies against cancer is a null effect on tumor growth. However, a significant body of evidence summarized in the immunostimulatory hypothesis of cancer suggests that, under certain circumstances, the growth of incipient and established tumors can be accelerated rather than inhibited by the immune response supposedly mounted to limit tumor growth. In order to provide more compelling evidence of this proposition, we have explored the growth behavior characteristics of twelve murine tumors - most of them of spontaneous origin - that arose in the colony of our laboratory, in putatively immunized and control mice. Using classical immunization procedures, 8 out of 12 tumors were actually stimulated in "immunized" mice while the remaining 4 were neither inhibited nor stimulated. Further, even these apparently non-antigenic tumors could reveal some antigenicity if more stringent than classical immunization procedures were used. This possibility was suggested by the results obtained with one of these four apparently non-antigenic tumors: the LB lymphoma. In effect, upon these stringent immunization pretreatments, LB was slightly inhibited or stimulated, depending on the titer of the immune reaction mounted against the tumor, with higher titers rendering inhibition and lower titers rendering tumor stimulation. All the above results are consistent with the immunostimulatory hypothesis, which entails the important therapeutic implications - contrary to the orthodoxy - that anti-tumor vaccines may run a real risk of doing harm if the vaccine-induced immunity is too weak to move the reaction into the inhibitory part of the immune response curve, and that a slight and prolonged immunodepression - rather than an immunostimulation - might interfere with the progression of some tumors and thus be an aid to cytotoxic therapies.

  2. The Stress Acceleration Hypothesis of Nightmares

    Tore Nielsen

    2017-06-01

    Full Text Available Adverse childhood experiences can deleteriously affect future physical and mental health, increasing risk for many illnesses, including psychiatric problems, sleep disorders, and, according to the present hypothesis, idiopathic nightmares. Much like post-traumatic nightmares, which are triggered by trauma and lead to recurrent emotional dreaming about the trauma, idiopathic nightmares are hypothesized to originate in early adverse experiences that lead in later life to the expression of early memories and emotions in dream content. Accordingly, the objectives of this paper are to (1) review existing literature on sleep, dreaming and nightmares in relation to early adverse experiences, drawing upon both empirical studies of dreaming and nightmares and books and chapters by recognized nightmare experts and (2) propose a new approach to explaining nightmares that is based upon the Stress Acceleration Hypothesis of mental illness. The latter stipulates that susceptibility to mental illness is increased by adversity occurring during a developmentally sensitive window for emotional maturation—the infantile amnesia period—that ends around age 3½. Early adversity accelerates the neural and behavioral maturation of emotional systems governing the expression, learning, and extinction of fear memories and may afford short-term adaptive value. But it also engenders long-term dysfunctional consequences including an increased risk for nightmares. Two mechanisms are proposed: (1) disruption of infantile amnesia allows normally forgotten early childhood memories to influence later emotions, cognitions and behavior, including the common expression of threats in nightmares; (2) alterations of normal emotion regulation processes of both waking and sleep lead to increased fear sensitivity and less effective fear extinction. These changes influence an affect network previously hypothesized to regulate fear extinction during REM sleep, disruption of which leads to

  3. High precision straw tube chamber with cathode readout

    Bychkov, V.N.; Golutvin, I.A.; Ershov, Yu.V.

    1992-01-01

    The high precision straw chamber with cathode readout was constructed and investigated. The 10 mm straws were made of aluminized Mylar strip with a transparent longitudinal window. The X-coordinate information has been taken from the cathode strips as induced charges and investigated via the centroid method. A spatial resolution of σ=120 μm has been obtained with a signal-to-noise ratio of about 60. Possible ways of improving the signal-to-noise ratio are described. 7 refs.; 8 figs

  4. A high precision straw tube chamber with cathode readout

    Bychkov, V.N.; Golutvin, I.A.; Ershov, Yu.V.; Zubarev, E.V.; Ivanov, A.B.; Lysiakov, V.N.; Makhankov, A.V.; Movchan, S.A.; Peshekhonov, V.D.; Preda, T.

    1993-01-01

    The high precision straw chamber with cathode readout was constructed and investigated. The 10 mm diameter straws were made of aluminized Mylar with transparent longitudinal window. The X-coordinate information has been taken from cathode strips as induced charges and investigated with the centroid method. The spatial resolution σ x =103 μm was obtained at a signal-to-noise ratio of about 70. The possible ways to improve the signal-to-noise ratio are discussed. (orig.)
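
    The centroid method mentioned in both records amounts to taking the charge-weighted mean position of the induced signals on adjacent cathode strips. A minimal sketch with invented strip pitch and charges, assuming Python with numpy; it is not the actual readout code.

      import numpy as np

      pitch_mm = 2.0                                        # hypothetical strip pitch
      charges = np.array([0.05, 0.40, 1.00, 0.45, 0.06])    # induced charges on 5 adjacent strips
      positions = pitch_mm * np.arange(len(charges))

      x_centroid = np.sum(charges * positions) / np.sum(charges)
      print(f"reconstructed coordinate: {x_centroid:.3f} mm")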

  5. Precisely Tailored DNA Nanostructures and their Theranostic Applications.

    Zhu, Bing; Wang, Lihua; Li, Jiang; Fan, Chunhai

    2017-12-01

    A critical challenge in nanotechnology is the limited precision and controllability of structural parameters, which brings about concerns in uniformity, reproducibility and performance. Self-assembled DNA nanostructures, as a newly emerged type of nano-biomaterials, possess low-nanometer precision, excellent programmability and addressability. They can precisely arrange various molecules and materials to form spatially ordered complexes, resulting in unambiguous physical or chemical properties. Because of these properties, DNA nanostructures have shown great promise in numerous biomedical theranostic applications. In this account, we briefly review the history of and advances in the construction of DNA nanoarchitectures and superstructures with accurate structural parameters. We focus on recent progress in exploiting these DNA nanostructures as platforms for quantitative biosensing, intracellular diagnosis, imaging, and smart drug delivery. We also discuss key challenges in practical applications. © 2017 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Spatial synchrony in cisco recruitment

    Myers, Jared T.; Yule, Daniel L.; Jones, Michael L.; Ahrenstorff, Tyler D.; Hrabik, Thomas R.; Claramunt, Randall M.; Ebener, Mark P.; Berglund, Eric K.

    2015-01-01

    We examined the spatial scale of recruitment variability for disparate cisco (Coregonus artedi) populations in the Great Lakes (n = 8) and Minnesota inland lakes (n = 4). We found that the scale of synchrony was approximately 400 km when all available data were utilized; much greater than the 50-km scale suggested for freshwater fish populations in an earlier global analysis. The presence of recruitment synchrony between Great Lakes and inland lake cisco populations supports the hypothesis that synchronicity is driven by climate and not dispersal. We also found synchrony in larval densities among three Lake Superior populations separated by 25–275 km, which further supports the hypothesis that broad-scale climatic factors are the cause of spatial synchrony. Among several candidate climate variables measured during the period of larval cisco emergence, maximum wind speeds exhibited the most similar spatial scale of synchrony to that observed for cisco. Other factors, such as average water temperatures, exhibited synchrony on broader spatial scales, which suggests they could also be contributing to recruitment synchrony. Our results provide evidence that abiotic factors can induce synchronous patterns of recruitment for populations of cisco inhabiting waters across a broad geographic range, and show that broad-scale synchrony of recruitment can occur in freshwater fish populations as well as those from marine systems.
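
    The scale-of-synchrony analysis described above can be approximated by correlating the recruitment time series for every pair of populations and relating those correlations to the distance between the populations; the distance at which the mean correlation decays toward zero estimates the spatial scale of synchrony. The sketch below uses simulated series and coordinates, assuming Python with numpy, and is not the study's analysis.

      import numpy as np

      rng = np.random.default_rng(4)
      n_pop, n_years = 12, 30
      shared = rng.normal(size=n_years)                        # common climate-like signal
      series = shared + rng.normal(scale=1.5, size=(n_pop, n_years))
      coords = rng.uniform(0, 800, size=(n_pop, 2))            # population locations, km

      pairs = []
      for i in range(n_pop):
          for j in range(i + 1, n_pop):
              r = np.corrcoef(series[i], series[j])[0, 1]      # pairwise recruitment correlation
              d = np.linalg.norm(coords[i] - coords[j])        # pairwise distance
              pairs.append((d, r))
      pairs = np.array(pairs)
      print(pairs[:5])    # (distance, correlation) pairs to be binned or modelled against distance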

  7. Precision and accuracy in radiotherapy

    Brenner, J.D.

    1989-01-01

    The precision required in the presence of random errors in the delivery of a fractionated dose regime is considered. It is argued that suggestions that 1-3% precision is needed may be unnecessarily conservative. It is further suggested that random and systematic errors should not be combined with equal weight to yield an overall target uncertainty in dose delivery, systematic errors being of greater significance. The authors conclude that imprecise dose delivery and inaccurate dose delivery affect patient-cure results differently. Whereas, for example, a 10% inaccuracy in dose delivery would be quite catastrophic in the case considered here, a corresponding imprecision would have a much smaller effect on overall success rates. (author). 14 refs.; 2 figs.

  8. Precision electroweak physics at LEP

    Mannelli, M.

    1994-12-01

    Copious event statistics, a precise understanding of the LEP energy scale, and a favorable experimental situation at the Z⁰ resonance have allowed the LEP experiments to provide both dramatic confirmation of the Standard Model of strong and electroweak interactions and to place substantially improved constraints on the parameters of the model. The author concentrates on those measurements relevant to the electroweak sector. It will be seen that the precision of these measurements probes sensitively the structure of the Standard Model at the one-loop level, where the calculation of the observables measured at LEP is affected by the value chosen for the top quark mass. One finds that the LEP measurements are consistent with the Standard Model, but only if the mass of the top quark is measured to be within a restricted range of about 20 GeV.

  9. Environment-assisted precision measurement

    Goldstein, G.; Cappellaro, P.; Maze, J. R.

    2011-01-01

    We describe a method to enhance the sensitivity of precision measurements that takes advantage of the environment of a quantum sensor to amplify the response of the sensor to weak external perturbations. An individual qubit is used to sense the dynamics of surrounding ancillary qubits, which are in turn affected by the external field to be measured. The resulting sensitivity enhancement is determined by the number of ancillas that are coupled strongly to the sensor qubit; it does not depend on the exact values of the coupling strengths and is resilient to many forms of decoherence. The method achieves nearly Heisenberg-limited precision measurement, using a novel class of entangled states. We discuss specific applications to improve clock sensitivity using trapped ions and magnetic sensing based on electronic spins in diamond.

  10. Precise object tracking under deformation

    Saad, M.H

    2010-01-01

    Precise object tracking is an essential issue in several serious applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems, satellite imaging, etc. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. This research is a trial to solve these serious problems in visual object tracking, by which the quality of the overall system will be improved. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by OLS. The framework presents a robust ranging technique to track a visual target instead of the traditional expensive ranging sensors. The presented research work is applied to real video streams and achieves high precision results.

  11. Fit to Electroweak Precision Data

    Erler, Jens

    2006-01-01

    A brief review of electroweak precision data from LEP, SLC, the Tevatron, and low energies is presented. The global fit to all data including the most recent results on the masses of the top quark and the W boson reinforces the preference for a relatively light Higgs boson. I will also give an outlook on future developments at the Tevatron Run II, CEBAF, the LHC, and the ILC

  12. Precise Object Tracking under Deformation

    Saad, M.H.

    2010-01-01

    Precise object tracking is an essential issue in several serious applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems, satellite imaging, etc. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. This research is a trial to solve these serious problems in visual object tracking, by which the quality of the overall system will be improved. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by OLS. The framework presents a robust ranging technique to track a visual target instead of the traditional expensive ranging sensors. The presented research work is applied to real video streams and achieves high precision results.

  13. Precision measurements of electroweak parameters

    Savin, Alexander

    2017-01-01

    A set of selected precise measurements of SM parameters from the LHC experiments is discussed. Results on the W-mass measurement and the forward-backward asymmetry in the production of Drell-Yan events in both the dielectron and dimuon decay channels are presented, together with results on measurements of the effective mixing angle. Electroweak production of the vector bosons in association with two jets is discussed.

  14. Precision titration mini-calorimeter

    Ensor, D.; Kullberg, L.; Choppin, G.

    1977-01-01

    The design and testing of a small-volume calorimeter of high precision and simple construction are described. The calorimeter operates with solution sample volumes in the range of 3 to 5 ml. The results of experiments on the entropy changes for two standard reactions, (1) the reaction of tris(hydroxymethyl)aminomethane with hydrochloric acid and (2) the reaction between mercury(II) and bromide ions, are reported to confirm the accuracy and overall performance of the calorimeter

  15. Knowledge of Precision Farming Beneficiaries

    A.V. Greena

    2016-05-01

    Full Text Available Precision Farming is one of the many advanced farming practices that make production more efficient through better resource management and reduced wastage. TN-IAMWARM is a World Bank funded project that aims to improve farm productivity and income through better water management. The present study was carried out in the Kambainallur sub-basin of Dharmapuri district with 120 TN-IAMWARM beneficiaries as respondents. The results indicated that more than three fourths (76.67%) of the respondents had a high level of knowledge of precision farming technologies, which was made possible by the implementation of the TN-IAMWARM project. The study further revealed that educational status, occupational status and exposure to agricultural messages made a positive and significant contribution to the knowledge level of the respondents at the 0.01 level of probability, whereas experience in precision farming and social participation made a positive and significant contribution at the 0.05 level of probability.

  16. Measurement of turbulent spatial structure and kinetic energy spectrum by exact temporal-to-spatial mapping

    Buchhave, Preben; Velte, Clara Marika

    2017-01-01

    distortions caused by Taylor’s hypothesis. The method is first confirmed to produce the correct statistics using computer simulations and later applied to measurements in some of the most difficult regions of a round turbulent jet—the non-equilibrium developing region and the outermost parts of the developed......We present a method for converting a time record of turbulent velocity measured at a point in a flow to a spatial velocity record consisting of consecutive convection elements. The spatial record allows computation of dynamic statistical moments such as turbulent kinetic wavenumber spectra...... and spatial structure functions in a way that completely bypasses the need for Taylor’s hypothesis. The spatial statistics agree with the classical counterparts, such as the total kinetic energy spectrum, at least for spatial extents up to the Taylor microscale. The requirements for applying the method...

  17. Spatial distribution

    Borregaard, Michael Krabbe; Hendrichsen, Ditte Katrine; Nachman, Gøsta Støger

    2008-01-01

    Living organisms are distributed over the entire surface of the planet. The distribution of the individuals of each species is not random; on the contrary, they are strongly dependent on the biology and ecology of the species, and vary over different spatial scales. The structure of whole populations reflects the location and fragmentation pattern of the habitat types preferred by the species, and the complex dynamics of migration, colonization, and population growth taking place over the landscape. Within these, individuals are distributed among each other in regular or clumped patterns, depending on the nature of intraspecific interactions between them: while the individuals of some species repel each other and partition the available area, others form groups of varying size, determined by the fitness of each group member. The spatial distribution pattern of individuals again strongly ...

  18. Spatial Culture

    Reeh, Henrik

    2012-01-01

    Spatial Culture – A Humanities Perspective. Abstract of introductory essay by Henrik Reeh. Secured by alliances between socio-political development and cultural practices, a new field of humanistic studies in spatial culture has developed since the 1990s. To focus on links between urban culture and modern society is, however, an intellectual practice which has a much longer history. Already in the 1980s, the debate on the modern and the postmodern cited Paris and Los Angeles as spatio-cultural illustrations of these major philosophical concepts. Earlier, in the history of critical studies, the work ... Foucault considered a constitutive feature of 20th-century thinking and one that continues to occupy intellectual and cultural debates in the third millennium. A conceptual framework is, nevertheless, necessary, if the humanities are to adequately address city and space – themes that have long been...

  19. Precision study of the β-decay of 74Rb

    Van Duppen, P L E; Lunney, D

    2002-01-01

    We are proposing a high-resolution study of the β-decay of 74Rb in order to extrapolate our precision knowledge of the superallowed β-decays from the sd and fp shells towards the medium-heavy Z=N nuclei. The primary goal is to provide new data for testing the CVC hypothesis and the unitarity condition of the CKM matrix of the Standard Model. The presented programme would involve careful measurements of the decay properties of 74Rb, including the branching ratios to the excited states, as well as the precise determination of the decay energy of 74Rb. The experimental methods readily available at ISOLDE include high-transmission conversion electron spectroscopy and γ-ray spectroscopy, as well as measurements of the masses of 74Rb and 74Kr using two complementary techniques, ISOLTRAP and MISTRAL. The experiment would rely on a high-quality 74Rb beam available at ISOLDE with adequate intensity.

  20. The Matter-Gravity Entanglement Hypothesis

    Kay, Bernard S.

    2018-03-01

    I outline some of my work and results (some dating back to 1998, some more recent) on my matter-gravity entanglement hypothesis, according to which the entropy of a closed quantum gravitational system is equal to the system's matter-gravity entanglement entropy. The main arguments presented are: (1) that this hypothesis is capable of resolving what I call the second-law puzzle, i.e. the puzzle as to how the entropy increase of a closed system can be reconciled with the assumption of unitary time-evolution; (2) that the black hole information loss puzzle may be regarded as a special case of this second law puzzle and that therefore the same resolution applies to it; (3) that the black hole thermal atmosphere puzzle (which I recall) can be resolved by adopting a radically different-from-usual description of quantum black hole equilibrium states, according to which they are total pure states, entangled between matter and gravity in such a way that the partial states of matter and gravity are each approximately thermal equilibrium states (at the Hawking temperature); (4) that the Susskind-Horowitz-Polchinski string-theoretic understanding of black hole entropy as the logarithm of the degeneracy of a long string (which is the weak string coupling limit of a black hole) cannot be quite correct but should be replaced by a modified understanding according to which it is the entanglement entropy between a long string and its stringy atmosphere, when in a total pure equilibrium state in a suitable box, which (in line with (3)) goes over, at strong-coupling, to a black hole in equilibrium with its thermal atmosphere. The modified understanding in (4) is based on a general result, which I also describe, which concerns the likely state of a quantum system when it is weakly coupled to an energy-bath and the total state is a random pure state with a given energy. This result generalizes Goldstein et al.'s `canonical typicality' result to systems which are not necessarily small.

  1. The Matter-Gravity Entanglement Hypothesis

    Kay, Bernard S.

    2018-05-01

    I outline some of my work and results (some dating back to 1998, some more recent) on my matter-gravity entanglement hypothesis, according to which the entropy of a closed quantum gravitational system is equal to the system's matter-gravity entanglement entropy. The main arguments presented are: (1) that this hypothesis is capable of resolving what I call the second-law puzzle, i.e. the puzzle as to how the entropy increase of a closed system can be reconciled with the assumption of unitary time-evolution; (2) that the black hole information loss puzzle may be regarded as a special case of this second law puzzle and that therefore the same resolution applies to it; (3) that the black hole thermal atmosphere puzzle (which I recall) can be resolved by adopting a radically different-from-usual description of quantum black hole equilibrium states, according to which they are total pure states, entangled between matter and gravity in such a way that the partial states of matter and gravity are each approximately thermal equilibrium states (at the Hawking temperature); (4) that the Susskind-Horowitz-Polchinski string-theoretic understanding of black hole entropy as the logarithm of the degeneracy of a long string (which is the weak string coupling limit of a black hole) cannot be quite correct but should be replaced by a modified understanding according to which it is the entanglement entropy between a long string and its stringy atmosphere, when in a total pure equilibrium state in a suitable box, which (in line with (3)) goes over, at strong-coupling, to a black hole in equilibrium with its thermal atmosphere. The modified understanding in (4) is based on a general result, which I also describe, which concerns the likely state of a quantum system when it is weakly coupled to an energy-bath and the total state is a random pure state with a given energy. This result generalizes Goldstein et al.'s `canonical typicality' result to systems which are not necessarily small.

  2. Modelling firm heterogeneity with spatial 'trends'

    Sarmiento, C. [North Dakota State University, Fargo, ND (United States). Dept. of Agricultural Business & Applied Economics]

    2004-04-15

    The hypothesis underlying this article is that firm heterogeneity can be captured by spatial characteristics of the firm (similar to the inclusion of a time trend in time series models). The hypothesis is examined in the context of modelling electric generation by coal powered plants in the presence of firm heterogeneity.

  3. Anosognosia as motivated unawareness: the 'defence' hypothesis revisited.

    Turnbull, Oliver H; Fotopoulou, Aikaterini; Solms, Mark

    2014-12-01

    Anosognosia for hemiplegia has seen a century of almost continuous research, yet a definitive understanding of its mechanism remains elusive. Essentially, anosognosic patients hold quasi-delusional beliefs about their paralysed limbs, in spite of all the contrary evidence, repeated questioning, and logical argument. We review a range of findings suggesting that emotion and motivation play an important role in anosognosia. We conclude that anosognosia involves (amongst other things) a process of psychological defence. This conclusion stems from a wide variety of clinical and experimental investigations, including data on implicit awareness of deficit, fluctuations in awareness over time, and dramatic effects upon awareness of psychological interventions such as psychotherapy, reframing of the emotional consequences of the paralysis, and first versus third person perspectival manipulations. In addition, we review and refute the (eight) arguments historically raised against the 'defence' hypothesis, including the claim that a defence-based account cannot explain the lateralised nature of the disorder. We argue that damage to a well-established right-lateralised emotion regulation system, with links to psychological processes that appear to underpin allocentric spatial cognition, plays a key role in anosognosia (at least in some patients). We conclude with a discussion of implications for clinical practice. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Hypothesis test for synchronization: twin surrogates revisited.

    Romano, M Carmen; Thiel, Marco; Kurths, Jürgen; Mergenthaler, Konstantin; Engbert, Ralf

    2009-03-01

    The method of twin surrogates has been introduced to test for phase synchronization of complex systems in the case of passive experiments. In this paper we derive new analytical expressions for the number of twins depending on the size of the neighborhood, as well as on the length of the trajectory. This allows us to determine the optimal parameters for the generation of twin surrogates. Furthermore, we determine the quality of the twin surrogates with respect to several linear and nonlinear statistics depending on the parameters of the method. In the second part of the paper we perform a hypothesis test for phase synchronization in the case of experimental data from fixational eye movements. These miniature eye movements have been shown to play a central role in neural information processing underlying the perception of static visual scenes. The high number of data sets (21 subjects and 30 trials per person) allows us to compare the generated twin surrogates with the "natural" surrogates that correspond to the different trials. We show that the generated twin surrogates reproduce very well all linear and nonlinear characteristics of the underlying experimental system. The synchronization analysis of fixational eye movements by means of twin surrogates reveals that the synchronization between the left and right eye is significant, indicating that either the centers in the brain stem generating fixational eye movements are closely linked, or, alternatively that there is only one center controlling both eyes.
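
    Once a synchronization index has been computed for the measured data and for each surrogate, the significance step is a simple rank comparison. The sketch below shows only that final step, assuming Python with numpy; the construction of the twin surrogates themselves (via recurrence-based "twins") is not shown, and the index values are placeholders.

      import numpy as np

      index_data = 0.62                                # phase-synchronization index of the data (hypothetical)
      index_surrogates = np.array([0.31, 0.35, 0.28, 0.40, 0.33,
                                   0.29, 0.37, 0.34, 0.30, 0.36])  # surrogate indices (hypothetical)

      # One-sided rank test: p is the fraction of values at least as large as the
      # data index, counting the data itself among the ranked values.
      p = (np.sum(index_surrogates >= index_data) + 1) / (len(index_surrogates) + 1)
      print(f"p = {p:.3f}")   # small p -> reject the null of no phase synchronization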

  5. Marginal contrasts and the Contrastivist Hypothesis

    Daniel Currie Hall

    2016-12-01

    Full Text Available The Contrastivist Hypothesis (CH; Hall 2007; Dresher 2009) holds that the only features that can be phonologically active in any language are those that serve to distinguish phonemes, which presupposes that phonemic status is categorical. Many researchers, however, demonstrate the existence of gradient relations. For instance, Hall (2009) quantifies these using the information-theoretic measure of entropy (unpredictability of distribution) and shows that a pair of sounds may have an entropy between 0 (totally predictable) and 1 (totally unpredictable). We argue that the existence of such intermediate degrees of contrastiveness does not make the CH untenable, but rather offers insight into contrastive hierarchies. The existence of a continuum does not preclude categorical distinctions: a categorical line can be drawn between zero entropy (entirely predictable, and thus by the CH phonologically inactive) and non-zero entropy (at least partially contrastive, and thus potentially phonologically active). But this does not mean that intermediate degrees of surface contrastiveness are entirely irrelevant to the CH; rather, we argue, they can shed light on how deeply ingrained a phonemic distinction is in the phonological system. As an example, we provide a case study from Pulaar [ATR] harmony, which has previously been claimed to be problematic for the CH.
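
    One way to operationalize the entropy measure mentioned above is to compute, for a pair of sounds, the entropy of the choice between them in each environment and then average over environments weighted by frequency. The sketch below uses invented counts, assuming Python with numpy; it follows the spirit of the measure rather than Hall's exact procedure.

      import numpy as np

      # Counts of sound A vs. sound B in three environments (invented).
      counts = np.array([[120, 0],     # env 1: fully predictable -> 0 bits
                         [40, 40],     # env 2: fully unpredictable -> 1 bit
                         [70, 10]])    # env 3: partially predictable

      def entropy(p):
          p = p[p > 0]
          return -(p * np.log2(p)).sum()

      env_weights = counts.sum(axis=1) / counts.sum()
      env_entropies = np.array([entropy(row / row.sum()) for row in counts])
      pair_entropy = (env_weights * env_entropies).sum()   # 0 = predictable, 1 = fully contrastive
      print(f"entropy of the A/B pair: {pair_entropy:.2f} bits")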

  6. Confabulation: Developing the 'emotion dysregulation' hypothesis.

    Turnbull, Oliver H; Salas, Christian E

    2017-02-01

    Confabulations offer unique opportunities for establishing the neurobiological basis of delusional thinking. As regards causal factors, a review of the confabulation literature suggests that neither amnesia nor executive impairment can be the sole (or perhaps even the primary) cause of all delusional beliefs - though they may act in concert with other factors. A key perspective in the modern literature is that many delusions have an emotionally positive or 'wishful' element, that may serve to modulate or manage emotional experience. Some authors have referred to this perspective as the 'emotion dysregulation' hypothesis. In this article we review the theoretical underpinnings of this approach, and develop the idea by suggesting that the positive aspects of confabulatory states may have a role in perpetuating the imbalance between cognitive control and emotion. We draw on existing evidence from fields outside neuropsychology, to argue for three main causal factors: that positive emotions are related to more global or schematic forms of cognitive processing; that positive emotions influence the accuracy of memory recollection; and that positive emotions make people more susceptible to false memories. These findings suggest that the emotions that we want to feel (or do not want to feel) can influence the way we reconstruct past experiences and generate a sense of self - a proposition that bears on a unified theory of delusional belief states. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  7. Evolutionary hypothesis for Chiari type I malformation.

    Fernandes, Yvens Barbosa; Ramina, Ricardo; Campos-Herrera, Cynthia Resende; Borges, Guilherme

    2013-10-01

    Chiari I malformation (CM-I) is classically defined as a cerebellar tonsillar herniation (≥5 mm) through the foramen magnum. A decreased posterior fossa volume, mainly due to basioccipital hypoplasia and sometimes platybasia, leads to posterior fossa overcrowding and consequently cerebellar herniation. Regardless of radiological findings, embryological genetic hypotheses or any other postulations, the real cause behind this malformation is not yet well elucidated and remains largely unknown. The aim of this paper is to approach CM-I from a broader and new perspective, conjoining anthropology, genetics and neurosurgery, with special focus on the substantial changes that have occurred in the posterior cranial base through human evolution. Important evolutionary allometric changes occurred during brain expansion, and genetic studies of human evolution have demonstrated an unexpectedly high rate of gene flow interchange and possibly interbreeding during this process. Based upon this review we hypothesize that CM-I may be the result of an evolutionary anthropological imprint, caused by evolving species populations that eventually met each other and mingled in the last 1.7 million years. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Environmental Kuznets Curve Hypothesis. A Survey

    Dinda, Soumyananda

    2004-01-01

    The Environmental Kuznets Curve (EKC) hypothesis postulates an inverted-U-shaped relationship between different pollutants and per capita income, i.e., environmental pressure increases up to a certain level as income goes up; after that, it decreases. An EKC actually reveals how a technically specified measurement of environmental quality changes as the fortunes of a country change. A sizeable literature on the EKC has grown up in recent years. The common point of all the studies is the assertion that environmental quality deteriorates at the early stages of economic development/growth and subsequently improves at the later stages. In other words, environmental pressure increases faster than income at early stages of development and slows down relative to GDP growth at higher income levels. This paper reviews some theoretical developments and empirical studies dealing with the EKC phenomenon. Possible explanations for the EKC are seen in (1) the progress of economic development, from a clean agrarian economy to a polluting industrial economy to a clean service economy; and (2) the tendency of people with higher incomes to have a stronger preference for environmental quality. Evidence for the existence of the EKC has been questioned from several corners. Only some air quality indicators, especially local pollutants, show evidence of an EKC. Moreover, even where an EKC is empirically observed, there is still no agreement in the literature on the income level at which environmental degradation starts declining. This paper provides an overview of the EKC literature, background history, conceptual insights, policy, and the conceptual and methodological critique.
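    The inverted-U relationship at the core of the EKC is usually estimated with a reduced-form regression of environmental pressure on income and income squared, with the turning point at -b1/(2*b2). The sketch below is a minimal illustration of that specification on synthetic data; the coefficients, noise level, and variable names are assumptions, not values from the survey.

```python
# Illustrative sketch of the reduced-form EKC specification:
#   E = b0 + b1*y + b2*y^2 + error,  with turning point at y* = -b1 / (2*b2).
# Synthetic data only; coefficients and noise level are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
income = rng.uniform(1, 40, 500)                      # per capita income (arbitrary units)
emissions = 2 + 1.5 * income - 0.03 * income**2 + rng.normal(0, 1.5, 500)

# Fit E on [1, y, y^2] by ordinary least squares.
X = np.column_stack([np.ones_like(income), income, income**2])
b0, b1, b2 = np.linalg.lstsq(X, emissions, rcond=None)[0]

turning_point = -b1 / (2 * b2)                        # income at which pressure starts to decline
print(f"estimated turning point: {turning_point:.1f} (true value: 25.0)")
```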

  9. DAMPs, ageing, and cancer: The 'DAMP Hypothesis'.

    Huang, Jin; Xie, Yangchun; Sun, Xiaofang; Zeh, Herbert J; Kang, Rui; Lotze, Michael T; Tang, Daolin

    2015-11-01

    Ageing is a complex and multifactorial process characterized by the accumulation of many forms of damage at the molecular, cellular, and tissue level with advancing age. Ageing increases the risk of the onset of chronic inflammation-associated diseases such as cancer, diabetes, stroke, and neurodegenerative disease. In particular, ageing and cancer share some common origins and hallmarks such as genomic instability, epigenetic alteration, aberrant telomeres, inflammation and immune injury, reprogrammed metabolism, and degradation system impairment (including within the ubiquitin-proteasome system and the autophagic machinery). Recent advances indicate that damage-associated molecular pattern molecules (DAMPs) such as high mobility group box 1, histones, S100, and heat shock proteins play location-dependent roles inside and outside the cell. These provide interaction platforms at molecular levels linked to common hallmarks of ageing and cancer. They can act as inducers, sensors, and mediators of stress through individual plasma membrane receptors, intracellular recognition receptors (e.g., advanced glycosylation end product-specific receptors, AIM2-like receptors, RIG-I-like receptors, NOD1-like receptors, and toll-like receptors), or following endocytic uptake. Thus, the DAMP Hypothesis is novel and complements other theories that explain the features of ageing. DAMPs represent ideal biomarkers of ageing and provide an attractive target for interventions in ageing and age-associated diseases. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Identity of Particles and Continuum Hypothesis

    Berezin, Alexander A.

    2001-04-01

    Why are all electrons the same? Unlike other objects, particles and atoms (of the same isotope) are forbidden to have individuality or a personal history (or to reveal their hidden variables, even if they do have them). Or at least, what we commonly call physics has so far been unable to disprove particles' sameness (Berezin and Nakhmanson, Physics Essays, 1990). Consider two opposing hypotheses: (A) particles are indeed absolutely the same, or (B) they do have individuality, but it is beyond our capacity to demonstrate. This dilemma sounds akin to the undecidability of the Continuum Hypothesis on the existence (or not) of intermediate cardinalities between the integers and the reals (P. Cohen): both its affirmation and its negation are consistent. Thus, the (alleged) sameness of electrons and atoms may be a physical translation (embodiment) of this fundamental Goedelian undecidability. Experiments are unlikely to help: even if we find that all electrons are the same within 30 decimal digits, could their masses (or charges) still differ in the 100th digit? Within (B), personalized, informationally rich (infinitely rich?) digital tails (starting at, say, the 100th decimal) may carry an individual record of each particle's history. Within (A), the parameters (m, q) are indeed exactly the same in all digits, and their sameness is based on some inherent (meta)physical principle akin to Platonism or Eddington-type numerology.

  11. Environmental Kuznets Curve Hypothesis. A Survey

    Dinda, Soumyananda [Economic Research Unit, Indian Statistical Institute, 203, B.T. Road, Kolkata-108 (India)

    2004-08-01

    The Environmental Kuznets Curve (EKC) hypothesis postulates an inverted-U-shaped relationship between different pollutants and per capita income, i.e., environmental pressure increases up to a certain level as income goes up; after that, it decreases. An EKC actually reveals how a technically specified measurement of environmental quality changes as the fortunes of a country change. A sizeable literature on the EKC has grown up in recent years. The common point of all the studies is the assertion that environmental quality deteriorates at the early stages of economic development/growth and subsequently improves at the later stages. In other words, environmental pressure increases faster than income at early stages of development and slows down relative to GDP growth at higher income levels. This paper reviews some theoretical developments and empirical studies dealing with the EKC phenomenon. Possible explanations for the EKC are seen in (1) the progress of economic development, from a clean agrarian economy to a polluting industrial economy to a clean service economy; and (2) the tendency of people with higher incomes to have a stronger preference for environmental quality. Evidence for the existence of the EKC has been questioned from several corners. Only some air quality indicators, especially local pollutants, show evidence of an EKC. Moreover, even where an EKC is empirically observed, there is still no agreement in the literature on the income level at which environmental degradation starts declining. This paper provides an overview of the EKC literature, background history, conceptual insights, policy, and the conceptual and methodological critique.

  12. Pressure from particle image velocimetry for convective flows: a Taylor’s hypothesis approach

    De Kat, R; Ganapathisubramani, B

    2013-01-01

    Taylor’s hypothesis is often applied in turbulent flow analysis to map temporal information into spatial information. Recent efforts in deriving pressure from particle image velocimetry (PIV) have proposed multiple approaches, each with its own weakness and strength. Application of Taylor’s hypothesis allows us to counter the weakness of an Eulerian approach that is described by de Kat and van Oudheusden (2012 Exp. Fluids 52 1089–106). Two different approaches of using Taylor’s hypothesis in determining planar pressure are investigated: one where pressure is determined from volumetric PIV data and one where pressure is determined from time-resolved stereoscopic PIV data. A performance assessment on synthetic data shows that application of Taylor’s hypothesis can improve determination of pressure from PIV data significantly compared with a time-resolved volumetric approach. The technique is then applied to time-resolved PIV data taken in a cross-flow plane of a turbulent jet (Ganapathisubramani et al 2007 Exp. Fluids 42 923–39). Results appear to indicate that pressure can indeed be obtained from PIV data in turbulent convective flows using the Taylor’s hypothesis approach, where there are no other methods to determine pressure. The role of convection velocity in determination of pressure is also discussed. (paper)
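    For readers unfamiliar with the mapping the paper builds on, Taylor’s hypothesis replaces streamwise spatial derivatives with temporal ones through a convection velocity, du/dx ≈ -(1/U_c) du/dt. The sketch below is a minimal illustration on a synthetic single-point time series with an assumed constant convection velocity; it is not the pressure-evaluation procedure of the paper.

```python
# Minimal sketch of Taylor's "frozen turbulence" hypothesis:
#   du/dx ≈ -(1/U_c) * du/dt   for a field convected past the probe at speed U_c.
# The signal and convection velocity below are assumptions for illustration only.
import numpy as np

U_c = 10.0                       # convection velocity [m/s] (assumed)
k = 2 * np.pi / 0.05             # spatial wavenumber of a frozen sine pattern [1/m]
t = np.linspace(0.0, 0.1, 2001)  # time-resolved measurement at one point [s]

u = np.sin(k * (0.0 - U_c * t))  # frozen pattern u(x - U_c*t) sampled at x = 0

dudt = np.gradient(u, t)                      # temporal derivative from the time series
dudx_taylor = -dudt / U_c                     # Taylor's hypothesis estimate of du/dx
dudx_exact = k * np.cos(k * (0.0 - U_c * t))  # exact spatial derivative of the frozen pattern

print("max error:", np.max(np.abs(dudx_taylor - dudx_exact)))
```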

  13. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    Temel, Tugrul T.

    2001-01-01

    This paper adapts an already existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test allows one to approximate the errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica code for the test algorithm.
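    The general recipe, approximating the null distribution of a test statistic by resampling instead of relying on its asymptotic form, can be sketched as follows. The statistic, the data, and the null hypothesis in the sketch are illustrative assumptions; the kernel-regression distance measure used in the paper is not reproduced.

```python
# Generic bootstrap hypothesis test sketch: approximate the null distribution of a
# test statistic by resampling, instead of using its asymptotic distribution.
# The data, the statistic, and the null (zero mean) are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.3, 1.0, 80)                 # observed sample

def statistic(sample):
    """Distance of the sample from the model under H0 (here: mean zero)."""
    return abs(sample.mean())

t_obs = statistic(x)

# Resample under the null by recentering the data so that H0 holds exactly.
x_null = x - x.mean()
B = 5000
t_boot = np.array([statistic(rng.choice(x_null, size=x_null.size, replace=True))
                   for _ in range(B)])

p_value = np.mean(t_boot >= t_obs)           # bootstrap p-value
print(f"observed statistic = {t_obs:.3f}, bootstrap p-value = {p_value:.3f}")
```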

  14. SETI in vivo: testing the we-are-them hypothesis

    Makukov, Maxim A.; Shcherbak, Vladimir I.

    2018-04-01

    After it was proposed that life on Earth might descend from seeding by an earlier extraterrestrial civilization motivated to secure and spread life, some authors noted that this alternative offers a testable implication: microbial seeds could be intentionally supplied with a durable signature that might be found in extant organisms. In particular, it was suggested that the optimal location for such an artefact is the genetic code, as the least evolving part of cells. However, as the mainstream view goes, this scenario is too speculative and cannot be meaningfully tested because encoding/decoding a signature within the genetic code is something ill-defined, so any retrieval attempt is doomed to guesswork. Here we refresh the seeded-Earth hypothesis in light of recent observations, and discuss the motivation for inserting a signature. We then show that `biological SETI' involves even weaker assumptions than traditional SETI and admits a well-defined methodological framework. After assessing the possibility in terms of molecular and evolutionary biology, we formalize the approach and, adopting the standard guideline of SETI that encoding/decoding should follow from first principles and be convention-free, develop a universal retrieval strategy. Applied to the canonical genetic code, it reveals a non-trivial precision structure of interlocked logical and numerical attributes of systematic character (previously we found these heuristically). To assess this result in view of the initial assumption, we perform statistical, comparison, interdependence and semiotic analyses. Statistical analysis reveals no causal connection of the result to evolutionary models of the genetic code, interdependence analysis precludes overinterpretation, and comparison analysis shows that known variations of the code lack any precision-logic structures, in agreement with these variations being post-LUCA (i.e. post-seeding) evolutionary deviations from the canonical code. Finally, semiotic

  15. Constraining supersymmetry with precision data

    Pierce, D.M.; Erler, J.

    1997-01-01

    We discuss the results of a global fit to precision data in supersymmetric models. We consider both gravity- and gauge-mediated models. As the superpartner spectrum becomes light, the global fit to the data typically results in larger values of χ². We indicate the regions of parameter space which are excluded by the data. We discuss the additional effect of the B(B→X_s γ) measurement. Our analysis excludes chargino masses below M_Z in the simplest gauge-mediated model with μ>0, with stronger constraints for larger values of tanβ. copyright 1997 American Institute of Physics

  16. High precision Standard Model Physics

    Magnin, J.

    2009-01-01

    The main goal of the LHCb experiment, one of the four large experiments of the Large Hadron Collider, is to try to answer the question of why Nature prefers matter over antimatter. This will be done by studying the decay of b quarks and their antimatter partners, b-bar, which will be produced in their billions in 14 TeV p-p collisions by the LHC. In addition, as 'beauty' particles mainly decay into charm particles, an interesting program of charm physics will be carried out, allowing quantities such as D⁰-D̄⁰ mixing to be measured with incredible precision.

  17. Electroweak precision measurements in CMS

    Dordevic, Milos

    2017-01-01

    An overview of recent results on electroweak precision measurements from the CMS Collaboration is presented. Studies of the weak boson differential transverse momentum spectra, Z boson angular coefficients, forward-backward asymmetry of Drell-Yan lepton pairs and charge asymmetry of W boson production are made in comparison to the state-of-the-art Monte Carlo generators and theoretical predictions. The results show a good agreement with the Standard Model. As a proof of principle for future W mass measurements, a W-like analysis of the Z boson mass is performed.

  18. Precision proton spectrometers for CMS

    Albrow, Michael

    2013-01-01

    We plan to add high-precision tracking and timing detectors at z = ±240 m to CMS to study exclusive processes p + p → p + X + p at high luminosity. This enables the LHC to be used as a tagged photon-photon collider, with X = l⁺l⁻ and W⁺W⁻, and as a "tagged" gluon-gluon collider (with a spectator gluon) for QCD studies with jets. A second stage at z = 240 m would allow observations of exclusive Higgs boson production.

  19. Precise Analysis of String Expressions

    Christensen, Aske Simon; Møller, Anders; Schwartzbach, Michael Ignatieff

    2003-01-01

    We perform static analysis of Java programs to answer a simple question: which values may occur as results of string expressions? The answers are summarized for each expression by a regular language that is guaranteed to contain all possible values. We present several applications of this analysis, including statically checking the syntax of dynamically generated expressions, such as SQL queries. Our analysis constructs flow graphs from class files and generates a context-free grammar with a nonterminal for each string expression. The language of this grammar is then widened into a regular language ... are automatically produced. We present extensive benchmarks demonstrating that the analysis is efficient and produces results of useful precision.

  20. The Many Faces of Precision

    Andy eClark

    2013-05-01

    Full Text Available An appreciation of the many roles of ‘precision-weighting’ (upping the gain on select populations of prediction error units) opens the door to better accounts of planning and ‘offline simulation’, makes suggestive contact with large bodies of work on embodied and situated cognition, and offers new perspectives on the ‘active brain’. Combined with the complex affordances of language and culture, and operating against the essential backdrop of a variety of more biologically basic ploys and stratagems, the result is a maximally context-sensitive, restless, constantly self-reconfiguring architecture.

  1. Thin films for precision optics

    Araujo, J.F.; Maurici, N.; Castro, J.C. de

    1983-01-01

    The technology of producing dielectric and/or metallic thin films for high precision optical components is discussed. Computer programs were developed in order to calculate and register, graphically, the reflectance and transmittance spectra of multi-layer films. The technology of vacuum evaporation of several materials was implemented in our thin-films laboratory; various films for optics were then developed. The possibility of first calculating film characteristics and then producing the film is a great advantage, since it reduces the time required to produce a new type of film and also reduces the cost of the project. (C.L.B.) [pt
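    Reflectance and transmittance spectra of multilayer films are conventionally computed with the characteristic (transfer) matrix method. The sketch below is a minimal normal-incidence example for a single quarter-wave MgF2 layer on glass; the indices, thickness and wavelength are textbook-style assumptions, not values from the paper.

```python
# Minimal transfer-matrix sketch for the normal-incidence reflectance of a thin-film
# stack. Layer materials, indices and thicknesses are illustrative assumptions only.
import numpy as np

def reflectance(n_layers, d_layers, n_in, n_sub, wavelength):
    """Reflectance of a stack of non-absorbing layers at normal incidence."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength          # phase thickness of the layer
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    # Characteristic-matrix formula for the amplitude reflection coefficient.
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Quarter-wave MgF2 (n = 1.38) antireflection layer on glass (n = 1.52) at 550 nm.
wl = 550e-9
print(reflectance([1.38], [wl / (4 * 1.38)], n_in=1.0, n_sub=1.52, wavelength=wl))
# -> roughly 0.013, the familiar ~1.3% residual reflectance of a single MgF2 coating
```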

  2. Putting people on the map: protecting confidentiality with linked social-spatial data

    Panel on Confidentiality Issues Arising from the Integration of Remotely Sensed and Self-Identifying Data, National Research Council

    2007-01-01

    Precise, accurate spatial information linked to social and behavioral data is revolutionizing social science by opening new questions for investigation and improving understanding of human behavior...

  3. Updating the mild encephalitis hypothesis of schizophrenia.

    Bechter, K

    2013-04-05

    Schizophrenia seems to be a heterogeneous disorder. Emerging evidence indicates that low level neuroinflammation (LLNI) may not occur infrequently. Many infectious agents with low overall pathogenicity are risk factors for psychoses including schizophrenia and for autoimmune disorders. According to the mild encephalitis (ME) hypothesis, LLNI represents the core pathogenetic mechanism in a schizophrenia subgroup that has syndromal overlap with other psychiatric disorders. ME may be triggered by infections, autoimmunity, toxicity, or trauma. A 'late hit' and gene-environment interaction are required to explain major findings about schizophrenia, and both aspects would be consistent with the ME hypothesis. Schizophrenia risk genes stay rather constant within populations despite a resulting low number of progeny; this may result from advantages associated with risk genes, e.g., an improved immune response, which may act protectively within changing environments, although they are associated with the disadvantage of increased susceptibility to psychotic disorders. Specific schizophrenic symptoms may arise with instances of LLNI when certain brain functional systems are involved, in addition to being shaped by pre-existing liability factors. Prodrome phase and the transition to a diseased status may be related to LLNI processes emerging and varying over time. The variability in the course of schizophrenia resembles the varying courses of autoimmune disorders, which result from three required factors: genes, the environment, and the immune system. Preliminary criteria for subgrouping neurodevelopmental, genetic, ME, and other types of schizophrenias are provided. A rare example of ME schizophrenia may be observed in Borna disease virus infection. Neurodevelopmental schizophrenia due to early infections has been estimated by others to explain approximately 30% of cases, but the underlying pathomechanisms of transition to disease remain in question. LLNI (e.g. from

  4. Critiques of the seismic hypothesis and the vegetation stabilization hypothesis for the formation of Mima mounds along the western coast of the U.S.

    Gabet, Emmanuel J.; Burnham, Jennifer L. Horwath; Perron, J. Taylor

    2016-09-01

    A recent paper published in Geomorphology by Gabet et al. (2014) presents the results of a numerical model supporting the hypothesis that burrowing mammals build Mima mounds - small, densely packed hillocks found primarily in the western United States. The model is based on field observations and produces realistic-looking mounds with spatial distributions similar to real moundfields. Alternative explanations have been proposed for these Mima mounds, including formation by seismic shaking and vegetation-controlled erosion and deposition. In this short communication, we present observations from moundfields in the coastal states of the western U.S. that are incompatible with these alternative theories.

  5. [Psychodynamic hypothesis about suicidality in elderly men].

    Lindner, Reinhard

    2010-08-01

    Older men are overrepresented among suicides as a whole. In contrast, only very few elderly men find their way to specialised treatment facilities, and the elderly accept psychotherapy more rarely than younger persons. Presentations on the psychodynamics of suicidality in old men are therefore rare and mostly casuistic. By means of a stepwise reconstructable qualitative case comparison of five randomly chosen elderly suicidal men with ideal types of (younger) suicidal men, concerning biography, suicidal symptoms and transference, psychodynamic hypotheses about suicidality in elderly men are developed. All patients came into psychotherapy in a specialised academic out-patient clinic for the psychodynamic treatment of acute and chronic suicidality. The five elderly suicidal men were predominantly living in long-term, conflict-laden sexual relationships and also had ambivalent relationships with their children. Suicidality in old age refers to lifelong intrapsychic conflicts concerning (male) identity, self-esteem and a core conflict between wishes for fusion and separation. The body takes on a central role in suicidal experience, acting as a defensive instance modified by age and/or physical illness: it brings aggressive and envious impulses to consciousness, but also feelings of emptiness and insecurity, which in turn have to be warded off by projection into the body. In the transference relationship there is, on the one hand, the regular transference and, on the other, an age-specific reversed transference, each with its countertransference reactions. The chosen methodological approach serves the systematic generation of hypotheses with a higher degree of evidence than hypotheses generated from single case studies. Georg Thieme Verlag KG Stuttgart - New York.

  6. Atopic dermatitis and the hygiene hypothesis revisited.

    Flohr, Carsten; Yeo, Lindsey

    2011-01-01

    We published a systematic review on atopic dermatitis (AD) and the hygiene hypothesis in 2005. Since then, the body of literature has grown significantly. We therefore repeated our systematic review to examine the evidence from population-based studies for an association between AD risk and specific infections, childhood immunizations, the use of antibiotics and environmental exposures that lead to a change in microbial burden. Medline was searched from 1966 until June 2010 to identify relevant studies. We found an additional 49 papers suitable for inclusion. There is evidence to support an inverse relationship between AD and endotoxin, early day care, farm animal and dog exposure in early life. Cat exposure in the presence of skin barrier impairment is positively associated with AD. Helminth infection at least partially protects against AD. This is not the case for viral and bacterial infections, but consumption of unpasteurized farm milk seems protective. Routine childhood vaccinations have no effect on AD risk. The positive association between viral infections and AD found in some studies appears confounded by antibiotic prescription, which has been consistently associated with an increase in AD risk. There is convincing evidence for an inverse relationship between helminth infections and AD but no other pathogens. The protective effect seen with early day care, endotoxin, unpasteurized farm milk and animal exposure is likely to be due to a general increase in exposure to non-pathogenic microbes. This would also explain the risk increase associated with the use of broad-spectrum antibiotics. Future studies should assess skin barrier gene mutation carriage and phenotypic skin barrier impairment, as gene-environment interactions are likely to impact on AD risk. Copyright © 2011 S. Karger AG, Basel.

  7. Precision luminosity measurements at LHCb

    Aaij, Roel; Adinolfi, Marco; Affolder, Anthony; Ajaltouni, Ziad; Akar, Simon; Albrecht, Johannes; Alessio, Federico; Alexander, Michael; Ali, Suvayu; Alkhazov, Georgy; Alvarez Cartelle, Paula; Alves Jr, Antonio Augusto; Amato, Sandra; Amerio, Silvia; Amhis, Yasmine; An, Liupan; Anderlini, Lucio; Anderson, Jonathan; Andreassen, Rolf; Andreotti, Mirco; Andrews, Jason; Appleby, Robert; Aquines Gutierrez, Osvaldo; Archilli, Flavio; Artamonov, Alexander; Artuso, Marina; Aslanides, Elie; Auriemma, Giulio; Baalouch, Marouen; Bachmann, Sebastian; Back, John; Badalov, Alexey; Baesso, Clarissa; Baldini, Wander; Barlow, Roger; Barschel, Colin; Barsuk, Sergey; Barter, William; Batozskaya, Varvara; Battista, Vincenzo; Bay, Aurelio; Beaucourt, Leo; Beddow, John; Bedeschi, Franco; Bediaga, Ignacio; Belogurov, Sergey; Belous, Konstantin; Belyaev, Ivan; Ben-Haim, Eli; Bencivenni, Giovanni; Benson, Sean; Benton, Jack; Berezhnoy, Alexander; Bernet, Roland; Bettler, Marc-Olivier; van Beuzekom, Martinus; Bien, Alexander; Bifani, Simone; Bird, Thomas; Bizzeti, Andrea; Bjørnstad, Pål Marius; Blake, Thomas; Blanc, Frédéric; Blouw, Johan; Blusk, Steven; Bocci, Valerio; Bondar, Alexander; Bondar, Nikolay; Bonivento, Walter; Borghi, Silvia; Borgia, Alessandra; Borsato, Martino; Bowcock, Themistocles; Bowen, Espen Eie; Bozzi, Concezio; Brambach, Tobias; Bressieux, Joël; Brett, David; Britsch, Markward; Britton, Thomas; Brodzicka, Jolanta; Brook, Nicholas; Brown, Henry; Bursche, Albert; Buytaert, Jan; Cadeddu, Sandro; Calabrese, Roberto; Calvi, Marta; Calvo Gomez, Miriam; Campana, Pierluigi; Campora Perez, Daniel; Carbone, Angelo; Carboni, Giovanni; Cardinale, Roberta; Cardini, Alessandro; Carson, Laurence; Carvalho Akiba, Kazuyoshi; Casse, Gianluigi; Cassina, Lorenzo; Castillo Garcia, Lucia; Cattaneo, Marco; Cauet, Christophe; Cenci, Riccardo; Charles, Matthew; Charpentier, Philippe; Chefdeville, Maximilien; Chen, Shanzhen; Cheung, Shu-Faye; Chiapolini, Nicola; Chrzaszcz, Marcin; Ciba, Krzystof; Cid Vidal, Xabier; Ciezarek, Gregory; Clarke, Peter; Clemencic, Marco; Cliff, Harry; Closier, Joel; Coco, Victor; Cogan, Julien; Cogneras, Eric; Cojocariu, Lucian; Collazuol, Gianmaria; Collins, Paula; Comerma-Montells, Albert; Contu, Andrea; Cook, Andrew; Coombes, Matthew; Coquereau, Samuel; Corti, Gloria; Corvo, Marco; Counts, Ian; Couturier, Benjamin; Cowan, Greig; Craik, Daniel Charles; Cruz Torres, Melissa Maria; Cunliffe, Samuel; Currie, Robert; D'Ambrosio, Carmelo; Dalseno, Jeremy; David, Pascal; David, Pieter; Davis, Adam; De Bruyn, Kristof; De Capua, Stefano; De Cian, Michel; De Miranda, Jussara; De Paula, Leandro; De Silva, Weeraddana; De Simone, Patrizia; Dean, Cameron Thomas; Decamp, Daniel; Deckenhoff, Mirko; Del Buono, Luigi; Déléage, Nicolas; Derkach, Denis; Deschamps, Olivier; Dettori, Francesco; Di Canto, Angelo; Dijkstra, Hans; Donleavy, Stephanie; Dordei, Francesca; Dorigo, Mirco; Dosil Suárez, Alvaro; Dossett, David; Dovbnya, Anatoliy; Dreimanis, Karlis; Dujany, Giulio; Dupertuis, Frederic; Durante, Paolo; Dzhelyadin, Rustem; Dziurda, Agnieszka; Dzyuba, Alexey; Easo, Sajan; Egede, Ulrik; Egorychev, Victor; Eidelman, Semen; Eisenhardt, Stephan; Eitschberger, Ulrich; Ekelhof, Robert; Eklund, Lars; El Rifai, Ibrahim; Elsasser, Christian; Ely, Scott; Esen, Sevda; Evans, Hannah Mary; Evans, Timothy; Falabella, Antonio; Färber, Christian; Farinelli, Chiara; Farley, Nathanael; Farry, Stephen; Fay, Robert; Ferguson, Dianne; Fernandez Albor, Victor; Ferreira Rodrigues, Fernando; Ferro-Luzzi, Massimiliano; 
Filippov, Sergey; Fiore, Marco; Fiorini, Massimiliano; Firlej, Miroslaw; Fitzpatrick, Conor; Fiutowski, Tomasz; Fol, Philip; Fontana, Marianna; Fontanelli, Flavio; Forty, Roger; Francisco, Oscar; Frank, Markus; Frei, Christoph; Frosini, Maddalena; Fu, Jinlin; Furfaro, Emiliano; Gallas Torreira, Abraham; Galli, Domenico; Gallorini, Stefano; Gambetta, Silvia; Gandelman, Miriam; Gandini, Paolo; Gao, Yuanning; García Pardiñas, Julián; Garofoli, Justin; Garra Tico, Jordi; Garrido, Lluis; Gascon, David; Gaspar, Clara; Gauld, Rhorry; Gavardi, Laura; Geraci, Angelo; Gersabeck, Evelina; Gersabeck, Marco; Gershon, Timothy; Ghez, Philippe; Gianelle, Alessio; Gianì, Sebastiana; Gibson, Valerie; Giubega, Lavinia-Helena; Gligorov, V.V.; Göbel, Carla; Golubkov, Dmitry; Golutvin, Andrey; Gomes, Alvaro; Gotti, Claudio; Grabalosa Gándara, Marc; Graciani Diaz, Ricardo; Granado Cardoso, Luis Alberto; Graugés, Eugeni; Graziani, Giacomo; Grecu, Alexandru; Greening, Edward; Gregson, Sam; Griffith, Peter; Grillo, Lucia; Grünberg, Oliver; Gui, Bin; Gushchin, Evgeny; Guz, Yury; Gys, Thierry; Hadjivasiliou, Christos; Haefeli, Guido; Haen, Christophe; Haines, Susan; Hall, Samuel; Hamilton, Brian; Hampson, Thomas; Han, Xiaoxue; Hansmann-Menzemer, Stephanie; Harnew, Neville; Harnew, Samuel; Harrison, Jonathan; He, Jibo; Head, Timothy; Heijne, Veerle; Hennessy, Karol; Henrard, Pierre; Henry, Louis; Hernando Morata, Jose Angel; van Herwijnen, Eric; Heß, Miriam; Hicheur, Adlène; Hill, Donal; Hoballah, Mostafa; Hombach, Christoph; Hulsbergen, Wouter; Hunt, Philip; Hussain, Nazim; Hutchcroft, David; Hynds, Daniel; Idzik, Marek; Ilten, Philip; Jacobsson, Richard; Jaeger, Andreas; Jalocha, Pawel; Jans, Eddy; Jaton, Pierre; Jawahery, Abolhassan; Jing, Fanfan; John, Malcolm; Johnson, Daniel; Jones, Christopher; Joram, Christian; Jost, Beat; Jurik, Nathan; Kandybei, Sergii; Kanso, Walaa; Karacson, Matthias; Karbach, Moritz; Karodia, Sarah; Kelsey, Matthew; Kenyon, Ian; Ketel, Tjeerd; Khanji, Basem; Khurewathanakul, Chitsanu; Klaver, Suzanne; Klimaszewski, Konrad; Kochebina, Olga; Kolpin, Michael; Komarov, Ilya; Koopman, Rose; Koppenburg, Patrick; Korolev, Mikhail; Kozlinskiy, Alexandr; Kravchuk, Leonid; Kreplin, Katharina; Kreps, Michal; Krocker, Georg; Krokovny, Pavel; Kruse, Florian; Kucewicz, Wojciech; Kucharczyk, Marcin; Kudryavtsev, Vasily; Kurek, Krzysztof; Kvaratskheliya, Tengiz; La Thi, Viet Nga; Lacarrere, Daniel; Lafferty, George; Lai, Adriano; Lambert, Dean; Lambert, Robert W; Lanfranchi, Gaia; Langenbruch, Christoph; Langhans, Benedikt; Latham, Thomas; Lazzeroni, Cristina; Le Gac, Renaud; van Leerdam, Jeroen; Lees, Jean-Pierre; Lefèvre, Regis; Leflat, Alexander; Lefrançois, Jacques; Leo, Sabato; Leroy, Olivier; Lesiak, Tadeusz; Leverington, Blake; Li, Yiming; Likhomanenko, Tatiana; Liles, Myfanwy; Lindner, Rolf; Linn, Christian; Lionetto, Federica; Liu, Bo; Lohn, Stefan; Longstaff, Iain; Lopes, Jose; Lopez-March, Neus; Lowdon, Peter; Lu, Haiting; Lucchesi, Donatella; Luo, Haofei; Lupato, Anna; Luppi, Eleonora; Lupton, Oliver; Machefert, Frederic; Machikhiliyan, Irina V; Maciuc, Florin; Maev, Oleg; Malde, Sneha; Malinin, Alexander; Manca, Giulia; Mancinelli, Giampiero; Mapelli, Alessandro; Maratas, Jan; Marchand, Jean François; Marconi, Umberto; Marin Benito, Carla; Marino, Pietro; Märki, Raphael; Marks, Jörg; Martellotti, Giuseppe; Martens, Aurelien; Martín Sánchez, Alexandra; Martinelli, Maurizio; Martinez Santos, Diego; Martinez Vidal, Fernando; Martins Tostes, Danielle; Massafferri, André; Matev, Rosen; Mathe, 
Zoltan; Matteuzzi, Clara; Maurin, Brice; Mazurov, Alexander; McCann, Michael; McCarthy, James; McNab, Andrew; McNulty, Ronan; McSkelly, Ben; Meadows, Brian; Meier, Frank; Meissner, Marco; Merk, Marcel; Milanes, Diego Alejandro; Minard, Marie-Noelle; Moggi, Niccolò; Molina Rodriguez, Josue; Monteil, Stephane; Morandin, Mauro; Morawski, Piotr; Mordà, Alessandro; Morello, Michael Joseph; Moron, Jakub; Morris, Adam Benjamin; Mountain, Raymond; Muheim, Franz; Müller, Katharina; Mussini, Manuel; Muster, Bastien; Naik, Paras; Nakada, Tatsuya; Nandakumar, Raja; Nasteva, Irina; Needham, Matthew; Neri, Nicola; Neubert, Sebastian; Neufeld, Niko; Neuner, Max; Nguyen, Anh Duc; Nguyen, Thi-Dung; Nguyen-Mau, Chung; Nicol, Michelle; Niess, Valentin; Niet, Ramon; Nikitin, Nikolay; Nikodem, Thomas; Novoselov, Alexey; O'Hanlon, Daniel Patrick; Oblakowska-Mucha, Agnieszka; Obraztsov, Vladimir; Oggero, Serena; Ogilvy, Stephen; Okhrimenko, Oleksandr; Oldeman, Rudolf; Onderwater, Gerco; Orlandea, Marius; Otalora Goicochea, Juan Martin; Owen, Patrick; Oyanguren, Maria Arantza; Pal, Bilas Kanti; Palano, Antimo; Palombo, Fernando; Palutan, Matteo; Panman, Jacob; Papanestis, Antonios; Pappagallo, Marco; Pappalardo, Luciano; Parkes, Christopher; Parkinson, Christopher John; Passaleva, Giovanni; Patel, Girish; Patel, Mitesh; Patrignani, Claudia; Pearce, Alex; Pellegrino, Antonio; Pepe Altarelli, Monica; Perazzini, Stefano; Perret, Pascal; Perrin-Terrin, Mathieu; Pescatore, Luca; Pesen, Erhan; Pessina, Gianluigi; Petridis, Konstantin; Petrolini, Alessandro; Picatoste Olloqui, Eduardo; Pietrzyk, Boleslaw; Pilař, Tomas; Pinci, Davide; Pistone, Alessandro; Playfer, Stephen; Plo Casasus, Maximo; Polci, Francesco; Poluektov, Anton; Polycarpo, Erica; Popov, Alexander; Popov, Dmitry; Popovici, Bogdan; Potterat, Cédric; Price, Eugenia; Price, Joseph David; Prisciandaro, Jessica; Pritchard, Adrian; Prouve, Claire; Pugatch, Valery; Puig Navarro, Albert; Punzi, Giovanni; Qian, Wenbin; Rachwal, Bartolomiej; Rademacker, Jonas; Rakotomiaramanana, Barinjaka; Rama, Matteo; Rangel, Murilo; Raniuk, Iurii; Rauschmayr, Nathalie; Raven, Gerhard; Redi, Federico; Reichert, Stefanie; Reid, Matthew; dos Reis, Alberto; Ricciardi, Stefania; Richards, Sophie; Rihl, Mariana; Rinnert, Kurt; Rives Molina, Vincente; Robbe, Patrick; Rodrigues, Ana Barbara; Rodrigues, Eduardo; Rodriguez Perez, Pablo; Roiser, Stefan; Romanovsky, Vladimir; Romero Vidal, Antonio; Rotondo, Marcello; Rouvinet, Julien; Ruf, Thomas; Ruiz, Hugo; Ruiz Valls, Pablo; Saborido Silva, Juan Jose; Sagidova, Naylya; Sail, Paul; Saitta, Biagio; Salustino Guimaraes, Valdir; Sanchez Mayordomo, Carlos; Sanmartin Sedes, Brais; Santacesaria, Roberta; Santamarina Rios, Cibran; Santovetti, Emanuele; Sarti, Alessio; Satriano, Celestina; Satta, Alessia; Saunders, Daniel Martin; Savrina, Darya; Schiller, Manuel; Schindler, Heinrich; Schlupp, Maximilian; Schmelling, Michael; Schmidt, Burkhard; Schneider, Olivier; Schopper, Andreas; Schubiger, Maxime; Schune, Marie Helene; Schwemmer, Rainer; Sciascia, Barbara; Sciubba, Adalberto; Semennikov, Alexander; Sepp, Indrek; Serra, Nicola; Serrano, Justine; Sestini, Lorenzo; Seyfert, Paul; Shapkin, Mikhail; Shapoval, Illya; Shcheglov, Yury; Shears, Tara; Shekhtman, Lev; Shevchenko, Vladimir; Shires, Alexander; Silva Coutinho, Rafael; Simi, Gabriele; Sirendi, Marek; Skidmore, Nicola; Skwarnicki, Tomasz; Smith, Anthony; Smith, Edmund; Smith, Eluned; Smith, Jackson; Smith, Mark; Snoek, Hella; Sokoloff, Michael; Soler, Paul; Soomro, Fatima; Souza, Daniel; 
Souza De Paula, Bruno; Spaan, Bernhard; Sparkes, Ailsa; Spradlin, Patrick; Sridharan, Srikanth; Stagni, Federico; Stahl, Marian; Stahl, Sascha; Steinkamp, Olaf; Stenyakin, Oleg; Stevenson, Scott; Stoica, Sabin; Stone, Sheldon; Storaci, Barbara; Stracka, Simone; Straticiuc, Mihai; Straumann, Ulrich; Stroili, Roberto; Subbiah, Vijay Kartik; Sun, Liang; Sutcliffe, William; Swientek, Krzysztof; Swientek, Stefan; Syropoulos, Vasileios; Szczekowski, Marek; Szczypka, Paul; Szumlak, Tomasz; T'Jampens, Stephane; Teklishyn, Maksym; Tellarini, Giulia; Teubert, Frederic; Thomas, Christopher; Thomas, Eric; van Tilburg, Jeroen; Tisserand, Vincent; Tobin, Mark; Tolk, Siim; Tomassetti, Luca; Tonelli, Diego; Topp-Joergensen, Stig; Torr, Nicholas; Tournefier, Edwige; Tourneur, Stephane; Tran, Minh Tâm; Tresch, Marco; Trisovic, Ana; Tsaregorodtsev, Andrei; Tsopelas, Panagiotis; Tuning, Niels; Ubeda Garcia, Mario; Ukleja, Artur; Ustyuzhanin, Andrey; Uwer, Ulrich; Vacca, Claudia; Vagnoni, Vincenzo; Valenti, Giovanni; Vallier, Alexis; Vazquez Gomez, Ricardo; Vazquez Regueiro, Pablo; Vázquez Sierra, Carlos; Vecchi, Stefania; Velthuis, Jaap; Veltri, Michele; Veneziano, Giovanni; Vesterinen, Mika; Viaud, Benoit; Vieira, Daniel; Vieites Diaz, Maria; Vilasis-Cardona, Xavier; Vollhardt, Achim; Volyanskyy, Dmytro; Voong, David; Vorobyev, Alexey; Vorobyev, Vitaly; Voß, Christian; de Vries, Jacco; Waldi, Roland; Wallace, Charlotte; Wallace, Ronan; Walsh, John; Wandernoth, Sebastian; Wang, Jianchun; Ward, David; Watson, Nigel; Websdale, David; Whitehead, Mark; Wicht, Jean; Wiedner, Dirk; Wilkinson, Guy; Williams, Matthew; Williams, Mike; Wilschut, Hans; Wilson, Fergus; Wimberley, Jack; Wishahi, Julian; Wislicki, Wojciech; Witek, Mariusz; Wormser, Guy; Wotton, Stephen; Wright, Simon; Wyllie, Kenneth; Xie, Yuehong; Xing, Zhou; Xu, Zhirui; Yang, Zhenwei; Yuan, Xuhao; Yushchenko, Oleg; Zangoli, Maria; Zavertyaev, Mikhail; Zhang, Liming; Zhang, Wen Chao; Zhang, Yanxi; Zhelezov, Alexey; Zhokhov, Anatoly; Zhong, Liang; Zvyagin, Alexander

    2014-12-05

    Measuring cross-sections at the LHC requires the luminosity to be determined accurately at each centre-of-mass energy $\sqrt{s}$. In this paper results are reported from the luminosity calibrations carried out at the LHC interaction point 8 with the LHCb detector for $\sqrt{s}$ = 2.76, 7 and 8 TeV (proton-proton collisions) and for $\sqrt{s_{NN}}$ = 5 TeV (proton-lead collisions). Both the "van der Meer scan" and "beam-gas imaging" luminosity calibration methods were employed. It is observed that the beam density profile cannot always be described by a function that is factorizable in the two transverse coordinates. The introduction of a two-dimensional description of the beams improves significantly the consistency of the results. For proton-proton interactions at $\sqrt{s}$ = 8 TeV a relative precision of the luminosity calibration of 1.47% is obtained using van der Meer scans and 1.43% using beam-gas imaging, resulting in a combined precision of 1.12%. Applying the calibration to the full data set determin...
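    For orientation, a van der Meer scan extracts the effective beam-overlap widths from the rate profile as the beams are displaced across each other; for factorizable Gaussian beams the luminosity per colliding bunch pair follows from L = f n1 n2 / (2π Σx Σy). The numbers in the sketch below are placeholders, not LHCb calibration values, and the non-factorizable corrections discussed in the paper are ignored.

```python
# Van der Meer scan estimate of luminosity for factorizable Gaussian beams:
#   L = f_rev * N_b * n1 * n2 / (2*pi * Sigma_x * Sigma_y)
# All beam parameters below are illustrative placeholders, not measured LHCb values.
import math

f_rev = 11245.0          # LHC revolution frequency [Hz]
N_b = 1                  # number of colliding bunch pairs (placeholder)
n1 = n2 = 1.0e11         # protons per bunch (placeholder)
sigma_x = 60e-6          # effective overlap width in x from the scan [m] (placeholder)
sigma_y = 60e-6          # effective overlap width in y from the scan [m] (placeholder)

L = f_rev * N_b * n1 * n2 / (2 * math.pi * sigma_x * sigma_y)   # [m^-2 s^-1]
print(f"L = {L:.3e} m^-2 s^-1 = {L * 1e-4:.3e} cm^-2 s^-1")
```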

  8. Laser precision microfabrication in Japan

    Miyamoto, Isamu; Ooie, Toshihiko; Takeno, Shozui

    2000-11-01

    Electronic devices such as mobile phones and microcomputers have rapidly expanded their market in recent years due to their enhanced performance, downsizing and cost reduction. This has been realized by innovation in the precision micro-fabrication technology of semiconductors and printed wiring boards (PWB), where laser technologies such as lithography, drilling, trimming, welding and soldering play an important role. In photolithography, for instance, KrF excimer lasers with a resolution of 0.18 micrometers have been used in production instead of mercury lamps. Laser drilling of PWB has increased to over 1000 holes per second, and approximately 800 laser drilling systems for PWB are expected to be delivered in the world market this year; most of these laser processing systems are manufactured in Japan. Trends in laser micro-fabrication in Japanese industry are described, along with recent topics of R&D, government-supported projects and future tasks of industrial laser precision micro-fabrication, on the basis of a survey conducted by the Japan Laser Processing Society.

  9. Precision experiments in electroweak interactions

    Swartz, M.L.

    1990-03-01

    The electroweak theory of Glashow, Weinberg, and Salam (GWS) has become one of the twin pillars upon which our understanding of all particle physics phenomena rests. It is a brilliant achievement that qualitatively and quantitatively describes all of the vast quantity of experimental data that have been accumulated over some forty years. Note that the word quantitatively must be qualified. The low energy limiting cases of the GWS theory, Quantum Electrodynamics and the V-A Theory of Weak Interactions, have withstood rigorous testing. The high energy synthesis of these ideas, the GWS theory, has not yet been subjected to comparably precise scrutiny. The recent operation of a new generation of proton-antiproton (p̄p) and electron-positron (e⁺e⁻) colliders has made it possible to produce and study large samples of the electroweak gauge bosons W± and Z⁰. We expect that these facilities will enable very precise tests of the GWS theory to be performed in the near future. In keeping with the theme of this Institute, Physics at the 100 GeV Mass Scale, these lectures will explore the current status and the near-future prospects of these experiments

  10. Antihydrogen production and precision experiments

    Nieto, M.M.; Goldman, T.; Holzscheiter, M.H.

    1996-01-01

    The study of CPT invariance with the highest achievable precision in all particle sectors is of fundamental importance for physics. Equally important is the question of the gravitational acceleration of antimatter. In recent years, impressive progress has been achieved in capturing antiprotons in specially designed Penning traps, in cooling them to energies of a few milli-electron volts, and in storing them for hours in a small volume of space. Positrons have been accumulated in large numbers in similar traps, and low energy positron or positronium beams have been generated. Finally, steady progress has been made in trapping and cooling neutral atoms. Thus the ingredients to form antihydrogen at rest are at hand. Once antihydrogen atoms have been captured at low energy, spectroscopic methods can be applied to interrogate their atomic structure with extremely high precision and compare it to its normal matter counterpart, the hydrogen atom. Especially the 1S-2S transition, with a lifetime of the excited state of 122 msec and thereby a natural linewidth of 5 parts in 10¹⁶, offers in principle the possibility to directly compare matter and antimatter properties at a level of 1 part in 10¹⁶
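    The quoted relative linewidth follows directly from the 122 ms lifetime: the natural linewidth is Δν = 1/(2πτ), and dividing by the 1S-2S transition frequency (about 2.466 × 10¹⁵ Hz, a standard value not given in the record) reproduces the figure of a few parts in 10¹⁶.

```python
# Check of the quoted natural linewidth of the (anti)hydrogen 1S-2S transition.
# The transition frequency is a standard literature value, not taken from the record.
import math

tau = 0.122                    # lifetime of the 2S state [s] (from the abstract)
nu_1s2s = 2.466e15             # 1S-2S two-photon transition frequency [Hz] (assumed)

delta_nu = 1 / (2 * math.pi * tau)          # natural linewidth [Hz]
print(f"linewidth = {delta_nu:.2f} Hz, relative width = {delta_nu / nu_1s2s:.1e}")
# -> roughly 1.3 Hz, i.e. about 5 parts in 10^16 of the transition frequency
```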

  11. Laser fusion and precision engineering

    Nakai, Sadao

    1989-01-01

    The development of laser nuclear fusion energy, aimed at attaining energy self-sufficiency for Japan and establishing a long-term national perspective, rests on a wide range of high-level science and technology. Large expectations are therefore placed on its promotion as a powerful driver for the development of creative science and technology, which is particularly necessary in Japan. Research on laser nuclear fusion is advancing steadily in the elucidation of the physics of pellet implosion, its basic concept, and of the compressed plasma parameters. In September 1986 a neutron yield of 10¹³ was achieved, and in October 1988 high-density compression to 600 times solid density. Based on these results, laser nuclear fusion research is now in a position to begin pursuing the ignition condition for nuclear fusion and the realization of break-even. Optical components, high-power laser technology, fuel pellet production, high-resolution measurement, the simulation of implosion using a supercomputer and so on are closely related to precision engineering. In this report, the mechanism of laser nuclear fusion, the present status of its research, and the related basic technologies and precision engineering are described. (K.I.)

  12. Spatial filtering and thermocouple spatial filter

    Han Bing; Tong Yunxian

    1989-12-01

    The design and study of a thermocouple spatial filter have been conducted for the flow measurement of integrated reactor coolant. The fundamental principle of spatial filtering, and mathematical descriptions and analyses of the thermocouple spatial filter, are given

  13. A Modified Version of Taylor’s Hypothesis for Solar Probe Plus Observations

    Klein, Kristopher G.; Perez, Jean C.; Verscharen, Daniel; Mallet, Alfred; Chandran, Benjamin D. G.

    2015-03-01

    The Solar Probe Plus (SPP) spacecraft will explore the near-Sun environment, reaching heliocentric distances less than 10 $R_\odot$. Near Earth, spacecraft measurements of fluctuating velocities and magnetic fields taken in the time domain are translated into information about the spatial structure of the solar wind via Taylor’s “frozen turbulence” hypothesis. Near the perihelion of SPP, however, the solar-wind speed is comparable to the Alfvén speed, and Taylor’s hypothesis in its usual form does not apply. In this paper, we show that under certain assumptions, a modified version of Taylor’s hypothesis can be recovered in the near-Sun region. We consider only the transverse, non-compressive component of the fluctuations at length scales exceeding the proton gyroradius, and we describe these fluctuations using an approximate theoretical framework developed by Heinemann and Olbert. We show that fluctuations propagating away from the Sun in the plasma frame obey a relation analogous to Taylor’s hypothesis when $V_{\mathrm{sc},\perp} \gg z^-$ and $z^+ \gg z^-$, where $V_{\mathrm{sc},\perp}$ is the component of the spacecraft velocity perpendicular to the mean magnetic field and $z^+$ ($z^-$) is the Elsasser variable corresponding to transverse, non-compressive fluctuations propagating away from (toward) the Sun in the plasma frame. Observations and simulations suggest that, in the near-Sun solar wind, the above inequalities are satisfied and $z^+$ fluctuations account for most of the fluctuation energy. The modified form of Taylor’s hypothesis that we derive may thus make it possible to characterize the spatial structure of the energetically dominant component of the turbulence encountered by SPP.
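    A minimal numeric check of the applicability conditions can be written as below; the spacecraft speed and Elsasser amplitudes are placeholder estimates, not SPP predictions from the paper.

```python
# Quick check of the applicability conditions for the modified Taylor hypothesis
# near SPP perihelion. All speeds and amplitudes are placeholder estimates.
v_sc_perp = 180.0   # spacecraft velocity perpendicular to B [km/s] (placeholder)
z_plus = 200.0      # rms amplitude of outward Elsasser fluctuations z+ [km/s] (placeholder)
z_minus = 20.0      # rms amplitude of inward Elsasser fluctuations z- [km/s] (placeholder)

cond1 = v_sc_perp > 5 * z_minus   # V_sc,perp >> z-  (factor 5 as a rough ">>" proxy)
cond2 = z_plus > 5 * z_minus      # z+ >> z-
print("modified Taylor hypothesis applicable:", cond1 and cond2)
```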

  14. Problems with the Younger Dryas Boundary (YDB) Impact Hypothesis

    Boslough, M.

    2009-12-01

    One breakthrough of 20th-century Earth science was the recognition of impacts as an important geologic process. The most obvious result is a crater. There are more than 170 confirmed terrestrial impact structures with a non-uniform spatial distribution suggesting more to be found. Many have been erased by tectonics and erosion. Deep water impacts do not form craters, and craters in ice sheets disappear when the ice melts. There is growing speculation that such hidden impacts have caused frequent major environmental events of the Holocene, but this is inconsistent with the astronomically-constrained population of Earth-crossing asteroids. Impacts can have consequences much more significant than excavation of a crater. The K/T boundary mass extinction is attributed to the environmental effects of a major impact, and some researchers argue that other extinctions, abrupt climate changes, and even civilization collapses have resulted from impacts. Nuclear winter models suggest that 2-km diameter asteroids exceed a "global catastrophe threshold" by injecting sufficient dust into the stratosphere to cause short-term climate changes, but would not necessarily collapse most natural ecosystems or cause mass extinctions. Globally-catastrophic impacts recur on timescales of about one million years. The 1994 collision of Comet Shoemaker-Levy 9 with Jupiter led us to recognize the significance of terrestrial airbursts caused by objects exploding violently in Earth’s atmosphere. We have invoked airbursts to explain rare forms of non-volcanic glasses and melts by using high-resolution computational models to improve our understanding of atmospheric explosions, and have suggested that multiple airbursts from fragmented impactors could be responsible for regional effects. Our models have been cited in support of the widely-publicized YDB impact hypothesis. Proponents claim that a broken comet exploded over North America, with some fragments cratering the Laurentide Ice Sheet. They

  15. The zinc dyshomeostasis hypothesis of Alzheimer's disease.

    Travis J A Craddock

    Full Text Available Alzheimer's disease (AD) is the most common form of dementia in the elderly. Hallmark AD neuropathology includes extracellular amyloid plaques composed largely of the amyloid-β protein (Aβ), intracellular neurofibrillary tangles (NFTs) composed of hyper-phosphorylated microtubule-associated protein tau (MAP-tau), and microtubule destabilization. Early-onset autosomal dominant AD genes are associated with excessive Aβ accumulation, however cognitive impairment best correlates with NFTs and disrupted microtubules. The mechanisms linking Aβ and NFT pathologies in AD are unknown. Here, we propose that sequestration of zinc by Aβ-amyloid deposits (Aβ oligomers and plaques) not only drives Aβ aggregation, but also disrupts zinc homeostasis in zinc-enriched brain regions important for memory and vulnerable to AD pathology, resulting in intra-neuronal zinc levels, which are either too low, or excessively high. To evaluate this hypothesis, we 1) used molecular modeling of zinc binding to the microtubule component protein tubulin, identifying specific, high-affinity zinc binding sites that influence side-to-side tubulin interaction, the sensitive link in microtubule polymerization and stability. We also 2) performed kinetic modeling showing zinc distribution in extra-neuronal Aβ deposits can reduce intra-neuronal zinc binding to microtubules, destabilizing microtubules. Finally, we 3) used metallomic imaging mass spectrometry (MIMS) to show anatomically-localized and age-dependent zinc dyshomeostasis in specific brain regions of Tg2576 transgenic mice, a model for AD. We found excess zinc in brain regions associated with memory processing and NFT pathology. Overall, we present a theoretical framework and support for a new theory of AD linking extra-neuronal Aβ amyloid to intra-neuronal NFTs and cognitive dysfunction. The connection, we propose, is based on β-amyloid-induced alterations in zinc ion concentration inside neurons affecting stability of

  16. The zinc dyshomeostasis hypothesis of Alzheimer's disease.

    Craddock, Travis J A; Tuszynski, Jack A; Chopra, Deepak; Casey, Noel; Goldstein, Lee E; Hameroff, Stuart R; Tanzi, Rudolph E

    2012-01-01

    Alzheimer's disease (AD) is the most common form of dementia in the elderly. Hallmark AD neuropathology includes extracellular amyloid plaques composed largely of the amyloid-β protein (Aβ), intracellular neurofibrillary tangles (NFTs) composed of hyper-phosphorylated microtubule-associated protein tau (MAP-tau), and microtubule destabilization. Early-onset autosomal dominant AD genes are associated with excessive Aβ accumulation, however cognitive impairment best correlates with NFTs and disrupted microtubules. The mechanisms linking Aβ and NFT pathologies in AD are unknown. Here, we propose that sequestration of zinc by Aβ-amyloid deposits (Aβ oligomers and plaques) not only drives Aβ aggregation, but also disrupts zinc homeostasis in zinc-enriched brain regions important for memory and vulnerable to AD pathology, resulting in intra-neuronal zinc levels, which are either too low, or excessively high. To evaluate this hypothesis, we 1) used molecular modeling of zinc binding to the microtubule component protein tubulin, identifying specific, high-affinity zinc binding sites that influence side-to-side tubulin interaction, the sensitive link in microtubule polymerization and stability. We also 2) performed kinetic modeling showing zinc distribution in extra-neuronal Aβ deposits can reduce intra-neuronal zinc binding to microtubules, destabilizing microtubules. Finally, we 3) used metallomic imaging mass spectrometry (MIMS) to show anatomically-localized and age-dependent zinc dyshomeostasis in specific brain regions of Tg2576 transgenic mice, a model for AD. We found excess zinc in brain regions associated with memory processing and NFT pathology. Overall, we present a theoretical framework and support for a new theory of AD linking extra-neuronal Aβ amyloid to intra-neuronal NFTs and cognitive dysfunction. The connection, we propose, is based on β-amyloid-induced alterations in zinc ion concentration inside neurons affecting stability of polymerized
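    The kinetic-modelling step described above amounts to a competition between extracellular Aβ deposits and intracellular tubulin sites for a limited zinc pool. The toy two-pool model below illustrates that idea only; every rate constant, concentration and site capacity is invented for the sketch and none is taken from the paper.

```python
# Toy two-pool competition model for zinc partitioning between extracellular
# Abeta deposits and intracellular tubulin binding sites. All parameters are
# invented for illustration; this is not the kinetic model used in the paper.
def simulate(abeta_sites, tubulin_sites=1.0, zn_total=1.0,
             k_on_abeta=5.0, k_on_tub=1.0, k_off=0.5, dt=1e-3, steps=20000):
    zn_free, zn_abeta, zn_tub = zn_total, 0.0, 0.0
    for _ in range(steps):
        # Mass-action binding/unbinding for each pool, integrated with forward Euler.
        d_abeta = k_on_abeta * zn_free * (abeta_sites - zn_abeta) - k_off * zn_abeta
        d_tub = k_on_tub * zn_free * (tubulin_sites - zn_tub) - k_off * zn_tub
        zn_abeta += dt * d_abeta
        zn_tub += dt * d_tub
        zn_free = zn_total - zn_abeta - zn_tub
    return zn_tub

for sites in (0.0, 0.5, 2.0, 5.0):
    print(f"Abeta binding capacity {sites:4.1f} -> zinc bound to tubulin {simulate(sites):.3f}")
```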

  17. The Spatial Politics of Spatial Representation

    Olesen, Kristian; Richardson, Tim

    2011-01-01

    This paper explores the interplay between the spatial politics of new governance landscapes and innovations in the use of spatial representations in planning. The central premise is that planning experiments with new relational approaches become enmeshed in spatial politics. The case of strategic spatial planning in Denmark reveals how fuzzy spatial representations and relational spatial concepts are being used to depoliticise strategic spatial planning processes and to camouflage spatial politics. The paper concludes that, while relational geography might play an important role in building...

  18. The Precision Field Lysimeter Concept

    Fank, J.

    2009-04-01

    The understanding and interpretation of leaching processes have improved significantly during the past decades. Unlike laboratory experiments, which are mostly performed under very controlled conditions (e.g. homogeneous, uniform packing of pre-treated test material, saturated steady-state flow conditions, and controlled uniform hydraulic conditions), lysimeter experiments generally simulate actual field conditions. Lysimeters may be classified according to different criteria such as the type of soil block used (monolithic or reconstructed), the drainage (by gravity or vacuum, or a water table may be maintained), or weighing versus non-weighing lysimeters. In 2004, experimental investigations were set up to assess the impact of different farming systems on the groundwater quality of the shallow floodplain aquifer of the river Mur in Wagna (Styria, Austria). The sediment is characterized by a thin layer (30 - 100 cm) of sandy Dystric Cambisol and underlying gravel and sand. Three precisely weighing equilibrium tension block lysimeters have been installed in agricultural test fields to compare water flow and solute transport under (i) organic farming, (ii) conventional low-input farming and (iii) extensification by mulching grass. Specific monitoring equipment is used to reduce the well-known shortcomings of lysimeter investigations: the lysimeter core is excavated as an undisturbed monolithic block (circular, 1 m² surface area, 2 m depth) to prevent destruction of the natural soil structure and pore system. Tracer experiments have been carried out to investigate the occurrence of artificial preferential flow and transport along the walls of the lysimeters. The results show that such effects can be neglected. Precisely weighing load cells are used to constantly determine the weight loss of the lysimeter due to evaporation and transpiration and to measure different forms of precipitation. The accuracy of the weighing apparatus is 0.05 kg, or 0.05 mm water equivalent
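    Because the lysimeter surface is 1 m², a 0.05 kg mass change corresponds to 0.05 mm of water, and evapotranspiration follows from a simple water balance of the weighing and drainage records. The hourly readings in the sketch below are hypothetical illustration values.

```python
# Simple lysimeter water balance: ET = precipitation - drainage - change in storage.
# With a 1 m^2 surface area, 1 kg of mass change corresponds to 1 mm of water.
# The hourly readings below are hypothetical illustration values.
mass_kg = [1000.00, 999.82, 999.65, 999.90, 999.74]   # lysimeter mass per hour [kg]
precip_mm = [0.0, 0.0, 0.0, 0.4, 0.0]                 # precipitation per hour [mm]
drain_mm = [0.0, 0.0, 0.0, 0.05, 0.02]                # drainage (seepage) per hour [mm]

surface_area_m2 = 1.0
for i in range(1, len(mass_kg)):
    delta_storage_mm = (mass_kg[i] - mass_kg[i - 1]) / surface_area_m2  # 1 kg == 1 mm here
    et_mm = precip_mm[i] - drain_mm[i] - delta_storage_mm
    print(f"hour {i}: ET = {et_mm:.2f} mm")
```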

  19. Precision cosmology and the landscape

    Bousso, Raphael

    2006-01-01

    After reviewing the cosmological constant problem--why is Lambda not huge?--I outline the two basic approaches that had emerged by the late 1980s, and note that each made a clear prediction. Precision cosmological experiments now indicate that the cosmological constant is nonzero. This result strongly favors the environmental approach, in which vacuum energy can vary discretely among widely separated regions in the universe. The need to explain this variation from first principles constitutes an observational constraint on fundamental theory. I review arguments that string theory satisfies this constraint, as it contains a dense discretuum of metastable vacua. The enormous landscape of vacua calls for novel, statistical methods of deriving predictions, and it prompts us to reexamine our description of spacetime on the largest scales. I discuss the effects of cosmological dynamics, and I speculate that weighting vacua by their entropy production may allow for prior-free predictions that do not resort to explicitly anthropic arguments

  20. GPS Precision Timing at CERN

    Beetham, C G

    1999-01-01

    For the past decade, the Global Positioning System (GPS) has been used to provide precise time, frequency and position co-ordinates world-wide. Recently, equipment has become available specialising in providing extremely accurate timing information, referenced to Universal Time Co-ordinates (UTC). This feature has been used at CERN to provide time of day information for systems that have been installed in the Proton Synchrotron (PS), Super Proton Synchrotron (SPS) and the Large Electron Positron (LEP) machines. The different systems are described as well as the planned developments, particularly with respect to optical transmission and the Inter-Range Instrumentation Group IRIG-B standard, for future use in the Large Hadron Collider (LHC).

  1. Spatial Contiguity and Incidental Learning in Multimedia Environments

    Paek, Seungoh; Hoffman, Daniel L.; Saravanos, Antonios

    2017-01-01

    Drawing on dual-process theories of cognitive function, the degree to which spatial contiguity influences incidental learning outcomes was examined. It was hypothesized that spatial contiguity would mediate what was learned even in the absence of an explicit learning goal. To test this hypothesis, 149 adults completed a multimedia-related task…

  2. Development of sensor guided precision sprayers

    Nieuwenhuizen, A.T.; Zande, van de J.C.

    2013-01-01

    Sensor guided precision sprayers were developed to automate the spray process with a focus on emission reduction and identical or increased efficacy, with the precision agriculture concept in mind. Within the project “Innovations2” sensor guided precision sprayers were introduced to leek,

  3. Generation of spectral–temporal response surfaces by combining multispectral satellite and hyperspectral UAV imagery for precision agriculture applications

    Gevaert, C.; Suomalainen, J.M.; Tang, J.; Kooistra, L.

    2015-01-01

    Precision agriculture requires detailed crop status information at high spatial and temporal resolutions. Remote sensing can provide such information, but single sensor observations are often incapable of meeting all data requirements. Spectral–temporal response surfaces (STRSs) provide continuous

  4. Precision production: enabling deterministic throughput for precision aspheres with MRF

    Maloney, Chris; Entezarian, Navid; Dumas, Paul

    2017-10-01

    Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer being used exclusively by high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.

  5. Modulation of the Object/Background Interaction by Spatial Frequency

    Yanju Ren

    2011-05-01

    With regard to the relationship between object and background perception in natural scene images, a functional isolation hypothesis and an interactive hypothesis have been proposed. Building on previous studies, the present study investigated the role of spatial frequency in the relationship between object and background perception in natural scene images. In three experiments, participants reported the object, the background, or both after seeing each picture for 500 ms followed by a mask. The authors found that (a) backgrounds were identified more accurately when they contained a consistent rather than an inconsistent object, independently of spatial frequency; (b) objects were identified more accurately in a consistent than an inconsistent background at low spatial frequencies but not at high spatial frequencies; and (c) the spatial frequency modulation remained when both objects and backgrounds were reported simultaneously. The authors conclude that the object/background interaction is partially dependent on spatial frequency.

  6. Technology in precision viticulture: a state of the art review

    Matese A

    2015-05-01

    Alessandro Matese,1 Salvatore Filippo Di Gennaro1,2 1Institute of Biometeorology, National Research Council (IBIMET-CNR), Florence, Italy; 2Department of Agricultural, Food and Environmental Sciences, University of Perugia, Perugia, Italy. Abstract: Precision viticulture aims to maximize the oenological potential of vineyards. This is especially true in regions where the high quality standards of wine production justify the adoption of site-specific management practices to simultaneously increase both quality and yield. The introduction of new technologies for supporting vineyard management allows the efficiency and quality of production to be improved and, at the same time, reduces the environmental impact. The rapid evolution of information and communication technologies and geographical science offers enormous potential for the development of optimized solutions for distributed information for precision viticulture. Recent technological developments have allowed useful tools to be developed that help in the monitoring and control of many aspects of vine growth. Precision viticulture thus seeks to exploit the widest range of available observations to describe the vineyard spatial variability with high resolution, and to provide recommendations to improve management efficiency in terms of quality, production, and sustainability. This review presents a brief outline of the state of the art of technologies in precision viticulture. It is divided into two sections: the first focuses on monitoring technologies such as geolocation and remote and proximal sensing; the second focuses on variable-rate technologies and new agricultural robots. Keywords: remote sensing, proximal sensing, variable-rate technology, robot

  7. Derivation and precision of mean field electrodynamics with mesoscale fluctuations

    Zhou, Hongzhe; Blackman, Eric G.

    2018-06-01

    Mean field electrodynamics (MFE) facilitates practical modelling of secular, large scale properties of astrophysical or laboratory systems with fluctuations. Practitioners commonly assume wide scale separation between mean and fluctuating quantities, to justify equality of ensemble and spatial or temporal averages. Often however, real systems do not exhibit such scale separation. This raises two questions: (I) What are the appropriate generalized equations of MFE in the presence of mesoscale fluctuations? (II) How precise are theoretical predictions from MFE? We address both by first deriving the equations of MFE for different types of averaging, along with mesoscale correction terms that depend on the ratio of averaging scale to variation scale of the mean. We then show that even if these terms are small, predictions of MFE can still have a significant precision error. This error has an intrinsic contribution from the dynamo input parameters and a filtering contribution from differences in the way observations and theory are projected through the measurement kernel. Minimizing the sum of these contributions can produce an optimal scale of averaging that makes the theory maximally precise. The precision error is important to quantify when comparing to observations because it quantifies the resolution of predictive power. We exemplify these principles for galactic dynamos, comment on broader implications, and identify possibilities for further work.

  8. Instrument-induced spatial crosstalk deconvolution algorithm

    Wright, Valerie G.; Evans, Nathan L., Jr.

    1986-01-01

    An algorithm has been developed which reduces the effects of (deconvolves) instrument-induced spatial crosstalk in satellite image data by several orders of magnitude where highly precise radiometry is required. The algorithm is based upon radiance transfer ratios, which are defined as the fractional bilateral exchange of energy between pixels A and B.
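
    The record does not give the algorithm's details; the toy sketch below only illustrates the general idea that, if K[i, j] holds the fraction of pixel j's radiance leaked into pixel i (a transfer-ratio matrix), the observed image can be modelled as observed = K @ true and deconvolved by a linear solve. All numbers and the kernel shape are invented.

    import numpy as np

    def crosstalk_matrix(n, ratio):
        """Toy kernel: each pixel exchanges a fixed radiance fraction with its neighbours."""
        K = np.eye(n)
        for i in range(n - 1):
            K[i, i + 1] = K[i + 1, i] = ratio
            K[i, i] -= ratio
            K[i + 1, i + 1] -= ratio
        return K

    true = np.array([10.0, 200.0, 10.0, 10.0])   # "true" pixel radiances
    K = crosstalk_matrix(4, 0.01)
    observed = K @ true                          # crosstalk-contaminated image
    recovered = np.linalg.solve(K, observed)     # deconvolution
    print(np.allclose(recovered, true))          # True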

  9. Precision measurements with atom interferometry

    Schubert, Christian; Abend, Sven; Schlippert, Dennis; Ertmer, Wolfgang; Rasel, Ernst M.

    2017-04-01

    Interferometry with matter waves enables precise measurements of rotations, accelerations, and differential accelerations [1-5]. This is exploited for determining fundamental constants [2], in fundamental science as e.g. testing the universality of free fall [3], and is applied for gravimetry [4], and gravity gradiometry [2,5]. At the Institut für Quantenoptik in Hannover, different approaches are pursued. A large scale device is designed and currently being set up to investigate the gain in precision for gravimetry, gradiometry, and fundamental tests on large baselines [6]. For field applications, a compact and transportable device is being developed. Its key feature is an atom chip source providing a collimated high flux of atoms which is expected to mitigate systematic uncertainties [7,8]. The atom chip technology and miniaturization benefits from microgravity experiments in the drop tower in Bremen and sounding rocket experiments [8,9] which act as pathfinders for space borne operation [10]. This contribution will report about our recent results. The presented work is supported by the CRC 1227 DQ-mat, the CRC 1128 geo-Q, the RTG 1729, the QUEST-LFS, and by the German Space Agency (DLR) with funds provided by the Federal Ministry of Economic Affairs and Energy (BMWi) due to an enactment of the German Bundestag under Grant No. DLR 50WM1552-1557. [1] P. Berg et al., Phys. Rev. Lett., 114, 063002, 2015; I. Dutta et al., Phys. Rev. Lett., 116, 183003, 2016. [2] J. B. Fixler et al., Science 315, 74 (2007); G. Rosi et al., Nature 510, 518, 2014. [3] D. Schlippert et al., Phys. Rev. Lett., 112, 203002, 2014. [4] A. Peters et al., Nature 400, 849, 1999; A. Louchet-Chauvet et al., New J. Phys. 13, 065026, 2011; C. Freier et al., J. of Phys.: Conf. Series 723, 012050, 2016. [5] J. M. McGuirk et al., Phys. Rev. A 65, 033608, 2002; P. Asenbaum et al., arXiv:1610.03832. [6] J. Hartwig et al., New J. Phys. 17, 035011, 2015. [7] H. Ahlers et al., Phys. Rev. Lett. 116, 173601

  10. Demonstration of a Fast, Precise Propane Measurement Using Infrared Spectroscopy

    Zahniser, M. S.; Roscioli, J. R.; Nelson, D. D.; Herndon, S. C.

    2016-12-01

    Propane is one of the primary components of emissions from natural gas extraction and processing activities. In addition to being an air pollutant, its ratio to other hydrocarbons such as methane and ethane can serve as a "fingerprint" of a particular facility or process, aiding in identifying emission sources. Quantifying propane has typically required laboratory analysis of flask samples, resulting in low temporal resolution and making plume-based measurements infeasible. Here we demonstrate fast (1-second), high-precision propane measurements using infrared spectroscopy at 2967 wavenumbers. In addition, we explore the impact of nearby water and ethane absorption lines on the accuracy and precision of the propane measurement. Finally, we discuss development of a dual-laser instrument capable of simultaneous measurements of methane, ethane, and propane (the C1-C3 compounds), all within a small spatial package that can be easily deployed aboard a mobile platform.

  11. A simulation of driven reconnection by a high precision MHD code

    Kusano, Kanya; Ouchi, Yasuo; Hayashi, Takaya; Horiuchi, Ritoku; Watanabe, Kunihiko; Sato, Tetsuya.

    1988-01-01

    A high precision MHD code, which has fourth-order accuracy in both the spatial and time steps, is developed and applied to simulation studies of two-dimensional driven reconnection. It is confirmed that the numerical dissipation of this new scheme is much less than that of the two-step Lax-Wendroff scheme. The effect of plasma compressibility on the reconnection dynamics is investigated by means of this high precision code. (author)
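
    The record names fourth-order accuracy in space and time but does not specify the discretization. As a generic stand-in, the classical fourth-order Runge-Kutta step below shows what "fourth-order in time" means for a simple test equation; it is illustrative only and is not the authors' scheme.

    import numpy as np

    def rk4_step(f, y, t, h):
        """One classical Runge-Kutta step for dy/dt = f(t, y) (global error O(h^4))."""
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Exponential-decay test problem, dy/dt = -y
    y, t, h = np.array([1.0]), 0.0, 0.1
    for _ in range(100):
        y = rk4_step(lambda t, y: -y, y, t, h)
        t += h
    print(y, np.exp(-t))  # agree closely, consistent with fourth-order accuracy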

  12. Precision is in their nature

    Antonella Del Rosso

    2014-01-01

    There are more than 100 of them in the LHC ring and they have a total of about 400 degrees of freedom. Each one has 4 motors and the newest ones have their own beam-monitoring pickups. Their jaws constrain the relativistic, high-energy particles to a very small transverse area and protect the machine aperture. We are speaking about the LHC collimators, those ultra-precise instruments that leave escaping unstable particles no chance.   The internal structure of a new LHC collimator featuring (see red arrow) one of the beam position monitor's pickups. Designed at CERN but mostly produced by very specialised manufacturers in Europe, the LHC collimators are among the most complex elements of the accelerator. Their job is to control and safely dispose of the halo particles that are produced by unavoidable beam losses from the circulating beam core. “The LHC collimation system has been designed to ensure that beam losses in superconducting magnets remain below quench limits in al...

  13. The Age of Precision Cosmology

    Chuss, David T.

    2012-01-01

    In the past two decades, our understanding of the evolution and fate of the universe has increased dramatically. This "Age of Precision Cosmology" has been ushered in by measurements that have both elucidated the details of the Big Bang cosmology and set the direction for future lines of inquiry. Our universe appears to consist of 5% baryonic matter; 23% of the universe's energy content is dark matter, which is responsible for the observed structure in the universe; and 72% of the energy density is so-called "dark energy" that is currently accelerating the expansion of the universe. In addition, our universe has been measured to be geometrically flat to 1%. These observations and related details of the Big Bang paradigm have hinted that the universe underwent an epoch of accelerated expansion known as "inflation" early in its history. In this talk, I will review the highlights of modern cosmology, focusing on the contributions made by measurements of the cosmic microwave background, the faint afterglow of the Big Bang. I will also describe new instruments designed to measure the polarization of the cosmic microwave background in order to search for evidence of cosmic inflation.

  14. High precision redundant robotic manipulator

    Young, K.K.D.

    1998-01-01

    A high precision redundant robotic manipulator for overcoming constraints imposed by obstacles or by a highly congested work space is disclosed. One embodiment of the manipulator has four degrees of freedom and another embodiment has seven degrees of freedom. Each embodiment utilizes a first selective compliance assembly robot arm (SCARA) configuration to provide high stiffness in the vertical plane and a second SCARA configuration to provide high stiffness in the horizontal plane. The seven-degree-of-freedom embodiment also utilizes kinematic redundancy to provide the capability of avoiding obstacles that lie between the base of the manipulator and the end effector or link of the manipulator. These additional three degrees of freedom are added at the wrist link of the manipulator to provide pitch, yaw and roll. The seven-degree-of-freedom embodiment uses one revolute joint per degree of freedom. For each of the revolute joints, a harmonic gear coupled to an electric motor is introduced, which together with properly designed servo controllers provides an end-point repeatability of less than 10 microns. 3 figs

  15. Studying antimatter with laser precision

    Katarina Anthony

    2012-01-01

    The next generation of antihydrogen trapping devices, ALPHA-2, is moving into CERN’s Antiproton Decelerator (AD) hall. This brand-new experiment will allow the ALPHA collaboration to conduct studies of antimatter with greater precision. ALPHA spokesperson Jeffrey Hangst was recently awarded a grant by the Carlsberg Foundation, which will be used to purchase equipment for the new experiment.   A 3-D view of the new magnet (in blue) and cryostat. The red lines show the paths of laser beams. LHC-type current leads for the superconducting magnets are visible on the top-right of the image. The ALPHA collaboration has been working to trap and study antihydrogen since 2006. Using antiprotons provided by CERN’s Antiproton Decelerator (AD), ALPHA was the first experiment to trap antihydrogen and to hold it long enough to study its properties. “The new ALPHA-2 experiment will use integrated lasers to probe the trapped antihydrogen,” explains Jeffrey Hangst, ALP...

  16. Neuroticism, intelligence, and intra-individual variability in elementary cognitive tasks: testing the mental noise hypothesis.

    Colom, Roberto; Quiroga, Ma Angeles

    2009-08-01

    Some studies show positive correlations between intraindividual variability in elementary speed measures (reflecting processing efficiency) and individual differences in neuroticism (reflecting instability in behaviour). The so-called neural noise hypothesis assumes that higher levels of noise are related both to smaller indices of processing efficiency and to greater levels of neuroticism. Here, we test this hypothesis by measuring mental speed with three elementary cognitive tasks that tap similar basic processes but systematically vary in content (verbal, numerical, and spatial). Neuroticism and intelligence are also measured. The sample comprised 196 undergraduate psychology students. The results show that (1) processing efficiency is generally unrelated to individual differences in neuroticism, (2) processing speed and efficiency correlate with intelligence, and (3) only the efficiency index is genuinely related to intelligence when the collinearity between speed and efficiency is controlled.

  17. LOFAR Lightning Imaging: Mapping Lightning With Nanosecond Precision

    Hare, B. M.; Scholten, O.; Bonardi, A.; Buitink, S.; Corstanje, A.; Ebert, U.; Falcke, H.; Hörandel, J. R.; Leijnse, H.; Mitra, P.; Mulrey, K.; Nelles, A.; Rachen, J. P.; Rossetto, L.; Rutjes, C.; Schellart, P.; Thoudam, S.; Trinh, T. N. G.; ter Veen, S.; Winchen, T.

    2018-03-01

    Lightning mapping technology has proven instrumental in understanding lightning. In this work we present a pipeline that can use lightning observed by the LOw-Frequency ARray (LOFAR) radio telescope to construct a 3-D map of the flash. We show that LOFAR has unparalleled precision, on the order of meters, even for lightning flashes that are over 20 km outside the area enclosed by LOFAR antennas (˜3,200 km2), and can potentially locate over 10,000 sources per lightning flash. We also show that LOFAR is the first lightning mapping system that is sensitive to the spatial structure of the electrical current during individual lightning leader steps.

  18. [Value of the space perception test for evaluation of the aptitude for precision work in geodesy].

    Remlein-Mozolewska, G

    1982-01-01

    The visual spatial localization ability of geodesy and cartography employees, and of pupils trained for this profession, was examined. The examination was based on work duration and the time of its performance. A correlation between localization ability and the precision of the hand movements required in everyday work was demonstrated: the better the movement precision, the more efficient the visual spatial localization. The length of employment was not significant. The test proved highly useful in geodesy for qualifying workers for posts requiring good manual dexterity.

  19. Men’s Perception of Raped Women: Test of the Sexually Transmitted Disease Hypothesis and the Cuckoldry Hypothesis

    Prokop Pavol

    2016-06-01

    Rape is a recurrent adaptive problem of female humans and females of a number of non-human animals. Rape has various physiological and reproductive costs to the victim. The costs of rape are further exaggerated by social rejection and blaming of the victim, particularly by men. The negative perception of raped women by men has received little attention from an evolutionary perspective. Across two independent studies, we investigated whether the risk of sexually transmitted diseases (the STD hypothesis, Hypothesis 1) or paternity uncertainty (the cuckoldry hypothesis, Hypothesis 2) influences the negative perception of raped women by men. Raped women received lower attractiveness scores than non-raped women, especially for long-term mate attractiveness. The perceived attractiveness of raped women was not influenced by the presence of experimentally manipulated STD cues on the faces of putative rapists. Women raped by three men received lower attractiveness scores than women raped by one man. These results provide stronger support for the cuckoldry hypothesis (Hypothesis 2) than for the STD hypothesis (Hypothesis 1). Single men perceived raped women as more attractive than men in a committed relationship did (Hypothesis 3), suggesting that mating opportunities mediate men's perception of victims of rape. Overall, our results suggest that the risk of cuckoldry, rather than the fear of disease transmission, underlies the negative perception of victims of rape by men.

  20. Speech cues contribute to audiovisual spatial integration.

    Christopher W Bishop

    Speech is the most important form of human communication but ambient sounds and competing talkers often degrade its acoustics. Fortunately the brain can use visual information, especially its highly precise spatial information, to improve speech comprehension in noisy environments. Previous studies have demonstrated that audiovisual integration depends strongly on spatiotemporal factors. However, some integrative phenomena such as McGurk interference persist even with gross spatial disparities, suggesting that spatial alignment is not necessary for robust integration of audiovisual place-of-articulation cues. It is therefore unclear how speech-cues interact with audiovisual spatial integration mechanisms. Here, we combine two well established psychophysical phenomena, the McGurk effect and the ventriloquist's illusion, to explore this dependency. Our results demonstrate that conflicting spatial cues may not interfere with audiovisual integration of speech, but conflicting speech-cues can impede integration in space. This suggests a direct but asymmetrical influence between ventral 'what' and dorsal 'where' pathways.

  1. Pulsed beams as field probes for precision measurement

    Hudson, J. J.; Ashworth, H. T.; Kara, D. M.; Tarbutt, M. R.; Sauer, B. E.; Hinds, E. A.

    2007-01-01

    We describe a technique for mapping the spatial variation of static electric, static magnetic, and rf magnetic fields using a pulsed atomic or molecular beam. The method is demonstrated using a beam designed to measure the electric dipole moment of the electron. We present maps of the interaction region, showing sensitivity to (i) electric field variation of 1.5 V/cm at 3.3 kV/cm with a spatial resolution of 15 mm; (ii) magnetic field variation of 5 nT with 25 mm resolution; (iii) radio-frequency magnetic field amplitude with 15 mm resolution. This diagnostic technique is very powerful in the context of high-precision atomic and molecular physics experiments, where pulsed beams have not hitherto found widespread application

  2. High precision anatomy for MEG.

    Troebinger, Luzia; López, José David; Lutti, Antoine; Bradbury, David; Bestmann, Sven; Barnes, Gareth

    2014-02-01

    Precise MEG estimates of neuronal current flow are undermined by uncertain knowledge of the head location with respect to the MEG sensors. This is either due to head movements within the scanning session or systematic errors in co-registration to anatomy. Here we show how such errors can be minimized using subject-specific head-casts produced using 3D printing technology. The casts fit the scalp of the subject internally and the inside of the MEG dewar externally, reducing within session and between session head movements. Systematic errors in matching to MRI coordinate system are also reduced through the use of MRI-visible fiducial markers placed on the same cast. Bootstrap estimates of absolute co-registration error were of the order of 1mm. Estimates of relative co-registration error were <1.5mm between sessions. We corroborated these scalp based estimates by looking at the MEG data recorded over a 6month period. We found that the between session sensor variability of the subject's evoked response was of the order of the within session noise, showing no appreciable noise due to between-session movement. Simulations suggest that the between-session sensor level amplitude SNR improved by a factor of 5 over conventional strategies. We show that at this level of coregistration accuracy there is strong evidence for anatomical models based on the individual rather than canonical anatomy; but that this advantage disappears for errors of greater than 5mm. This work paves the way for source reconstruction methods which can exploit very high SNR signals and accurate anatomical models; and also significantly increases the sensitivity of longitudinal studies with MEG. © 2013. Published by Elsevier Inc. All rights reserved.

  3. High precision anatomy for MEG☆

    Troebinger, Luzia; López, José David; Lutti, Antoine; Bradbury, David; Bestmann, Sven; Barnes, Gareth

    2014-01-01

    Precise MEG estimates of neuronal current flow are undermined by uncertain knowledge of the head location with respect to the MEG sensors. This is either due to head movements within the scanning session or systematic errors in co-registration to anatomy. Here we show how such errors can be minimized using subject-specific head-casts produced using 3D printing technology. The casts fit the scalp of the subject internally and the inside of the MEG dewar externally, reducing within session and between session head movements. Systematic errors in matching to MRI coordinate system are also reduced through the use of MRI-visible fiducial markers placed on the same cast. Bootstrap estimates of absolute co-registration error were of the order of 1 mm. Estimates of relative co-registration error were < 1.5 mm between sessions. We corroborated these scalp based estimates by looking at the MEG data recorded over a 6 month period. We found that the between session sensor variability of the subject's evoked response was of the order of the within session noise, showing no appreciable noise due to between-session movement. Simulations suggest that the between-session sensor level amplitude SNR improved by a factor of 5 over conventional strategies. We show that at this level of coregistration accuracy there is strong evidence for anatomical models based on the individual rather than canonical anatomy; but that this advantage disappears for errors of greater than 5 mm. This work paves the way for source reconstruction methods which can exploit very high SNR signals and accurate anatomical models; and also significantly increases the sensitivity of longitudinal studies with MEG. PMID:23911673

  4. High-Precision Half-Life Measurement for the Superallowed β+ Emitter 26mAl

    Finlay, P.; Ettenauer, S.; Ball, G. C.; Leslie, J. R.; Svensson, C. E.; Andreoiu, C.; Austin, R. A. E.; Bandyopadhyay, D.; Cross, D. S.; Demand, G.; Djongolov, M.; Garrett, P. E.; Green, K. L.; Grinyer, G. F.; Hackman, G.; Leach, K. G.; Pearson, C. J.; Phillips, A. A.; Sumithrarachchi, C. S.; Triambak, S.; Williams, S. J.

    2011-01-01

    A high-precision half-life measurement for the superallowed β+ emitter 26mAl was performed at the TRIUMF-ISAC radioactive ion beam facility, yielding T1/2 = 6346.54 ± 0.46(stat) ± 0.60(syst) ms, consistent with, but 2.5 times more precise than, the previous world average. The 26mAl half-life and ft value, 3037.53(61) s, are now the most precisely determined for any superallowed β decay. Combined with recent theoretical corrections for isospin-symmetry-breaking and radiative effects, the corrected Ft value for 26mAl, 3073.0(12) s, sets a new benchmark for the high-precision superallowed Fermi β-decay studies used to test the conserved vector current hypothesis and determine the Vud element of the Cabibbo-Kobayashi-Maskawa quark mixing matrix.
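
    For readers combining the quoted statistical and systematic uncertainties, adding them in quadrature (the usual convention for independent error sources; an assumption, not stated in the record) gives the total uncertainty:

    import math

    t_half_ms = 6346.54
    stat_ms, syst_ms = 0.46, 0.60
    total_ms = math.sqrt(stat_ms**2 + syst_ms**2)   # quadrature sum of independent errors
    print(f"T1/2 = {t_half_ms} ± {total_ms:.2f} ms")  # about ± 0.76 ms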

  5. Negative affect improves the quality of memories: trading capacity for precision in sensory and working memory.

    Spachtholz, Philipp; Kuhbandner, Christof; Pekrun, Reinhard

    2014-08-01

    Research has shown that negative affect reduces working memory capacity. Commonly, this effect has been attributed to an allocation of resources to task-irrelevant thoughts, suggesting that negative affect has detrimental consequences for working memory performance. However, rather than simply being a detrimental effect, the affect-induced capacity reduction may reflect a trading of capacity for precision of stored representations. To test this hypothesis, we induced neutral or negative affect and concurrently measured the number and precision of representations stored in sensory and working memory. Compared with neutral affect, negative affect reduced the capacity of both sensory and working memory. However, in both memory systems, this decrease in capacity was accompanied by an increase in precision. These findings demonstrate that observers unintentionally trade capacity for precision as a function of affective state and indicate that negative affect can be beneficial for the quality of memories. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  6. New Hypothesis for SOFC Ceramic Oxygen Electrode Mechanisms

    Mogensen, Mogens Bjerg; Chatzichristodoulou, Christodoulos; Graves, Christopher R.

    2016-01-01

    A new hypothesis for the electrochemical reaction mechanism in solid oxide cell ceramic oxygen electrodes is proposed based on literature including our own results. The hypothesis postulates that the observed thin layers of SrO-La2O3 on top of ceramic perovskite and other Ruddlesden-Popper...

  7. Assess the Critical Period Hypothesis in Second Language Acquisition

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  8. Dynamical agents' strategies and the fractal market hypothesis

    Vácha, Lukáš; Vošvrda, Miloslav

    2005-01-01

    Vol. 14, No. 2 (2005), pp. 172-179. ISSN 1210-0455. Grant - others: GA UK(CZ) 454/2004/A EK/FSV. Institutional research plan: CEZ:AV0Z10750506. Keywords: efficient market hypothesis; fractal market hypothesis; agent's investment horizons. Subject RIV: AH - Economics

  9. An Exercise for Illustrating the Logic of Hypothesis Testing

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  10. A default Bayesian hypothesis test for ANOVA designs

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  11. Adaptation hypothesis of biological efficiency of ionizing radiation

    Kudritskij, Yu.K.; Georgievskij, A.B.; Karpov, V.I.

    1992-01-01

    The adaptation hypothesis of the biological efficiency of ionizing radiation is based on acknowledgement that the fundamental laws and principles of biology, related to the unity of biota and environment, evolution and adaptation, are invariant for radiobiology. The basic arguments for the validity of the adaptation hypothesis and its correspondence to the requirements imposed on scientific hypotheses are presented.

  12. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses a method of testing a nonlinear hypothesis using an iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test the nonlinear hypothesis using the iterative NLLS estimator. An alternative method for testing a nonlinear hypothesis using an iterative NLLS estimator based on nonlinear studentized residuals is also proposed. In addition, an innovative method of testing a nonlinear hypothesis using an iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained methods of testing nonlinear hypotheses. This paper uses asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing a nonlinear hypothesis using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors and also studied the problem of heteroscedasticity with reference to nonlinear regression models with suitable illustration. William Grene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
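
    A minimal sketch of a Wald-type test of a nonlinear restriction after an iterative NLLS fit, using the delta method for the restriction's variance. The model, data, and restriction below are invented for illustration; the modified statistic proposed in the paper may differ in detail.

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import chi2

    def model(x, a, b, c):
        return a * np.exp(-b * x) + c

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 5.0, 80)
    y = model(x, 2.0, 1.3, 0.5) + rng.normal(0.0, 0.05, x.size)

    theta, cov = curve_fit(model, x, y, p0=[1.0, 1.0, 0.0])   # iterative NLLS estimate

    # Nonlinear restriction H0: g(theta) = a*b - 2.6 = 0 (the true value of a*b is 2.6)
    a, b, c = theta
    g = a * b - 2.6
    G = np.array([b, a, 0.0])                  # gradient of g with respect to (a, b, c)
    W = g**2 / (G @ cov @ G)                   # Wald statistic, chi-squared with 1 d.o.f. under H0
    print(f"W = {W:.3f}, p = {chi2.sf(W, df=1):.3f}")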

  13. The Younger Dryas impact hypothesis: A critical review

    van Hoesel, A.; Hoek, W.Z.; Pennock, G.M.; Drury, Martyn

    2014-01-01

    The Younger Dryas impact hypothesis suggests that multiple extraterrestrial airbursts or impacts resulted in the Younger Dryas cooling, extensive wildfires, megafaunal extinctions and changes in human population. After the hypothesis was first published in 2007, it gained much criticism, as the

  14. Stereological analysis of spatial structures

    Hansen, Linda Vadgård

    The thesis deals with stereological analysis of spatial structures. One area of focus has been to improve the precision of well-known stereological estimators by including information that is available via automatic image analysis. Furthermore, the thesis presents a stochastic model for star-shaped three-dimensional objects using the radial function. The model is highly flexible in the sense that it can be used to describe objects with arbitrarily irregular surfaces. Results on the distribution of well-known local stereological volume estimators are provided.

  15. Novel encoding and updating of positional, or directional, spatial cues are processed by distinct hippocampal subfields: Evidence for parallel information processing and the "what" stream.

    Hoang, Thu-Huong; Aliane, Verena; Manahan-Vaughan, Denise

    2018-05-01

    The specific roles of hippocampal subfields in spatial information processing and encoding are, as yet, unclear. The parallel map theory postulates that whereas the CA1 processes discrete environmental features (positional cues used to generate a "sketch map"), the dentate gyrus (DG) processes large navigation-relevant landmarks (directional cues used to generate a "bearing map"). Additionally, the two-streams hypothesis suggests that hippocampal subfields engage in differentiated processing of information from the "where" and the "what" streams. We investigated these hypotheses by analyzing the effect of exploration of discrete "positional" features and large "directional" spatial landmarks on hippocampal neuronal activity in rats. As an indicator of neuronal activity we measured the mRNA induction of the immediate early genes (IEGs), Arc and Homer1a. We observed an increase of this IEG mRNA in CA1 neurons of the distal neuronal compartment and in proximal CA3, after novel spatial exploration of discrete positional cues, whereas novel exploration of directional cues led to increases in IEG mRNA in the lower blade of the DG and in proximal CA3. Strikingly, the CA1 did not respond to directional cues and the DG did not respond to positional cues. Our data provide evidence for both the parallel map theory and the two-streams hypothesis and suggest a precise compartmentalization of the encoding and processing of "what" and "where" information occurs within the hippocampal subfields. © 2018 The Authors. Hippocampus Published by Wiley Periodicals, Inc.

  16. Precision requirements for space-based X(CO2) data

    Miller, C.E.; Crisp, D.; Miller, C.E.; Salawitch, J.; Sander, S.P.; Sen, B.; Toon, C.; DeCola, P.L.; Olsen, S.C.; Randerson, J.T.; Michalak, A.M.; Alkhaled, A.; Michalak, A.M.; Rayner, P.; Jacob, D.J.; Suntharalingam, P.; Wofsy, S.C.; Jacob, D.J.; Suntharalingam, P.; Wofsy, S.C.; Jones, D.B.A.; Denning, A.S.; Nicholls, M.E.; O'Brien, D.; Doney, S.C.; Pawson, S.; Pawson, S.; Connor, B.J.; Fung, I.Y.; Tans, P.; Wennberg, P.O.; Yung, Y.L.; Law, R.M.

    2007-01-01

    Precision requirements are determined for space-based column-averaged CO2 dry air mole fraction X(CO2) data. These requirements result from an assessment of spatial and temporal gradients in X(CO2), the relationship between X(CO2) precision and surface CO2 flux uncertainties inferred from inversions of the X(CO2) data, and the effects of X(CO2) biases on the fidelity of CO2 flux inversions. Observational system simulation experiments and synthesis inversion modeling demonstrate that the Orbiting Carbon Observatory mission design and sampling strategy provide the means to achieve these X(CO2) data precision requirements. (authors)

  17. [Progress in precision medicine: a scientific perspective].

    Wang, B; Li, L M

    2017-01-10

    Precision medicine is a new strategy for disease prevention and treatment that takes into account differences in genetics, environment and lifestyle among individuals and makes precise disease classification and diagnosis, so as to provide patients with personalized, targeted prevention and treatment. Large-scale population cohort studies are fundamental for precision medicine research and can produce the best evidence for precision medicine practice. Current criticisms of precision medicine mainly focus on the very small proportion of patients who benefit, the neglect of social determinants of health, and the possible waste of limited medical resources. In spite of this, precision medicine remains a highly promising research area and may become a model of health care practice in the future.

  18. Precision Medicine, Cardiovascular Disease and Hunting Elephants.

    Joyner, Michael J

    2016-01-01

    Precision medicine postulates improved prediction, prevention, diagnosis and treatment of disease based on patient specific factors especially DNA sequence (i.e., gene) variants. Ideas related to precision medicine stem from the much anticipated "genetic revolution in medicine" arising seamlessly from the human genome project (HGP). In this essay I deconstruct the concept of precision medicine and raise questions about the validity of the paradigm in general and its application to cardiovascular disease. Thus far precision medicine has underperformed based on the vision promulgated by enthusiasts. While niche successes for precision medicine are likely, the promises of broad based transformation should be viewed with skepticism. Open discussion and debate related to precision medicine are urgently needed to avoid misapplication of resources, hype, iatrogenic interventions, and distraction from established approaches with ongoing utility. Failure to engage in such debate will lead to negative unintended consequences from a revolution that might never come. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Poultry, pig and the risk of BSE following the feed ban in France--a spatial analysis.

    Abrial, David; Calavas, Didier; Jarrige, Nathalie; Ducrot, Christian

    2005-01-01

    A spatial analysis was carried out to explore why the risk of Bovine Spongiform Encephalopathy (BSE) was spatially heterogeneous in France during the period following the ban on feeding Meat and Bone Meal to cattle. The hypothesis of cross-contamination between cattle feedstuff and monogastric feedstuff, which was strongly suggested by previous investigations, was assessed under the assumption that the higher the pig or poultry density in a given area, the higher the risk of cross-contamination and cattle infection might be. The data concerned the 467 BSE cases born in France after the ban of meat and bone meal (July 1990) and detected between July 1st, 2001 and December 31, 2003, when the surveillance system was optimal and not spatially biased. The disease-mapping models were built with Bayesian graphical modelling methods, based on a Poisson distribution with spatial smoothing (hierarchical approach) and covariates. The parameters were estimated by a Markov chain Monte Carlo simulation method. The main result was that poultry density did not significantly influence the risk of BSE, whereas pig density was significantly associated with an increase in risk of 2.4% per 10,000 pigs. The areas with a significant pig effect were located in regions with a high pig density as well as a high ratio of pigs to cattle. Despite the absence of a global effect of poultry density on BSE risk, some areas had a significant poultry effect, and in some others the risk was better explained when both pig and poultry densities were considered. These findings are in agreement with the hypothesis of cross-contamination, which could take place at the feedstuff factory, during the shipment of feed or on the farm. Further studies are needed to explore more precisely how the cross-contamination happened.
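
    The spatially smoothed hierarchical model itself is beyond a short sketch, but the covariate part can be illustrated as a Poisson regression with an offset for the expected number of cases per area. Everything below is simulated placeholder data, with a pig-density effect of roughly 2.4% per 10,000 pigs built in; it is not the study's dataset or full model.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n_areas = 200
    pig_density = rng.gamma(2.0, 20000, n_areas)      # pigs per area (toy numbers)
    poultry_density = rng.gamma(2.0, 50000, n_areas)  # poultry per area (toy numbers)
    expected = rng.uniform(0.5, 3.0, n_areas)         # expected BSE cases per area
    mu = expected * np.exp(0.024 * pig_density / 1e4) # simulate a pig effect, no poultry effect
    cases = rng.poisson(mu)

    X = sm.add_constant(np.column_stack([pig_density / 1e4, poultry_density / 1e4]))
    fit = sm.GLM(cases, X, family=sm.families.Poisson(), offset=np.log(expected)).fit()
    print(fit.params)  # slope on pigs/10,000 near 0.024, i.e. exp(0.024) ~ 1.024 relative risk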

  20. New methods for precision Moeller polarimetry*

    Gaskell, D.; Meekins, D.G.; Yan, C.

    2007-01-01

    Precision electron beam polarimetry is becoming increasingly important as parity violation experiments attempt to probe the frontiers of the standard model. In the few-GeV regime, Moeller polarimetry is well suited to high-precision measurements; however, it is generally limited to use at relatively low beam currents (<10 μA). We present a novel technique that will enable precision Moeller polarimetry at very large currents, up to 100 μA. (orig.)

  1. The Development of Precise Engineering Surveying Technology

    LI Guangyun

    2017-10-01

    With the construction of big science projects in China, precise engineering surveying technology has developed rapidly in the 21st century. Firstly, the paper summarizes the current state of development of precise engineering surveying instruments and theory. Then three typical cases of precise engineering surveying practice are discussed: accelerator alignment, industrial measurement, and high-speed railway surveying technology.

  2. Modeling and control of precision actuators

    Kiong, Tan Kok

    2013-01-01

    Contents: Introduction; Growing Interest in Precise Actuators; Types of Precise Actuators; Applications of Precise Actuators; Nonlinear Dynamics and Modeling; Hysteresis; Creep; Friction; Force Ripples; Identification and Compensation of Preisach Hysteresis in Piezoelectric Actuators; SVD-Based Identification and Compensation of Preisach Hysteresis; High-Bandwidth Identification and Compensation of Hysteretic Dynamics in Piezoelectric Actuators; Concluding Remarks; Identification and Compensation of Frict...

  3. PRECISION COSMOGRAPHY WITH STACKED VOIDS

    Lavaux, Guilhem; Wandelt, Benjamin D.

    2012-01-01

    We present a purely geometrical method for probing the expansion history of the universe from the observation of the shape of stacked voids in spectroscopic redshift surveys. Our method is an Alcock-Paczyński (AP) test based on the average sphericity of voids posited on the local isotropy of the universe. It works by comparing the temporal extent of cosmic voids along the line of sight with their angular, spatial extent. We describe the algorithm that we use to detect and stack voids in redshift shells on the light cone and test it on mock light cones produced from N-body simulations. We establish a robust statistical model for estimating the average stretching of voids in redshift space and quantify the contamination by peculiar velocities. Finally, assuming that the void statistics that we derive from N-body simulations is preserved when considering galaxy surveys, we assess the capability of this approach to constrain dark energy parameters. We report this assessment in terms of the figure of merit (FoM) of the dark energy task force and in particular of the proposed Euclid mission which is particularly suited for this technique since it is a spectroscopic survey. The FoM due to stacked voids from the Euclid wide survey may double that of all other dark energy probes derived from Euclid data alone (combined with Planck priors). In particular, voids seem to outperform baryon acoustic oscillations by an order of magnitude. This result is consistent with simple estimates based on mode counting. The AP test based on stacked voids may be a significant addition to the portfolio of major dark energy probes and its potentialities must be studied in detail.
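
    The geometry behind the test can be summarized in one number: for an intrinsically spherical void at redshift z, the ratio of its redshift extent to its angular extent is F(z) = D_M(z) H(z) / c, which depends on the assumed expansion history. The sketch below evaluates this Alcock-Paczynski factor for two cosmologies; the parameter values are illustrative and are not taken from the paper.

    from astropy.cosmology import FlatLambdaCDM
    from astropy.constants import c
    import astropy.units as u

    def ap_factor(cosmo, z):
        """Delta_z / Delta_theta for a spherical structure at redshift z (dimensionless)."""
        return (cosmo.comoving_transverse_distance(z) * cosmo.H(z) / c).to(u.dimensionless_unscaled)

    fiducial = FlatLambdaCDM(H0=70, Om0=0.3)
    alternative = FlatLambdaCDM(H0=70, Om0=0.4)
    for z in (0.3, 0.6, 1.0):
        print(z, ap_factor(fiducial, z), ap_factor(alternative, z))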

  4. Handbook of Spatial Statistics

    Gelfand, Alan E

    2010-01-01

    Offers an introduction detailing the evolution of the field of spatial statistics. This title focuses on the three main branches of spatial statistics: continuous spatial variation (point referenced data); discrete spatial variation, including lattice and areal unit data; and, spatial point patterns.

  5. Spatial and environmental connectivity analysis in a cholera vaccine trial.

    Emch, Michael; Ali, Mohammad; Root, Elisabeth D; Yunus, Mohammad

    2009-02-01

    This paper develops theory and methods for vaccine trials that utilize spatial and environmental information. Satellite imagery is used to identify whether households are connected to one another via water bodies in a study area in rural Bangladesh. Then relationships between neighborhood-level cholera vaccine coverage and placebo incidence and neighborhood-level spatial variables are measured. The study hypothesis is that unvaccinated people who are environmentally connected to people who have been vaccinated will be at lower risk compared to unvaccinated people who are environmentally connected to people who have not been vaccinated. We use four datasets including: a cholera vaccine trial database, a longitudinal demographic database of the rural population from which the vaccine trial participants were selected, a household-level geographic information system (GIS) database of the same study area, and high resolution Quickbird satellite imagery. An environmental connectivity metric was constructed by integrating the satellite imagery with the vaccine and demographic databases linked with GIS. The results show that there is a relationship between neighborhood rates of cholera vaccination and placebo incidence. Thus, people are indirectly protected when more people in their environmentally connected neighborhood are vaccinated. This result is similar to our previous work that used a simpler Euclidean distance neighborhood to measure neighborhood vaccine coverage [Ali, M., Emch, M., von Seidlein, L., Yunus, M., Sack, D. A., Holmgren, J., et al. (2005). Herd immunity conferred by killed oral cholera vaccines in Bangladesh. Lancet, 366(9479), 44-49]. Our new method of measuring environmental connectivity is more precise since it takes into account the transmission mode of cholera and therefore this study validates our assertion that the oral cholera vaccine provides indirect protection in addition to direct protection.
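
    The connectivity metric itself was derived from satellite imagery, which cannot be reproduced here; the toy sketch below only shows how, once a household connectivity matrix is available, a neighbourhood-level vaccine coverage can be computed. All data are invented.

    import numpy as np

    # connected[i, j] = 1 if households i and j share an environmental (water) connection
    connected = np.array([[0, 1, 1, 0],
                          [1, 0, 0, 1],
                          [1, 0, 0, 0],
                          [0, 1, 0, 0]])
    vaccinated = np.array([1, 0, 1, 0])   # 1 = household received vaccine

    # vaccine coverage among each household's environmentally connected neighbours
    neighbourhood_coverage = connected @ vaccinated / connected.sum(axis=1)
    print(neighbourhood_coverage)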

  6. Environmental policy without costs? A review of the Porter hypothesis

    Braennlund, Runar; Lundgren, Tommy. e-mail: runar.brannlund@econ.umu.se

    2009-03-15

    This paper reviews the theoretical and empirical literature connected to the so-called Porter hypothesis, that is, the literature on the relation between environmental policy and competitiveness. According to the conventional wisdom, environmental policy aiming to improve the environment through, for example, emission reductions does imply costs, since scarce resources must be diverted from somewhere else. However, this conventional wisdom has recently been challenged and questioned through what has been denoted the 'Porter hypothesis'. Those in the forefront of the Porter hypothesis challenge the conventional wisdom basically on the grounds that resources are used inefficiently in the absence of the right kind of environmental regulations, and that the conventional neo-classical view is too static to take inefficiencies into account. The conclusions that can be drawn from this review are (1) that the theoretical literature can identify the circumstances and mechanisms that must exist for a Porter effect to occur, (2) that these circumstances are rather non-general, hence rejecting the Porter hypothesis in general, and (3) that the empirical literature gives no general support for the Porter hypothesis. Furthermore, a closer look at the 'Swedish case' reveals no support for the Porter hypothesis, in spite of the fact that Swedish environmental policy over the last 15-20 years seems to be in line with the prerequisites stated by the Porter hypothesis concerning environmental policy.

  7. The linear hypothesis - an idea whose time has passed

    Tschaeche, A.N.

    1995-01-01

    The linear no-threshold hypothesis is the basis for radiation protection standards in the United States. In the words of the National Council on Radiation Protection and Measurements (NCRP), the hypothesis is: "In the interest of estimating effects in humans conservatively, it is not unreasonable to follow the assumption of a linear relationship between dose and effect in the low dose regions for which direct observational data are not available." The International Commission on Radiological Protection (ICRP) stated the hypothesis in a slightly different manner: "One such basic assumption ... is that ... there is ... a linear relationship without threshold between dose and the probability of an effect." The hypothesis was necessary 50 yr ago when it was first enunciated because the dose-effect curve for ionizing radiation for effects in humans was not known. The ICRP and NCRP needed a model to extrapolate high-dose effects to low-dose effects. So the linear no-threshold hypothesis was born. Certain details of the history of the development and use of the linear hypothesis are presented. In particular, use of the hypothesis by the U.S. regulatory agencies is examined. Over time, the sense of the hypothesis has been corrupted. The corruption of the hypothesis into the current paradigm of "a little radiation, no matter how small, can and will harm you" is presented. The reasons the corruption occurred are proposed. The effects of the corruption are enumerated, specifically, the use of the corruption by the antinuclear forces in the United States and some of the huge costs to U.S. taxpayers due to the corruption. An alternative basis for radiation protection standards to assure public safety, based on the weight of scientific evidence on radiation health effects, is proposed.

  8. Brain morphology of the threespine stickleback (Gasterosteus aculeatus) varies inconsistently with respect to habitat complexity: A test of the Clever Foraging Hypothesis.

    Ahmed, Newaz I; Thompson, Cole; Bolnick, Daniel I; Stuart, Yoel E

    2017-05-01

    The Clever Foraging Hypothesis asserts that organisms living in a more spatially complex environment will have a greater neurological capacity for cognitive processes related to spatial memory, navigation, and foraging. Because the telencephalon is often associated with spatial memory and navigation tasks, this hypothesis predicts a positive association between telencephalon size and environmental complexity. The association between habitat complexity and brain size has been supported by comparative studies across multiple species but has not been widely studied at the within-species level. We tested for covariation between environmental complexity and neuroanatomy of threespine stickleback ( Gasterosteus aculeatus ) collected from 15 pairs of lakes and their parapatric streams on Vancouver Island. In most pairs, neuroanatomy differed between the adjoining lake and stream populations. However, the magnitude and direction of this difference were inconsistent between watersheds and did not covary strongly with measures of within-site environmental heterogeneity. Overall, we find weak support for the Clever Foraging Hypothesis in our study.

  9. Precision bounds for gradient magnetometry with atomic ensembles

    Apellaniz, Iagoba; Urizar-Lanz, Iñigo; Zimborás, Zoltán; Hyllus, Philipp; Tóth, Géza

    2018-05-01

    We study gradient magnetometry with an ensemble of atoms with arbitrary spin. We calculate precision bounds for estimating the gradient of the magnetic field based on the quantum Fisher information. For quantum states that are invariant under homogeneous magnetic fields, we need to measure a single observable to estimate the gradient. On the other hand, for states that are sensitive to homogeneous fields, a simultaneous measurement is needed, as the homogeneous field must also be estimated. We prove that for the cases studied in this paper, such a measurement is feasible. We present a method to calculate precision bounds for gradient estimation with a chain of atoms or with two spatially separated atomic ensembles. We also consider a single atomic ensemble with an arbitrary density profile, where the atoms cannot be addressed individually, and which is a very relevant case for experiments. Our model can take into account even correlations between particle positions. While in most of the discussion we consider an ensemble of localized particles that are classical with respect to their spatial degree of freedom, we also discuss the case of gradient metrology with a single Bose-Einstein condensate.
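
    The precision bounds referred to are of the quantum Cramér-Rao type. As a reminder of the generic form that underlies such calculations (shown below with ν denoting the number of repetitions; this is the standard textbook bound, not the paper's specific result for gradient estimation):

    \[
      (\Delta\theta)^2 \;\ge\; \frac{1}{\nu\, F_Q[\varrho, A]},
    \]

    where F_Q is the quantum Fisher information of the probe state ϱ with respect to the generator A of the parameter θ (here, the parameter would be the field gradient).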

  10. Precision Attitude Control for the BETTII Balloon-Borne Interferometer

    Benford, Dominic J.; Fixsen, Dale J.; Rinehart, Stephen

    2012-01-01

    The Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII) is an 8-meter baseline far-infrared interferometer to fly on a high altitude balloon. Operating at wavelengths of 30-90 microns, BETTII will obtain spatial and spectral information on science targets at angular resolutions down to less than half an arcsecond, a capability unmatched by other far-infrared facilities. This requires attitude control at a level of less than a tenth of an arcsecond, a great challenge for a lightweight balloon-borne system. We have designed a precision attitude determination system to provide gondola attitude knowledge at a level of 2 milliarcseconds at rates up to 100 Hz, with accurate absolute attitude determination at the half-arcsecond level at rates of up to 10 Hz. A multi-stage control system involving rigid body motion and tip-tilt-piston correction provides precision pointing stability to the level required for the far-infrared instrument to perform its spatial/spectral interferometry in open-loop control. We present key aspects of the design of the attitude determination and control system and its development status.

  11. Biostatistics series module 2: Overview of hypothesis testing

    Avijit Hazra

    2016-01-01

    Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote oneself to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test
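
    As a concrete illustration of this logic, a two-sample t-test on fabricated numbers: the null hypothesis is that the two group means are equal, and the P value is the probability of a difference at least this extreme arising by chance if that null were true. The data and threshold below are placeholders, not taken from the module.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    control = rng.normal(loc=120, scale=10, size=30)   # e.g. blood pressure, group A
    treated = rng.normal(loc=113, scale=10, size=30)   # group B

    # H0: the two group means are equal; H1: they differ.
    t_stat, p_value = stats.ttest_ind(control, treated)
    print(f"t = {t_stat:.2f}, P = {p_value:.4f}")
    if p_value < 0.05:
        print("Reject H0 at the conventional 5% level")
    else:
        print("Fail to reject H0")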

  12. Spatial Management Areas

    National Oceanic and Atmospheric Administration, Department of Commerce — Spatial management files combine all related and relevant spatial management files into an integrated fisheries management file. Overlaps of the redundant spatial...

  13. [Spatial mobility on reaching adult age].

    De Coninck, F

    1990-12-01

    "Starting with longitudinal data on two cohorts of women living in the Alpes-Maritimes [France] in 1982 (a sample of 1,500 women in total) we try to establish the role of the spatial distribution of opportunities at a number of key stages in the life cycle: marriage, birth of first child, making professional use of qualifications, confrontation of a situation of professional risk and professional mobility during the years immediately following the completion of studies. The underlying hypothesis is that control of social location often depends on the control of spatial location." (SUMMARY IN ENG) excerpt

  14. High precision ray tracing in cylindrically symmetric electrostatics

    Edwards Jr, David, E-mail: dej122842@gmail.com

    2015-11-15

    Highlights: • High precision ray tracing is formulated using power series techniques. • Ray tracing is possible for fields generated by solution to Laplace's equation. • Spatial and temporal orders of 4–10 are included. • Precisions in test geometries of the hemispherical deflector analyzer of ∼10⁻²⁰ have been obtained. • This solution offers a considerable extension to the ray tracing accuracy over the current state of the art. - Abstract: With the recent availability of a high order FDM solution to the curved boundary value problem, it is now possible to determine potentials in such geometries with considerably greater accuracy than had been available with the FDM method. In order for the algorithms used in the accurate potential calculations to be useful in ray tracing, an integration of those algorithms needs to be placed into the ray trace process itself. The object of this paper is to incorporate these algorithms into a solution of the equations of motion of the ray and, having done this, to demonstrate its efficacy. The algorithm incorporation has been accomplished by using power series techniques and the solution constructed has been tested by tracing the medial ray through concentric sphere geometries. The testing has indicated that precisions of ray calculations of 10⁻²⁰ are now possible. This solution offers a considerable extension to the ray tracing accuracy over the current state of the art.

  15. Precision measurements at a muon collider

    Dawson, S.

    1995-01-01

    We discuss the potential for making precision measurements of M_W and M_t at a muon collider and the motivations for each measurement. A comparison is made with the precision measurements expected at other facilities. The measurement of the top quark decay width is also discussed.

  16. Visual thread quality for precision miniature mechanisms

    Gillespie, L.K.

    1981-04-01

    Threaded features have eight visual appearance factors which can affect their function in precision miniature mechanisms. The Bendix practice in deburring, finishing, and accepting these conditions on miniature threads is described, as is their impact on precision miniature electromechanical assemblies.

  17. Analysis of Precision of Activation Analysis Method

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T...

  18. An aberrant precision account of autism.

    Rebecca P Lawson

    2014-05-01

    Autism is a neurodevelopmental disorder characterised by problems with social-communication, restricted interests and repetitive behaviour. A recent and controversial article presented a compelling normative explanation for the perceptual symptoms of autism in terms of a failure of Bayesian inference (Pellicano and Burr, 2012). In response, we suggested that when Bayesian inference is grounded in its neural instantiation – namely, predictive coding – many features of autistic perception can be attributed to aberrant precision (or beliefs about precision) within the context of hierarchical message passing in the brain (Friston et al., 2013). Here, we unpack the aberrant precision account of autism. Specifically, we consider how empirical findings – that speak directly or indirectly to neurobiological mechanisms – are consistent with the aberrant encoding of precision in autism; in particular, an imbalance of the precision ascribed to sensory evidence relative to prior beliefs.

  19. Precision medicine for psychopharmacology: a general introduction.

    Shin, Cheolmin; Han, Changsu; Pae, Chi-Un; Patkar, Ashwin A

    2016-07-01

    Precision medicine is an emerging medical model that can provide accurate diagnoses and tailored therapeutic strategies for patients based on data pertaining to genes, microbiomes, environment, family history and lifestyle. Here, we provide basic information about precision medicine and newly introduced concepts, such as the precision medicine ecosystem and big data processing, and omics technologies including pharmacogenomics, pharmacometabolomics, pharmacoproteomics, pharmacoepigenomics, connectomics and exposomics. The authors review the current state of omics in psychiatry and the future direction of psychopharmacology as it moves towards precision medicine. Expert commentary: Advances in precision medicine have been facilitated by achievements in multiple fields, including large-scale biological databases, powerful methods for characterizing patients (such as genomics, proteomics, metabolomics, diverse cellular assays, and even social networks and mobile health technologies), and computer-based tools for analyzing large amounts of data.

  20. Precision surveying the principles and geomatics practice

    Ogundare, John Olusegun

    2016-01-01

    A comprehensive overview of high precision surveying, including recent developments in geomatics and their applications This book covers advanced precision surveying techniques, their proper use in engineering and geoscience projects, and their importance in the detailed analysis and evaluation of surveying projects. The early chapters review the fundamentals of precision surveying: the types of surveys; survey observations; standards and specifications; and accuracy assessments for angle, distance and position difference measurement systems. The book also covers network design and 3-D coordinating systems before discussing specialized topics such as structural and ground deformation monitoring techniques and analysis, mining surveys, tunneling surveys, and alignment surveys. Precision Surveying: The Principles and Geomatics Practice: * Covers structural and ground deformation monitoring analysis, advanced techniques in mining and tunneling surveys, and high precision alignment of engineering structures *...

  1. Null but not void: considerations for hypothesis testing.

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  2. Alzheimer's disease: the amyloid hypothesis and the Inverse Warburg effect

    Demetrius, Lloyd A.; Magistretti, Pierre J.; Pellerin, Luc

    2015-01-01

    Epidemiological and biochemical studies show that the sporadic forms of Alzheimer's disease (AD) are characterized by the following hallmarks: (a) An exponential increase with age; (b) Selective neuronal vulnerability; (c) Inverse cancer comorbidity. The present article appeals to these hallmarks to evaluate and contrast two competing models of AD: the amyloid hypothesis (a neuron-centric mechanism) and the Inverse Warburg hypothesis (a neuron-astrocytic mechanism). We show that these three hallmarks of AD conflict with the amyloid hypothesis, but are consistent with the Inverse Warburg hypothesis, a bioenergetic model which postulates that AD is the result of a cascade of three events—mitochondrial dysregulation, metabolic reprogramming (the Inverse Warburg effect), and natural selection. We also provide an explanation for the failures of the clinical trials based on amyloid immunization, and we propose a new class of therapeutic strategies consistent with the neuroenergetic selection model.

  3. Cross-system log file analysis for hypothesis testing

    Glahn, Christian

    2008-01-01

    Glahn, C. (2008). Cross-system log file analysis for hypothesis testing. Presented at Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological issues. 4th TENCompetence Open Workshop. April, 10, 2008, Madrid, Spain.

  4. Hypothesis Testing Using the Films of the Three Stooges

    Gardner, Robert; Davidson, Robert

    2010-01-01

    The use of The Three Stooges' films as a source of data in an introductory statistics class is described. The Stooges' films are separated into three populations. Using these populations, students may conduct hypothesis tests with data they collect.

  5. Incidence of allergy and atopic disorders and hygiene hypothesis.

    Bencko, V.; Šíma, Petr

    2017-01-01

    Roč. 2, 6 March (2017), č. článku 1244. ISSN 2474-1663 Institutional support: RVO:61388971 Keywords : allergy disorders * atopic disorders * hygiene hypothesis Subject RIV: EE - Microbiology, Virology OBOR OECD: Microbiology

  6. Alzheimer's disease: the amyloid hypothesis and the Inverse Warburg effect

    Demetrius, Lloyd A.

    2015-01-14

    Epidemiological and biochemical studies show that the sporadic forms of Alzheimer's disease (AD) are characterized by the following hallmarks: (a) An exponential increase with age; (b) Selective neuronal vulnerability; (c) Inverse cancer comorbidity. The present article appeals to these hallmarks to evaluate and contrast two competing models of AD: the amyloid hypothesis (a neuron-centric mechanism) and the Inverse Warburg hypothesis (a neuron-astrocytic mechanism). We show that these three hallmarks of AD conflict with the amyloid hypothesis, but are consistent with the Inverse Warburg hypothesis, a bioenergetic model which postulates that AD is the result of a cascade of three events—mitochondrial dysregulation, metabolic reprogramming (the Inverse Warburg effect), and natural selection. We also provide an explanation for the failures of the clinical trials based on amyloid immunization, and we propose a new class of therapeutic strategies consistent with the neuroenergetic selection model.

  7. The Double-Deficit Hypothesis in Spanish Developmental Dyslexia

    Jimenez, Juan E.; Hernandez-Valle, Isabel; Rodriguez, Cristina; Guzman, Remedios; Diaz, Alicia; Ortiz, Rosario

    2008-01-01

    The double-deficit hypothesis (DDH) of developmental dyslexia was investigated in seven to twelve year old Spanish children. It was observed that the double deficit (DD) group had the greatest difficulty with reading.

  8. A digital x-ray imaging MWPC detector system for precision absorptiometry

    Bateman, J.E.; Connolly, J.F.; Glasgow, W.

    1977-11-01

    An X-ray absorptiometric imaging system (based on a xenon-filled multiwire proportional counter) has been developed with high counting rate capability, good spatial resolution and linear mass response, aimed at permitting bone mass measurements to be made in the peripheral skeleton with precision approaching 1%. The system is described and preliminary results on test phantoms are presented. (author)

  9. HD 101065, the Most Peculiar Star: First Results from Precise Radial ...

    Abstract. In this paper we discuss the prospects for asteroseismology with spatial resolution and motivate studies of the most chemically peculiar roAp star, HD 101065. We present the first results from a high-precision radial velocity (RV) study of HD 101065 based on data spanning four nights that were acquired using the ...

  10. A robust null hypothesis for the potential causes of megadrought in western North America

    Ault, T.; St George, S.; Smerdon, J. E.; Coats, S.; Mankin, J. S.; Cruz, C. C.; Cook, B.; Stevenson, S.

    2017-12-01

    The western United States was affected by several megadroughts during the last 1200 years, most prominently during the Medieval Climate Anomaly (MCA: 800 to 1300 CE). A null hypothesis is developed to test the possibility that, given a sufficiently long period of time, these events are inevitable and occur purely as a consequence of internal climate variability. The null distribution of this hypothesis is populated by a linear inverse model (LIM) constructed from global sea-surface temperature anomalies and self-calibrated Palmer Drought Severity Index data for North America. Despite being trained only on seasonal data from the late 20th century, the LIM produces megadroughts that are comparable in duration, spatial scale, and magnitude to the most severe events of the last 12 centuries. The null hypothesis therefore cannot be rejected with much confidence when considering these features of megadrought, meaning that similar events are possible today, even without any changes to boundary conditions. In contrast, the observed clustering of megadroughts in the MCA, as well as the change in mean hydroclimate between the MCA and the 1500-2000 period, are more likely to have been caused by either external forcing or by internal climate variability not well sampled during the latter half of the 20th century. Finally, the results demonstrate the LIM is a viable tool for determining whether events in paleoclimate reconstructions should be ascribed to external forcings, "out of sample" climate mechanisms, or if they are consistent with the variability observed during the recent period.
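
    To make the logic of such a null distribution concrete, here is a heavily simplified, hypothetical sketch (a one-variable AR(1) stand-in for the authors' multivariate LIM, with an arbitrary drought threshold): fit a lag-1 propagator to a short record, integrate a long synthetic series, and count how often long dry spells arise from internal variability alone.

```python
# Conceptual sketch (hypothetical data and thresholds, not the authors' code):
# populate a null distribution of drought durations from the simplest possible
# linear inverse model, x(t+1) = G x(t) + noise, fitted to a short record.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical short "training" record of an annual drought-index anomaly,
# generated here with some year-to-year persistence.
train = np.zeros(150)
for t in range(1, 150):
    train[t] = 0.7 * train[t - 1] + rng.normal()

# Fit the lag-1 propagator (scalar stand-in for the multivariate LIM operator).
x0, x1 = train[:-1], train[1:]
G = np.dot(x0, x1) / np.dot(x0, x0)
noise_std = np.std(x1 - G * x0)

# Integrate a long synthetic record driven only by internal variability.
n_years = 100_000
x = np.zeros(n_years)
for t in range(1, n_years):
    x[t] = G * x[t - 1] + noise_std * rng.normal()

# Count runs of consecutive "dry" years below an arbitrary threshold.
dry = x < -0.5
runs, run = [], 0
for is_dry in dry:
    if is_dry:
        run += 1
    elif run:
        runs.append(run)
        run = 0
if run:
    runs.append(run)

print(f"longest dry spell: {max(runs)} years; "
      f"15+ year events: {sum(r >= 15 for r in runs)}")
```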

  11. Validity of the Taylor hypothesis for linear kinetic waves in the weakly collisional solar wind

    Howes, G. G.; Klein, K. G.; TenBarge, J. M.

    2014-01-01

    The interpretation of single-point spacecraft measurements of solar wind turbulence is complicated by the fact that the measurements are made in a frame of reference in relative motion with respect to the turbulent plasma. The Taylor hypothesis—that temporal fluctuations measured by a stationary probe in a rapidly flowing fluid are dominated by the advection of spatial structures in the fluid rest frame—is often assumed to simplify the analysis. But measurements of turbulence in upcoming missions, such as Solar Probe Plus, threaten to violate the Taylor hypothesis, either due to slow flow of the plasma with respect to the spacecraft or to the dispersive nature of the plasma fluctuations at small scales. Assuming that the frequency of the turbulent fluctuations is characterized by the frequency of the linear waves supported by the plasma, we evaluate the validity of the Taylor hypothesis for the linear kinetic wave modes in the weakly collisional solar wind. The analysis predicts that a dissipation range of solar wind turbulence supported by whistler waves is likely to violate the Taylor hypothesis, while one supported by kinetic Alfvén waves is not.
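
    The criterion at stake can be written down in one line: the Taylor hypothesis is safe when the plasma-frame frequency ω is small compared with the Doppler shift kV produced by the flow. The sketch below uses illustrative numbers and a schematic dispersion relation (not the authors' calculation) to contrast a non-dispersive Alfvénic branch with a dispersive, whistler-like branch whose frequency outruns the Doppler shift at small scales.

```python
# Hedged sketch (illustrative numbers and a schematic dispersion relation): the
# Taylor hypothesis requires the plasma-frame frequency to be small compared
# with the Doppler shift k*V from the flow, i.e. omega / (k * v_flow) << 1.
import numpy as np

def taylor_ratio(omega, k, v_flow):
    """Ratio of rest-frame frequency to advection frequency; <<1 => Taylor holds."""
    return omega / (k * v_flow)

v_alfven = 50e3              # m/s, illustrative Alfven speed
v_flow = 200e3               # m/s, slow flow past a Solar Probe-like spacecraft
d_i = 1e5                    # m, illustrative ion inertial length
k = np.logspace(-6, -3, 4)   # wavenumbers in rad/m (illustrative)

# Non-dispersive Alfvenic fluctuations: omega = k * v_A -> scale-independent ratio.
ratio_alfven = taylor_ratio(k * v_alfven, k, v_flow)

# Schematic dispersive (whistler-like) branch: omega grows faster than k at
# small scales, so the ratio grows and the Taylor hypothesis eventually fails.
ratio_whistler = taylor_ratio(k * v_alfven * np.sqrt(1 + (k * d_i) ** 2), k, v_flow)

for kk, ra, rw in zip(k, ratio_alfven, ratio_whistler):
    print(f"k = {kk:.1e} rad/m: Alfvenic ratio = {ra:.2f}, whistler-like ratio = {rw:.2f}")
```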

  12. THE EFFECT OF BASIC MOTOR ABILITIES ON DRIBBLING SPEED AND PRECISION IN SOCCER GAME

    Ismail Selimović; Mehmeti Ejup

    2011-01-01

    The effects of basic motor skills on situational-motor abilities for speed dribbling and ball-control precision in soccer were analyzed with regression analysis in boys aged 12-14 years. For this purpose, 17 variables for basic motor parameters were selected, as well as three situational tests. In every regression analysis, the results obtained confirmed the hypothesis of significant effects of the morphological characteristics on the results in a...

  13. The Random-Walk Hypothesis on the Indian Stock Market

    Ankita Mishra; Vinod Mishra; Russell Smyth

    2014-01-01

    This study tests the random walk hypothesis for the Indian stock market. Using 19 years of monthly data on six indices from the National Stock Exchange (NSE) and the Bombay Stock Exchange (BSE), this study applies three different unit root tests with two structural breaks to analyse the random walk hypothesis. We find that unit root tests that allow for two structural breaks alone are not able to reject the unit root null; however, a recently developed unit root test that simultaneously accou...

  14. The Fractal Market Hypothesis: Applications to Financial Forecasting

    Blackledge, Jonathan

    2010-01-01

    Most financial modelling systems rely on an underlying hypothesis known as the Efficient Market Hypothesis (EMH) including the famous Black-Scholes formula for placing an option. However, the EMH has a fundamental flaw: it is based on the assumption that economic processes are normally distributed and it has long been known that this is not the case. This fundamental assumption leads to a number of shortcomings associated with using the EMH to analyse financial data which includes failure to ...

  15. Dopamine and Reward: The Anhedonia Hypothesis 30 years on

    Wise, Roy A.

    2008-01-01

    The anhedonia hypothesis – that brain dopamine plays a critical role in the subjective pleasure associated with positive rewards – was intended to draw the attention of psychiatrists to the growing evidence that dopamine plays a critical role in the objective reinforcement and incentive motivation associated with food and water, brain stimulation reward, and psychomotor stimulant and opiate reward. The hypothesis called to attention the apparent paradox that neuroleptics, drugs used to treat ...

  16. Conditional sampling technique to test the applicability of the Taylor hypothesis for the large-scale coherent structures

    Hussain, A. K. M. F.

    1980-01-01

    Comparisons of the distributions of large scale structures in turbulent flow with distributions based on time dependent signals from stationary probes and the Taylor hypothesis are presented. The study investigated an area in the near field of a 7.62 cm circular air jet at a Re of 32,000, in which coherent structures were induced through small-amplitude controlled excitation and stable vortex pairing in the jet column mode. Hot-wire and X-wire anemometry were employed to establish phase averaged spatial distributions of longitudinal and lateral velocities, coherent Reynolds stress and vorticity, background turbulent intensities, streamlines and pseudo-stream functions. The Taylor hypothesis was used to calculate spatial distributions of the phase-averaged properties, with results indicating that the usage of the local time-average velocity or streamwise velocity produces large distortions.

  17. rpe v5: an emulator for reduced floating-point precision in large numerical simulations

    Dawson, Andrew; Düben, Peter D.

    2017-06-01

    This paper describes the rpe (reduced-precision emulator) library which has the capability to emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the result of their simulations without having to make extensive code changes or port the model onto specialized hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes to the program to allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for an application under consideration while still achieving results of acceptable quality, computational cost can be reduced, since a reduction in numerical precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to a reduction in precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members to improve predictions. rpe was developed with a particular focus on the community of weather and climate modelling, but the software could be used with numerical simulations from other domains.
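
    rpe itself is a Fortran library, so the snippet below is only a conceptual Python analogue of what such an emulator does (an assumption-laden sketch, not the rpe API): every intermediate value is rounded to a chosen number of significand bits, which makes it easy to see how a result degrades as precision is reduced.

```python
# Conceptual Python analogue of reduced-precision emulation (a sketch of the
# idea only; this is not the rpe Fortran API): round every intermediate value
# to a chosen number of significand bits and watch how the result of a long
# summation drifts away from the double-precision answer.
import math

def reduce_precision(x, sbits):
    """Crudely round x to `sbits` explicit significand bits."""
    if x == 0.0 or not math.isfinite(x):
        return x
    exponent = math.floor(math.log2(abs(x)))
    scale = 2.0 ** (exponent - sbits)
    return round(x / scale) * scale

def naive_sum(values, sbits):
    """Accumulate a sum, rounding the running total after every addition."""
    total = 0.0
    for v in values:
        total = reduce_precision(total + v, sbits)
    return total

values = [1.0 / (i + 1) ** 2 for i in range(100_000)]
reference = naive_sum(values, sbits=52)          # effectively double precision
for sbits in (23, 16, 10):                       # single-like and two stronger reductions
    drift = abs(naive_sum(values, sbits) - reference)
    print(f"{sbits:2d} significand bits -> drift from double result {drift:.2e}")
```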

  18. Experience, but not distance, influences the recruitment precision in the stingless bee Scaptotrigona mexicana

    Sánchez, Daniel; Kraus, F. Bernhard; Hernández, Manuel De Jesús; Vandame, Rémy

    2007-07-01

    Recruitment precision, i.e. the proportion of recruits that reach an advertised food source, is a crucial adaptation of social bees to their environment. Studies with honeybees showed that recruitment precision is not a fixed feature, but it may be enhanced by factors like experience and distance. However, little is known regarding the recruitment precision of stingless bees. Hence, in this study, we examined the effects of experience and spatial distance on the precision of the food communication system of the stingless bee Scaptotrigona mexicana. We conducted the experiments by training bees to a three-dimensional artificial patch at several distances from the colony. We recorded the choices of individual recruited foragers, either being newcomers (foragers without experience with the advertised food source) or experienced (foragers that had previously visited the feeder). We found that the average precision of newcomers (95.6 ± 2.61%) was significantly higher than that of experienced bees (80.2 ± 1.12%). While this might seem counter-intuitive on first sight, this “loss” of precision can be explained by the tendency of experienced recruits to explore nearby areas to find new rewarding food sources after they had initially learned the exact location of the food source. Increasing the distance from the colony had no significant effect on the precision of the foraging bees. Thus, our data show that experience, but not the distance of the food source, affected the patch precision of S. mexicana foragers.

  19. A precision measurement of the mass of the top quark

    Abazov, V.M.

    2004-01-01

    The standard model of particle physics contains parameters -- such as particle masses -- whose origins are still unknown and which cannot be predicted, but whose values are constrained through their interactions. In particular, the masses of the top quark (M_t) and W boson (M_W) constrain the mass of the long-hypothesized, but thus far not observed, Higgs boson. A precise measurement of M_t can therefore indicate where to look for the Higgs, and indeed whether the hypothesis of a standard model Higgs is consistent with experimental data. As top quarks are produced in pairs and decay in only about 10⁻²⁴ s into various final states, reconstructing their masses from their decay products is very challenging. Here we report a technique that extracts more information from each top-quark event and yields a greatly improved precision (of ±5.3 GeV/c²) when compared to previous measurements. When our new result is combined with our published measurement in a complementary decay mode and with the only other measurements available, the new world average for M_t becomes 178.0 ± 4.3 GeV/c². As a result, the most likely Higgs mass increases from the experimentally excluded value of 96 to 117 GeV/c², which is beyond current experimental sensitivity. The upper limit on the Higgs mass at the 95% confidence level is raised from 219 to 251 GeV/c².

  20. A neural measure of precision in visual working memory.

    Ester, Edward F; Anderson, David E; Serences, John T; Awh, Edward

    2013-05-01

    Recent studies suggest that the temporary storage of visual detail in working memory is mediated by sensory recruitment or sustained patterns of stimulus-specific activation within feature-selective regions of visual cortex. According to a strong version of this hypothesis, the relative "quality" of these patterns should determine the clarity of an individual's memory. Here, we provide a direct test of this claim. We used fMRI and a forward encoding model to characterize population-level orientation-selective responses in visual cortex while human participants held an oriented grating in memory. This analysis, which enables a precise quantitative description of multivoxel, population-level activity measured during working memory storage, revealed graded response profiles whose amplitudes were greatest for the remembered orientation and fell monotonically as the angular distance from this orientation increased. Moreover, interparticipant differences in the dispersion-but not the amplitude-of these response profiles were strongly correlated with performance on a concurrent memory recall task. These findings provide important new evidence linking the precision of sustained population-level responses in visual cortex and memory acuity.
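
    For readers unfamiliar with forward (channel) encoding models, the hypothetical sketch below shows the two-step logic in miniature (simulated voxels and idealized tuning curves; none of it is the authors' pipeline): estimate channel-to-voxel weights from training data, then invert those weights on a held-out trial to recover a population orientation response profile whose peak should lie near the remembered orientation.

```python
# Hedged sketch of a channel-based forward encoding model (hypothetical data).
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_voxels, n_trials = 8, 50, 200
centers = np.arange(0, 180, 180 / n_channels)            # orientation channels (deg)

def channel_responses(orientations):
    """Idealized tuning: half-wave-rectified cosine (180 deg periodic), raised to a power."""
    d = np.deg2rad(orientations[:, None] - centers[None, :])
    return np.clip(np.cos(2 * d), 0, None) ** 5

# Simulate training data: voxel responses = channel responses x mixing weights + noise.
true_W = rng.normal(size=(n_channels, n_voxels))
train_ori = rng.uniform(0, 180, n_trials)
C_train = channel_responses(train_ori)
B_train = C_train @ true_W + 0.5 * rng.normal(size=(n_trials, n_voxels))

# Step 1: estimate the channel-to-voxel weights by least squares.
W_hat, *_ = np.linalg.lstsq(C_train, B_train, rcond=None)

# Step 2: invert the model on a held-out trial to recover the population
# orientation response profile.
test_ori = np.array([45.0])
B_test = channel_responses(test_ori) @ true_W + 0.5 * rng.normal(size=(1, n_voxels))
C_hat, *_ = np.linalg.lstsq(W_hat.T, B_test.T, rcond=None)
print("recovered profile peaks at channel centred on",
      centers[np.argmax(C_hat.ravel())], "deg (true orientation 45 deg)")
```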

  1. Precision of jaw-closing movements for different jaw gaps.

    Hellmann, Daniel; Becker, Georg; Giannakopoulos, Nikolaos N; Eberhard, Lydia; Fingerhut, Christopher; Rammelsberg, Peter; Schindler, Hans J

    2014-02-01

    Jaw-closing movements are basic components of physiological motor actions precisely achieving intercuspation without significant interference. The main purpose of this study was to test the hypothesis that, despite an imperfect intercuspal position, the precision of jaw-closing movements fluctuates within the range of physiological closing movements indispensable for meeting intercuspation without significant interference. For 35 healthy subjects, condylar and incisal point positions for fast and slow jaw-closing, interrupted at different jaw gaps by the use of frontal occlusal plateaus, were compared with uninterrupted physiological jaw closing, with identical jaw gaps, using a telemetric system for measuring jaw position. Examiner-guided centric relation served as a clinically relevant reference position. For jaw gaps ≤4 mm, no significant horizontal or vertical displacement differences were observed for the incisal or condylar points among physiological, fast, and slow jaw-closing. However, the jaw positions under these three closing conditions differed significantly from guided centric relation for nearly all experimental jaw gaps. The findings provide evidence of stringent neuromuscular control of jaw-closing movements in the vicinity of intercuspation. These results might be of clinical relevance to occlusal intervention with different objectives. © 2013 Eur J Oral Sci.

  2. Vermeer's The Little Street: a precise location

    Frans Grijzenhout

    2018-03-01

    In the autumn of 2015, Frans Grijzenhout published his sensational findings regarding the likely location of Johannes Vermeer’s ‘little street’ (The Little Street). After consulting a variety of sources, including ‘The Ledger of Dredging of the Canals in the Town of Delft’ from 1666–1667, he had reached the conclusion that the famous painting by Vermeer must have been based on the houses and two intervening passageways that in Vermeer’s day stood on Vlamingstraat, an unassuming canal in the eastern part of Delft, where numbers 40 and 42 stand today. He had also ascertained that one of Vermeer’s aunts, Ariaentgen Claes van der Minne, was the occupant of 42 Vlamingstraat at that time. Several authors have since produced material indicating that Vermeer painted the right-hand house in The Little Street ‘from life’: the house was, it now appears, observed and reproduced in meticulous detail. The same can now be confirmed for other aspects, such as the colour used for the painted shutters and the recesses for wind hooks in the sill of the window of the right-hand house. Philip Steadman has rightly pointed to an apparent discrepancy of four feet (c. 1.25 m) between the details in the aforementioned ‘Ledger of the Dredging of the Canals in the Town of Delft’ and the actual spatial situation at 40 Vlamingstraat. This difference can be traced back to the fact that the gateway provided access to both front and back houses. Accordingly, the owners of both front and back houses would have been taxed on the width (four feet) of the passageway. Given what we know about the meticulous precision with which the Ledger was compiled in the case of 42 Vlamingstraat, it is inconceivable that the authors of the register for number 40 should have made a mistake. A spatial rendering based on an earlier perspective study of The Little Street corresponds surprisingly well, and in some respects in detail, with the cadastral and other information we

  3. Testing the null hypothesis: the forgotten legacy of Karl Popper?

    Wilkinson, Mick

    2013-01-01

    Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate the new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.

  4. Spatial memory and integration processes in congenital blindness.

    Vecchi, Tomaso; Tinti, Carla; Cornoldi, Cesare

    2004-12-22

    The paper tests the hypothesis that difficulties met by the blind in spatial processing are due to the simultaneous treatment of independent spatial representations. Results showed that lack of vision does not impede the ability to process and transform mental images; however, blind people are significantly poorer in the recall of more than a single spatial pattern at a time than in the recall of the corresponding material integrated into a single pattern. It is concluded that the simultaneous maintenance of different spatial information is affected by congenital blindness, while cognitive processes that may involve sequential manipulation are not.

  5. Geo-registration of Unprofessional and Weakly-related Image and Precision Evaluation

    LIU Yingzhen

    2015-09-01

    The 3D geo-spatial model built from unprofessional and weakly-related images is a significant source of geo-spatial information. Such images cannot provide useful geo-spatial information until they are geo-registered with accurate geo-spatial orientation and location. In this paper, we present an automatic geo-registration method using coordinates acquired by a real-time GPS module. We calculate 2D and 3D spatial transformation parameters based on the spatial similarity between the image locations in the geo-spatial coordinate system and in the 3D reconstruction coordinate system. Because of the poor precision of GPS information, and especially the instability of elevation measurement, we use the RANSAC algorithm to reject outliers. In the experiment, we compare the geo-registered image positions to their differential GPS coordinates. The errors of translation, rotation and scaling are evaluated quantitatively and the causes of poor results are analyzed. The experiment demonstrates that this geo-registration method can get a precise result with enough images.
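
    The outlier-rejection step lends itself to a compact illustration. The sketch below uses hypothetical 2D data and a generic Umeyama-style similarity fit (not the authors' implementation) with RANSAC to estimate scale, rotation and translation between reconstruction coordinates and GPS coordinates while ignoring a few grossly wrong fixes.

```python
# Minimal 2D illustration (hypothetical data, not the authors' implementation):
# estimate a similarity transform between reconstruction and GPS coordinates
# with RANSAC so that a few bad GPS fixes do not bias the result.
import numpy as np

rng = np.random.default_rng(7)

def fit_similarity(src, dst):
    """Least-squares similarity transform dst ~ s*R@src + t (Umeyama-style)."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(dst_c.T @ src_c)
    R = U @ Vt
    if np.linalg.det(R) < 0:                     # keep a proper rotation
        U[:, -1] *= -1
        R = U @ Vt
    s = S.sum() / (src_c ** 2).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

def ransac_similarity(src, dst, iters=500, thresh=1.0):
    best_inliers = None
    for _ in range(iters):
        idx = rng.choice(len(src), 2, replace=False)   # 2 points fix a 2D similarity
        s, R, t = fit_similarity(src[idx], dst[idx])
        resid = np.linalg.norm((s * (R @ src.T).T + t) - dst, axis=1)
        inliers = resid < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_similarity(src[best_inliers], dst[best_inliers]), best_inliers

# Hypothetical camera positions from a reconstruction vs GPS fixes, with a
# handful of gross GPS outliers (e.g. bad elevation or multipath).
src = rng.uniform(0, 100, size=(40, 2))
angle, scale, shift = 0.3, 2.0, np.array([500.0, 300.0])
Rt = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])
dst = scale * (Rt @ src.T).T + shift + rng.normal(0, 0.3, size=src.shape)
dst[:5] += rng.uniform(30, 60, size=(5, 2))            # 5 bad GPS fixes

(s, R, t), inliers = ransac_similarity(src, dst)
print(f"estimated scale {s:.3f} (true 2.0), inliers kept: {inliers.sum()}/40")
```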

  6. From ear to body: the auditory-motor loop in spatial cognition

    Isabelle Viaud-Delmon

    2014-09-01

    Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space. A wireless tracking system (video infrared with passive markers) was used to send the coordinates of the subject’s head to the sound rendering system. The rendering system used advanced HRTF-based synthesis of directional cues and room acoustic simulation for the auralization of a realistic acoustic environment. Participants were guided blindfolded in an experimental room. Their task was to explore a delimitated area in order to find a hidden auditory target, i.e. a sound that was only triggered when walking on a precise location of the area. The position of this target could be coded in relationship to auditory landmarks constantly rendered during the exploration of the area. The task was composed of a practice trial, 6 acquisition trials during which they had to memorise the localisation of the target, and 4 test trials in which some aspects of the auditory scene were modified. The task ended with a probe trial in which the auditory target was removed. The configuration of searching paths allowed observing how auditory information was coded to memorise the position of the target, and suggested that space can be efficiently coded without visual information in normally sighted subjects. In conclusion, space representation can be based on sensorimotor and auditory cues only, providing another argument in favour of the hypothesis that the brain has access to a modality-invariant representation of external space.

  7. From ear to body: the auditory-motor loop in spatial cognition.

    Viaud-Delmon, Isabelle; Warusfel, Olivier

    2014-01-01

    Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space. A wireless tracking system (video infrared with passive markers) was used to send the coordinates of the subject's head to the sound rendering system. The rendering system used advanced HRTF-based synthesis of directional cues and room acoustic simulation for the auralization of a realistic acoustic environment. Participants were guided blindfolded in an experimental room. Their task was to explore a delimitated area in order to find a hidden auditory target, i.e., a sound that was only triggered when walking on a precise location of the area. The position of this target could be coded in relationship to auditory landmarks constantly rendered during the exploration of the area. The task was composed of a practice trial, 6 acquisition trials during which they had to memorize the localization of the target, and 4 test trials in which some aspects of the auditory scene were modified. The task ended with a probe trial in which the auditory target was removed. The configuration of searching paths allowed observing how auditory information was coded to memorize the position of the target, and suggested that space can be efficiently coded without visual information in normally sighted subjects. In conclusion, space representation can be based on sensorimotor and auditory cues only, providing another argument in favor of the hypothesis that the brain has access to a modality-invariant representation of external space.

  8. Toward precision medicine in Alzheimer's disease.

    Reitz, Christiane

    2016-03-01

    In Western societies, Alzheimer's disease (AD) is the most common form of dementia and the sixth leading cause of death. In recent years, the concept of precision medicine, an approach for disease prevention and treatment that is personalized to an individual's specific pattern of genetic variability, environment and lifestyle factors, has emerged. While for some diseases, in particular select cancers and a few monogenetic disorders such as cystic fibrosis, significant advances in precision medicine have been made over the past years, for most other diseases precision medicine is only in its beginning. To advance the application of precision medicine to a wider spectrum of disorders, governments around the world are starting to launch Precision Medicine Initiatives, major efforts to generate the extensive scientific knowledge needed to integrate the model of precision medicine into every day clinical practice. In this article we summarize the state of precision medicine in AD, review major obstacles in its development, and discuss its benefits in this highly prevalent, clinically and pathologically complex disease.

  9. [Precision Nursing: Individual-Based Knowledge Translation].

    Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung

    2016-12-01

    U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.

  10. Comparison of the precision of three commonly used GPS models

    E Chavoshi

    2016-04-01

    Introduction: The development of science in various fields has changed the methods used to determine geographical location. Precision farming involves new technology that gives farmers the opportunity to monitor and evaluate factors such as nutrients, soil moisture available to plants, and soil physical and chemical characteristics at spatial resolutions ranging from less than a centimeter to several meters. GPS receivers are used in precision farming operations with the following specified accuracies: (1) crop monitoring and soil sampling (accuracy of less than one meter); (2) application of fertilizer, pesticide and seed (accuracy of less than half a meter); (3) transplantation and row cultivation (precision of less than 4 cm) (Perez et al., 2011). In one application of GPS in agriculture, a route-guidance system for precision-farming tractors was designed to inform the driver of deviations of 50 to 300 mm from the specified path and to improve the way the route is displayed (Perez et al., 2011). In another study, an automatic guidance system based on RTK-GPS technology was used for precision tillage operations between and within rows, working as close as 50 mm to the drip irrigation pipe without damaging the crops (Abidine et al., 2004). In another study comparing the accuracy and precision of receivers, five different models of Trimble Mark GPS devices were used to map 15 stations; the results indicated that the minimum error was obtained with the Geo XT model, with an accuracy of 91 cm, and the maximum error with the Pharos model, with an accuracy of 5.62 m (Kindra et al., 2006). Due to the increasing use of GPS receivers in agriculture, as well as the lack of trust in the real accuracy and precision of receivers, this study aimed to compare the positioning accuracy and precision of three commonly used GPS receiver models in order to specify the receiver with the lowest error for precision

  11. TLD array for precise dose measurements in stereotactic radiation techniques

    Ertl, A.; Kitz, K.; Griffitt, W.; Hartl, R.F.E.; Zehetmayer, M.

    1996-01-01

    We developed a new TLD array for precise dose measurement and verification of the spatial dose distribution in small radiation targets. It consists of a hemicylindrical, tissue-equivalent rod made of polystyrene with 17 parallel moulds for an exact positioning of each TLD. The spatial resolution of the TLD array was evaluated using the Leksell spherical phantom. Dose planning was performed with KULA 4.4 under stereotactic conditions on axial CT images. In the Leksell gamma unit the TLD array was irradiated with a maximal dose of 10 Gy with an unplugged 14 mm collimator. The doses delivered to the TLDs were rechecked by diode detector and film dosimetry and compared to the computer-generated dose profile. We found excellent agreement of our measured values, even at the critical penumbra decline. For the 14 mm and 18 mm collimator and for the 11 mm collimator combination we compared the measured and calculated data at full width at half maximum. This TLD array may be useful for phantom or tissue model studies on the spatial dose distribution in confined radiation targets as used in stereotactic radiotherapy. (author)

  12. Effective Connectivity Reveals Right-Hemisphere Dominance in Audiospatial Perception: Implications for Models of Spatial Neglect

    Friston, Karl J.; Mattingley, Jason B.; Roepstorff, Andreas; Garrido, Marta I.

    2014-01-01

    Detecting the location of salient sounds in the environment rests on the brain's ability to use differences in sounds arriving at both ears. Functional neuroimaging studies in humans indicate that the left and right auditory hemispaces are coded asymmetrically, with a rightward attentional bias that reflects spatial attention in vision. Neuropsychological observations in patients with spatial neglect have led to the formulation of two competing models: the orientation bias and right-hemisphere dominance models. The orientation bias model posits a symmetrical mapping between one side of the sensorium and the contralateral hemisphere, with mutual inhibition of the ipsilateral hemisphere. The right-hemisphere dominance model introduces a functional asymmetry in the brain's coding of space: the left hemisphere represents the right side, whereas the right hemisphere represents both sides of the sensorium. We used Dynamic Causal Modeling of effective connectivity and Bayesian model comparison to adjudicate between these alternative network architectures, based on human electroencephalographic data acquired during an auditory location oddball paradigm. Our results support a hemispheric asymmetry in a frontoparietal network that conforms to the right-hemisphere dominance model. We show that, within this frontoparietal network, forward connectivity increases selectively in the hemisphere contralateral to the side of sensory stimulation. We interpret this finding in light of hierarchical predictive coding as a selective increase in attentional gain, which is mediated by feedforward connections that carry precision-weighted prediction errors during perceptual inference. This finding supports the disconnection hypothesis of unilateral neglect and has implications for theories of its etiology. PMID:24695717

  13. Advances in Precision Medicine: Tailoring Individualized Therapies.

    Matchett, Kyle B; Lynam-Lennon, Niamh; Watson, R William; Brown, James A L

    2017-10-25

    The traditional bench-to-bedside pipeline involves using model systems and patient samples to provide insights into pathways deregulated in cancer. This discovery reveals new biomarkers and therapeutic targets, ultimately stratifying patients and informing cohort-based treatment options. Precision medicine (molecular profiling of individual tumors combined with established clinical-pathological parameters) reveals, in real-time, individual patient's diagnostic and prognostic risk profile, informing tailored and tumor-specific treatment plans. Here we discuss advances in precision medicine presented at the Irish Association for Cancer Research Annual Meeting, highlighting examples where personalized medicine approaches have led to precision discovery in individual tumors, informing customized treatment programs.

  14. Mixed-Precision Spectral Deferred Correction: Preprint

    Grout, Ray W. S.

    2015-09-02

    Convergence of spectral deferred correction (SDC), where low-order time integration methods are used to construct higher-order methods through iterative refinement, can be accelerated in terms of computational effort by using mixed-precision methods. Using ideas from multi-level SDC (in turn based on FAS multigrid ideas), some of the SDC correction sweeps can use function values computed in reduced precision without adversely impacting the accuracy of the final solution. This is particularly beneficial for the performance of combustion solvers such as S3D [6] which require double precision accuracy but are performance limited by the cost of data motion.
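
    The idea can be sketched without a full SDC implementation. The toy example below is a Picard-style iterative refinement of one time step of y' = -y, standing in for SDC correction sweeps (none of this is the S3D or production SDC code): the right-hand side is evaluated in single precision for the early sweeps and in double precision only for the final ones, and the result is compared with an all-double-precision run.

```python
# Hedged sketch of the mixed-precision idea: early refinement sweeps evaluate
# the ODE right-hand side in reduced (single) precision, later sweeps in double.
import numpy as np

def f(y, dtype=np.float64):
    """Right-hand side of y' = -y, evaluated in the requested precision."""
    return -(y.astype(dtype))

def picard_sweeps(y0, dt, n_nodes=8, sweeps=(np.float32,) * 4 + (np.float64,) * 2):
    """Refine y(t) on equispaced nodes of [0, dt] by repeated quadrature."""
    t = np.linspace(0.0, dt, n_nodes)
    y = np.full(n_nodes, y0, dtype=np.float64)        # initial guess: constant
    for dtype in sweeps:
        rhs = f(y, dtype).astype(np.float64)
        # cumulative trapezoidal integral of f(y) from 0 to each node
        integral = np.concatenate(
            ([0.0], np.cumsum(0.5 * (rhs[1:] + rhs[:-1]) * np.diff(t))))
        y = y0 + integral
    return y[-1]

dt, y0 = 0.5, 1.0
exact = y0 * np.exp(-dt)
mixed = picard_sweeps(y0, dt)
double_only = picard_sweeps(y0, dt, sweeps=(np.float64,) * 6)
print(f"exact {exact:.10f}  mixed {mixed:.10f}  double-only {double_only:.10f}")
```

    The point of the printout is that the mixed-precision and all-double runs agree to well within the quadrature error of the sweep itself, which is the sense in which reduced-precision early sweeps need not degrade the final answer.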

  15. Accuracy and precision in thermoluminescence dosimetry

    Marshall, T.O.

    1984-01-01

    The question of accuracy and precision in thermoluminescent dosimetry, particularly in relation to lithium fluoride phosphor, is discussed. The more important sources of error, including those due to the detectors, the reader, annealing and dosemeter design, are identified and methods of reducing their effects on accuracy and precision to a minimum are given. Finally, the accuracy and precision achievable for three quite different applications are discussed, namely, for personal dosimetry, environmental monitoring and for the measurement of photon dose distributions in phantoms. (U.K.)

  16. High-speed steel for precision cast tools

    Karwiarz, J.; Mazur, A.

    2001-01-01

    The test results of high-vanadium high-speed steel (SWV9) for precision cast tools are presented. Face-milling cutters of the NFCa80A type have been tested in industrial operating conditions. The average lifetime of SWV9 steel tools was 3-10 times longer compared to conventional high-speed milling cutters. Metallography of the SWV9 precision cast steel revealed a distribution of primary vanadium carbides in the steel matrix that is beneficial for tool properties. The presented results should be a good argument for wide application of high-vanadium high-speed steel for precision cast tools. (author)

  17. Spatial econometrics using microdata

    Dubé, Jean

    2014-01-01

    This book provides an introduction to spatial analyses concerning disaggregated (or micro) spatial data. Particular emphasis is put on spatial data compilation and the structuring of the connections between the observations. Descriptive analysis methods of spatial data are presented in order to identify and measure spatial dependency, both global and local. The authors then focus on autoregressive spatial models to control for spatial dependency between the residuals of a basic linear statistical model, a dependency which contravenes one of the basic hypotheses of the ordinary least squares approach.
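
    As a small illustration of the kind of spatial-dependence diagnostic such models respond to, the sketch below uses hypothetical micro-data, a k-nearest-neighbour weights matrix, and Moran's I of OLS residuals (an assumed workflow rather than anything from the book) to show spatial autocorrelation surviving in the residuals of an ordinary regression.

```python
# Illustrative sketch (hypothetical data): detect spatial dependence in the
# residuals of an ordinary least-squares model with Moran's I, the kind of
# diagnostic that motivates autoregressive spatial models.
import numpy as np

rng = np.random.default_rng(3)
n, k = 200, 5
coords = rng.uniform(0, 10, size=(n, 2))                 # micro-data point locations

# Row-standardized k-nearest-neighbour spatial weights matrix W.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
np.fill_diagonal(d, np.inf)
W = np.zeros((n, n))
rows = np.repeat(np.arange(n), k)
W[rows, np.argsort(d, axis=1)[:, :k].ravel()] = 1.0
W /= W.sum(axis=1, keepdims=True)

# Hypothetical outcome with a spatially autocorrelated error term.
x = rng.normal(size=n)
spatial_error = np.linalg.solve(np.eye(n) - 0.6 * W, rng.normal(size=n))
y = 1.0 + 2.0 * x + spatial_error

# OLS fit and Moran's I of the residuals.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta
moran_I = (n / W.sum()) * (e @ W @ e) / (e @ e)
print(f"OLS beta = {beta.round(2)}, Moran's I of residuals = {moran_I:.3f}")
# A Moran's I well above its expectation of -1/(n-1) signals spatial dependence,
# which an autoregressive spatial specification would then model explicitly.
```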

  18. Precision mechatronics based on high-precision measuring and positioning systems and machines

    Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert

    2007-06-01

    Precision mechatronics is defined in the paper as the science and engineering of a new generation of high precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics. Nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with uncertainties as small as possible is discussed. The integration of several optical and tactile nanoprobes makes the 3D-nanopositioning machine suitable for various tasks, such as long range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free form surface measurement as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.

  19. High precision spectrophotometric analysis of thorium

    Palmieri, H.E.L.

    1984-01-01

    An accurate and precise determination of thorium is proposed. Precision of about 0.1% is required for the determination of macroquantities of thorium when processed. After an extensive literature search concerning this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin-S as indicator. In order to obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically, by plotting added titrant versus absorbance. Here, a non-linear least-squares fit was used instead, based on the Fletcher and Powell minimization process and a computer programme. Besides the equivalence point, other parameters of the titration were determined: the indicator concentration, the absorbance of the metal-indicator complex, and the stability constants of the metal-indicator and metal-EDTA complexes. (Author)

  20. Thorium spectrophotometric analysis with high precision

    Palmieri, H.E.L.

    1983-06-01

    An accurate and precise determination of thorium is proposed. Precision of about 0.1% is required for the determination of macroquantities of thorium processed. After an extensive literature search concerning this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin S as indicator. In order to obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically, by plotting added titrant versus absorbance. Here, a non-linear least-squares fit was used instead, based on the Fletcher and Powell minimization process and a computer program. (author)

  1. Cardiovascular Precision Medicine in the Genomics Era

    Alexandra M. Dainis, BS

    2018-04-01

    Summary: Precision medicine strives to delineate disease using multiple data sources—from genomics to digital health metrics—in order to be more precise and accurate in our diagnoses, definitions, and treatments of disease subtypes. By defining disease at a deeper level, we can treat patients based on an understanding of the molecular underpinnings of their presentations, rather than grouping patients into broad categories with one-size-fits-all treatments. In this review, the authors examine how precision medicine, specifically that surrounding genetic testing and genetic therapeutics, has begun to make strides in both common and rare cardiovascular diseases in the clinic and the laboratory, and how these advances are beginning to enable us to more effectively define risk, diagnose disease, and deliver therapeutics for each individual patient. Key Words: genome sequencing, genomics, precision medicine, targeted therapeutics

  2. Equity and Value in 'Precision Medicine'.

    Gray, Muir; Lagerberg, Tyra; Dombrádi, Viktor

    2017-04-01

    Precision medicine carries huge potential in the treatment of many diseases, particularly those with high-penetrance monogenic underpinnings. However, precision medicine through genomic technologies also has ethical implications. We will define allocative, personal, and technical value ('triple value') in healthcare and how this relates to equity. Equity is here taken to be implicit in the concept of triple value in countries that have publicly funded healthcare systems. It will be argued that precision medicine risks concentrating resources to those that already experience greater access to healthcare and power in society, nationally as well as globally. Healthcare payers, clinicians, and patients must all be involved in optimising the potential of precision medicine, without reducing equity. Throughout, the discussion will refer to the NHS RightCare Programme, which is a national initiative aiming to improve value and equity in the context of NHS England.

  3. The forthcoming era of precision medicine.

    Gamulin, Stjepan

    2016-11-01

    The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients' groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism ("big data"), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  4. Epistemology, Ethics, and Progress in Precision Medicine.

    Hey, Spencer Phillips; Barsanti-Innes, Brianna

    2016-01-01

    The emerging paradigm of precision medicine strives to leverage the tools of molecular biology to prospectively tailor treatments to the individual patient. Fundamental to the success of this movement is the discovery and validation of "predictive biomarkers," which are properties of a patient's biological specimens that can be assayed in advance of therapy to inform the treatment decision. Unfortunately, research into biomarkers and diagnostics for precision medicine has fallen well short of expectations. In this essay, we examine the portfolio of research activities into the excision repair cross complement group 1 (ERCC1) gene as a predictive biomarker for precision lung cancer therapy as a case study in elucidating the epistemological and ethical obstacles to developing new precision medicines.

  5. A Note on "Accuracy" and "Precision"

    Stallings, William M.; Gillmore, Gerald M.

    1971-01-01

    Advocates the use of "precision" rather than "accuracy" in defining reliability. These terms are consistently differentiated in certain sciences. Review of psychological and measurement literature reveals, however, interchangeable usage of the terms in defining reliability. (Author/GS)

  6. Precision axial translator with high stability.

    Bösch, M A

    1979-08-01

    We describe a new type of translator which is inherently stable against torsion and twisting. This concentric translator is also ideally suited for precise axial motion with clearance of the center line.

  7. Mechanics and Physics of Precise Vacuum Mechanisms

    Deulin, E. A; Panfilov, Yu V; Nevshupa, R. A

    2010-01-01

    In this book, Russian expertise in the field of precise vacuum mechanics design is summarized. A wide range of physical applications of mechanism design in the electronic, optical-electronic, chemical, and aerospace industries is presented in a comprehensible way. Topics treated include methods for regulating microparticle flow and its determination in vacuum equipment and mechanisms of electronics; precise mechanisms of nanoscale precision based on magnetic and electric rheology; precise harmonic rotary and non-coaxial nut-screw linear motion vacuum feedthroughs with technical parameters considered the best in the world; elastically deformed vacuum motion feedthroughs that avoid the use of friction couples; and a computer system for predicting failures of vacuum mechanisms. This English edition incorporates a number of features which should improve its usefulness as a textbook without changing the basic organization or the general philosophy of presentation of the subject matter of the original Russian work. Exper...

  8. Precision Munition Electro-Sciences Facility

    Federal Laboratory Consortium — This facility allows the characterization of the electro-magnetic environment produced by a precision weapon in free flight. It can measure the radiofrequency (RF)...

  9. Precision electroweak physics at the Tevatron

    James, Eric B.

    2006-01-01

    An overview of Tevatron electroweak measurements performed by the CDF and DØ experiments is presented. The current status and future prospects for high precision measurements of electroweak parameters and detailed studies of boson production are highlighted. (author)

  10. Precision Guidance with Impact Angle Requirements

    Ford, Jason

    2001-01-01

    This paper examines a weapon system precision guidance problem in which the objective is to guide a weapon onto a non-manoeuvring target so that a particular desired angle of impact is achieved using...

  11. Precise subtyping for synchronous multiparty sessions

    Mariangiola Dezani-Ciancaglini

    2016-02-01

    Full Text Available The notion of subtyping has gained an important role both in theoretical and applicative domains: in lambda and concurrent calculi as well as in programming languages. The soundness and the completeness, together referred to as the preciseness of subtyping, can be considered from two different points of view: operational and denotational. The former preciseness has been recently developed with respect to type safety, i.e. the safe replacement of a term of a smaller type when a term of a bigger type is expected. The latter preciseness is based on the denotation of a type which is a mathematical object that describes the meaning of the type in accordance with the denotations of other expressions from the language. The result of this paper is the operational and denotational preciseness of the subtyping for a synchronous multiparty session calculus. The novelty of this paper is the introduction of characteristic global types to prove the operational completeness.

  12. Prospects for Precision Neutrino Cross Section Measurements

    Harris, Deborah A. [Fermilab

    2016-01-28

    The need for precision cross section measurements is more urgent now than ever before, given the central role neutrino oscillation measurements play in the field of particle physics. The definition of precision is something worth considering, however. In order to build the best model for an oscillation experiment, cross section measurements should span a broad range of energies, neutrino interaction channels, and target nuclei. Precision might better be defined not in the final uncertainty associated with any one measurement but rather with the breadth of measurements that are available to constrain models. Current experience shows that models are better constrained by 10 measurements across different processes and energies with 10% uncertainties than by one measurement of one process on one nucleus with a 1% uncertainty. This article describes the current status of and future prospects for the field of precision cross section measurements considering the metric of how many processes, energies, and nuclei have been studied.

  13. Precise Calculation of Complex Radioactive Decay Chains

    Harr, Logan J

    2007-01-01

    ...). An application of the exponential moments function is used with a transmutation matrix in the calculation of complex radioactive decay chains to achieve greater precision than can be attained through current methods...

  14. Collaborative Genomics Study Advances Precision Oncology

    A collaborative study conducted by two Office of Cancer Genomics (OCG) initiatives highlights the importance of integrating structural and functional genomics programs to improve cancer therapies, and more specifically, contribute to precision oncology treatments for children.

  15. Nucleon measurements at the precision frontier

    Carlson, Carl E. [Physics Department, College of William and Mary, Williamsburg, VA 23187 (United States)

    2013-11-07

    We comment on nucleon measurements at the precision frontier. As examples of what can be learned, we concentrate on three topics, which are parity violating scattering experiments, the proton radius puzzle, and the symbiosis between nuclear and atomic physics.

  16. The forthcoming era of precision medicine

    Stjepan Gamulin

    2016-11-01

    Full Text Available Abstract. The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients’ groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism (“big data”), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. Conclusion. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach.

  17. A large scale test of the gaming-enhancement hypothesis

    Andrew K. Przybylski

    2016-11-01

    Full Text Available A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.

  18. A large scale test of the gaming-enhancement hypothesis.

    Przybylski, Andrew K; Wang, John C

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis , has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
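
A minimal sketch of the kind of Bayesian model comparison described above, using a BIC approximation to the Bayes factor on simulated data; it is not the authors' analysis pipeline, and all variable names and distributions are illustrative.

```python
# Comparing a "gaming-enhancement" model against a null model with a
# BIC-approximated Bayes factor (Wagenmakers, 2007). Simulated data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1847                                # sample size quoted in the abstract
hours_gaming = rng.gamma(2.0, 1.5, n)   # hypothetical weekly play time
reasoning = rng.normal(100, 15, n)      # reasoning score, unrelated by construction

null_model = sm.OLS(reasoning, np.ones(n)).fit()                    # intercept only
alt_model = sm.OLS(reasoning, sm.add_constant(hours_gaming)).fit()  # adds a gaming effect

# BF01 > 1 favours the null hypothesis of no gaming-enhancement effect.
bf01 = np.exp((alt_model.bic - null_model.bic) / 2)
print(f"BF01 (null over enhancement) ~ {bf01:.1f}")
```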

  19. Investigating the environmental Kuznets curve hypothesis in Vietnam

    Al-Mulali, Usama; Saboori, Behnaz; Ozturk, Ilhan

    2015-01-01

    This study investigates the existence of the environmental Kuznets curve (EKC) hypothesis in Vietnam during the period 1981–2011. To realize the goals of this study, a pollution model was established applying the Autoregressive Distributed Lag (ARDL) methodology. The results revealed that the pollution haven hypothesis does exist in Vietnam because capital increases pollution. In addition, imports also increase pollution which indicates that most of Vietnam's imported products are energy intensive and highly polluted. However, exports have no effect on pollution which indicates that the level of exports is not significant enough to affect pollution. Moreover, fossil fuel energy consumption increases pollution while renewable energy consumption has no significant effect in reducing pollution. Furthermore, labor force reduces pollution since most of Vietnam's labor force is in the agricultural and services sectors which are less energy intensive than the industrial sector. Based on the obtained results, the EKC hypothesis does not exist because the relationship between GDP and pollution is positive in both the short and long run. - Highlights: • The environmental Kuznets curve (EKC) hypothesis in Vietnam is investigated. • The Autoregressive Distributed Lag (ARDL) methodology was utilized. • The EKC hypothesis does not exist
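
A minimal sketch of the basic inverted-U check behind the EKC hypothesis (log emissions regressed on log income and its square), not the ARDL bounds-testing procedure used in the study; the series below are placeholders.

```python
# EKC sanity check: an inverted U requires beta1 > 0 and beta2 < 0 in
# ln(CO2) = a + b1*ln(GDP) + b2*ln(GDP)^2 + e. Placeholder data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
ln_gdp = np.linspace(6.0, 8.5, 31)   # e.g. 31 annual observations, 1981-2011
ln_co2 = -5.0 + 1.2 * ln_gdp + 0.03 * ln_gdp**2 + rng.normal(0, 0.05, 31)

X = sm.add_constant(np.column_stack([ln_gdp, ln_gdp**2]))
fit = sm.OLS(ln_co2, X).fit()
b1, b2 = fit.params[1], fit.params[2]
print("income coefficients:", b1, b2)
print("EKC-consistent (inverted U):", b1 > 0 and b2 < 0)
```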

  20. Social learning and evolution: the cultural intelligence hypothesis

    van Schaik, Carel P.; Burkart, Judith M.

    2011-01-01

    If social learning is more efficient than independent individual exploration, animals should learn vital cultural skills exclusively, and routine skills faster, through social learning, provided they actually use social learning preferentially. Animals with opportunities for social learning indeed do so. Moreover, more frequent opportunities for social learning should boost an individual's repertoire of learned skills. This prediction is confirmed by comparisons among wild great ape populations and by social deprivation and enculturation experiments. These findings shaped the cultural intelligence hypothesis, which complements the traditional benefit hypotheses for the evolution of intelligence by specifying the conditions in which these benefits can be reaped. The evolutionary version of the hypothesis argues that species with frequent opportunities for social learning should more readily respond to selection for a greater number of learned skills. Because improved social learning also improves asocial learning, the hypothesis predicts a positive interspecific correlation between social-learning performance and individual learning ability. Variation among primates supports this prediction. The hypothesis also predicts that more heavily cultural species should be more intelligent. Preliminary tests involving birds and mammals support this prediction too. The cultural intelligence hypothesis can also account for the unusual cognitive abilities of humans, as well as our unique mechanisms of skill transfer. PMID:21357223

  1. Dopamine and reward: the anhedonia hypothesis 30 years on.

    Wise, Roy A

    2008-10-01

    The anhedonia hypothesis--that brain dopamine plays a critical role in the subjective pleasure associated with positive rewards--was intended to draw the attention of psychiatrists to the growing evidence that dopamine plays a critical role in the objective reinforcement and incentive motivation associated with food and water, brain stimulation reward, and psychomotor stimulant and opiate reward. The hypothesis called to attention the apparent paradox that neuroleptics, drugs used to treat a condition involving anhedonia (schizophrenia), attenuated in laboratory animals the positive reinforcement that we normally associate with pleasure. The hypothesis held only brief interest for psychiatrists, who pointed out that the animal studies reflected acute actions of neuroleptics whereas the treatment of schizophrenia appears to result from neuroadaptations to chronic neuroleptic administration, and that it is the positive symptoms of schizophrenia that neuroleptics alleviate, rather than the negative symptoms that include anhedonia. Perhaps for these reasons, the hypothesis has had minimal impact in the psychiatric literature. Despite its limited heuristic value for the understanding of schizophrenia, however, the anhedonia hypothesis has had major impact on biological theories of reinforcement, motivation, and addiction. Brain dopamine plays a very important role in reinforcement of response habits, conditioned preferences, and synaptic plasticity in cellular models of learning and memory. The notion that dopamine plays a dominant role in reinforcement is fundamental to the psychomotor stimulant theory of addiction, to most neuroadaptation theories of addiction, and to current theories of conditioned reinforcement and reward prediction. Properly understood, it is also fundamental to recent theories of incentive motivation.

  2. Social learning and evolution: the cultural intelligence hypothesis.

    van Schaik, Carel P; Burkart, Judith M

    2011-04-12

    If social learning is more efficient than independent individual exploration, animals should learn vital cultural skills exclusively, and routine skills faster, through social learning, provided they actually use social learning preferentially. Animals with opportunities for social learning indeed do so. Moreover, more frequent opportunities for social learning should boost an individual's repertoire of learned skills. This prediction is confirmed by comparisons among wild great ape populations and by social deprivation and enculturation experiments. These findings shaped the cultural intelligence hypothesis, which complements the traditional benefit hypotheses for the evolution of intelligence by specifying the conditions in which these benefits can be reaped. The evolutionary version of the hypothesis argues that species with frequent opportunities for social learning should more readily respond to selection for a greater number of learned skills. Because improved social learning also improves asocial learning, the hypothesis predicts a positive interspecific correlation between social-learning performance and individual learning ability. Variation among primates supports this prediction. The hypothesis also predicts that more heavily cultural species should be more intelligent. Preliminary tests involving birds and mammals support this prediction too. The cultural intelligence hypothesis can also account for the unusual cognitive abilities of humans, as well as our unique mechanisms of skill transfer.

  3. Defending the Decimals: Why Foolishly False Precision Might Strengthen Social Science

    Jeremy Freese

    2014-12-01

    Full Text Available Social scientists often report regression coefficients using more significant figures than are meaningful given measurement precision and sample size. Common sense says we should not do this. Yet, as normative practice, eliminating these extra digits introduces a more serious scientific problem when accompanied by other ascendant reporting practices intended to reduce social science’s long-standing emphasis on null hypothesis significance testing. Coefficient p-values can no longer be recovered to the degree of precision that p-values have been abundantly demonstrated to influence actual research practice. Developing methods for detecting and addressing systematically exaggerated effect sizes across collections of studies cannot be done effectively if p-values are hidden. Regarding what is preferable for scientific literature versus an individual study, the costs of false precision are therefore innocuous compared to alternatives that either encourage the continuation of practices known to exaggerate causal effects or thwart assessment of how much such exaggeration occurs.
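
The point about unrecoverable p-values can be illustrated with a toy calculation (numbers invented): rounding a coefficient and its standard error to two decimals changes the p-value a reader can reconstruct from the report.

```python
# Recovering a p-value from reported digits: full precision vs. two decimals.
from scipy import stats

beta, se, df = 0.1449, 0.0731, 850           # hypothetical estimate and residual df
beta_r, se_r = round(beta, 2), round(se, 2)  # as they would appear in a rounded table

p_full = 2 * stats.t.sf(abs(beta / se), df)           # ~0.048
p_recovered = 2 * stats.t.sf(abs(beta_r / se_r), df)  # ~0.046: the reader gets a different answer
print(f"actual p = {p_full:.4f}, p implied by the rounded report = {p_recovered:.4f}")
```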

  4. PRECISION ELECTROWEAK MEASUREMENTS AND THE HIGGS MASS

    MARCIANO, W.J.

    2004-01-01

    The utility of precision electroweak measurements for predicting the Standard Model Higgs mass via quantum loop effects is discussed. Current constraints from m_W and sin²θ_W(m_Z) in the MS-bar scheme imply a relatively light Higgs, ≲ 154 GeV, which is consistent with Supersymmetry expectations. The existence of Supersymmetry is further suggested by a discrepancy between experiment and theory for the muon anomalous magnetic moment. Constraints from precision studies on other types of "New Physics" are also briefly described.

  5. Precision Medicine-Nobody Is Average.

    Vinks, A A

    2017-03-01

    Medicine gets personal and tailor-made treatments are underway. Hospitals have started to advertise their advanced genomic testing capabilities and even their disruptive technologies to help foster a culture of innovation. The prediction in the lay press is that in decades from now we may look back and see 2017 as the year precision medicine blossomed. It is all part of the Precision Medicine Initiative that takes into account individual differences in people's genes, environments, and lifestyles. © 2017 ASCPT.

  6. The role of precise time in IFF

    Bridge, W. M.

    1982-01-01

    The application of precise time to the identification of friend or foe (IFF) problem is discussed. The simple concept of knowing when to expect each signal is exploited in a variety of ways to achieve an IFF system which is hard to detect, minimally exploitable and difficult to jam. Precise clocks are the backbone of the concept and the various candidates for this role are discussed. The compact rubidium-controlled oscillator is the only practical candidate.

  7. Precision siting of a particle accelerator

    Cintra, Jorge Pimentel

    1996-01-01

    Precise location is a specialized survey task that requires highly skilled work in order to avoid unrecoverable errors at the project installation stage. As a function of the different process stages, different specifications can be applied, calling for different instruments: theodolite, measurement tape, distance meter, invar wire. This paper, based on experience obtained during the installation of particle accelerator equipment, deals with the general principles of precise location: tolerance definitions, techniques for increasing accuracy, scheduling of locations, sensitivity analysis, and quality control methods. (author)

  8. Principles of precision medicine in stroke.

    Hinman, Jason D; Rost, Natalia S; Leung, Thomas W; Montaner, Joan; Muir, Keith W; Brown, Scott; Arenillas, Juan F; Feldmann, Edward; Liebeskind, David S

    2017-01-01

    The era of precision medicine has arrived and conveys tremendous potential, particularly for stroke neurology. The diagnosis of stroke, its underlying aetiology, theranostic strategies, recurrence risk and path to recovery are populated by a series of highly individualised questions. Moreover, the phenotypic complexity of a clinical diagnosis of stroke makes a simple genetic risk assessment only partially informative on an individual basis. The guiding principles of precision medicine in stroke underscore the need to identify, value, organise and analyse the multitude of variables obtained from each individual to generate a precise approach to optimise cerebrovascular health. Existing data may be leveraged with novel technologies, informatics and practical clinical paradigms to apply these principles in stroke and realise the promise of precision medicine. Importantly, precision medicine in stroke will only be realised once efforts to collect, value and synthesise the wealth of data collected in clinical trials and routine care starts. Stroke theranostics, the ultimate vision of synchronising tailored therapeutic strategies based on specific diagnostic data, demand cerebrovascular expertise on big data approaches to clinically relevant paradigms. This review considers such challenges and delineates the principles on a roadmap for rational application of precision medicine to stroke and cerebrovascular health. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  9. Precision medicine needs pioneering clinical bioinformaticians.

    Gómez-López, Gonzalo; Dopazo, Joaquín; Cigudosa, Juan C; Valencia, Alfonso; Al-Shahrour, Fátima

    2017-10-25

    Success in precision medicine depends on accessing high-quality genetic and molecular data from large, well-annotated patient cohorts that couple biological samples to comprehensive clinical data, which in conjunction can lead to effective therapies. From such a scenario emerges the need for a new professional profile, an expert bioinformatician with training in clinical areas who can make sense of multi-omics data to improve therapeutic interventions in patients, and the design of optimized basket trials. In this review, we first describe the main policies and international initiatives that focus on precision medicine. Secondly, we review the currently ongoing clinical trials in precision medicine, introducing the concept of 'precision bioinformatics', and we describe current pioneering bioinformatics efforts aimed at implementing tools and computational infrastructures for precision medicine in health institutions around the world. Thirdly, we discuss the challenges related to the clinical training of bioinformaticians, and the urgent need for computational specialists capable of assimilating medical terminologies and protocols to address real clinical questions. We also propose some skills required to carry out common tasks in clinical bioinformatics and some tips for emergent groups. Finally, we explore the future perspectives and the challenges faced by precision medicine bioinformatics. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Towards precision medicine; a new biomedical cosmology.

    Vegter, M W

    2018-02-10

    Precision Medicine has become a common label for data-intensive and patient-driven biomedical research. Its intended future is reflected in endeavours such as the Precision Medicine Initiative in the USA. This article addresses the question whether it is possible to discern a new 'medical cosmology' in Precision Medicine, a concept that was developed by Nicholas Jewson to describe comprehensive transformations involving various dimensions of biomedical knowledge and practice, such as vocabularies, the roles of patients and physicians and the conceptualisation of disease. Subsequently, I will elaborate my assessment of the features of Precision Medicine with the help of Michel Foucault, by exploring how precision medicine involves a transformation along three axes: the axis of biomedical knowledge, of biomedical power and of the patient as a self. Patients are encouraged to become the managers of their own health status, while the medical domain is reframed as a data-sharing community, characterised by changing power relationships between providers and patients, producers and consumers. While the emerging Precision Medicine cosmology may surpass existing knowledge frameworks; it obscures previous traditions and reduces research-subjects to mere data. This in turn, means that the individual is both subjected to the neoliberal demand to share personal information, and at the same time has acquired the positive 'right' to become a member of the data-sharing community. The subject has to constantly negotiate the meaning of his or her data, which can either enable self-expression, or function as a commanding Superego.

  11. Precision validation of MIPAS-Envisat products

    C. Piccolo

    2007-01-01

    Full Text Available This paper discusses the variation and validation of the precision, or estimated random error, associated with the ESA Level 2 products from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS). This quantity represents the propagation of the radiometric noise from the spectra through the retrieval process into the Level 2 profile values. The noise itself varies with time, steadily rising between ice decontamination events, but the Level 2 precision has a greater variation due to the atmospheric temperature which controls the total radiance received. Hence, for all species, the precision varies latitudinally/seasonally with temperature, with a small superimposed temporal structure determined by the degree of ice contamination on the detectors. The precision validation involves comparing two MIPAS retrievals at the intersections of ascending/descending orbits. For 5 days per month of full resolution MIPAS operation, the standard deviation of the matching profile pairs is computed and compared with the precision given in the MIPAS Level 2 data, except for NO2 since it has a large diurnal variation between ascending/descending intersections. Even taking into account the propagation of the pressure-temperature retrieval errors into the VMR retrieval, the standard deviation of the matching pairs is usually a factor 1–2 larger than the precision. This is thought to be due to effects such as horizontal inhomogeneity of the atmosphere and instability of the retrieval.
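
A schematic version of the matched-pair check described above (synthetic numbers, not MIPAS data): if the reported precisions captured the full error budget, the scatter of ascending-minus-descending differences would match the quadrature sum of the two precisions.

```python
# Matched-pair precision validation in miniature: the observed scatter exceeds
# the value expected from the reported precisions when an unmodelled error
# term (e.g. horizontal inhomogeneity) is present. Synthetic profiles only.
import numpy as np

rng = np.random.default_rng(2)
n_pairs = 500
prec_a = prec_d = 0.05   # reported single-retrieval precision (arbitrary units)
extra = 0.07             # unmodelled term affecting one of the retrievals

truth = rng.normal(1.8, 0.1, n_pairs)   # "true" values at the orbit crossings
asc = truth + rng.normal(0, np.hypot(prec_a, extra), n_pairs)
desc = truth + rng.normal(0, prec_d, n_pairs)

sd_pairs = np.std(asc - desc, ddof=1)
expected = np.hypot(prec_a, prec_d)
print(f"observed scatter / expected from precisions = {sd_pairs / expected:.2f}")  # > 1
```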

  12. The economic case for precision medicine.

    Gavan, Sean P; Thompson, Alexander J; Payne, Katherine

    2018-01-01

    Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case of a precision medicine, can be prioritized by the extent that they reduce the uncertainty expressed by decision-makers.
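
As a toy illustration of the cost-effectiveness framing (all figures invented), stratifying treatment on a predictive marker can lower the incremental cost per quality-adjusted life year compared with treating everyone.

```python
# Incremental cost-effectiveness ratio (ICER) for two hypothetical strategies.
def icer(delta_cost, delta_qaly):
    """Extra cost per extra quality-adjusted life year gained."""
    return delta_cost / delta_qaly

print(icer(delta_cost=12_000, delta_qaly=0.25))  # treat-all:       48,000 per QALY
print(icer(delta_cost=6_500, delta_qaly=0.22))   # test-and-treat: ~29,545 per QALY
```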

  13. A comparator-hypothesis account of biased contingency detection.

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.
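
Since the abstract benchmarks the Comparator Hypothesis against the Rescorla-Wagner model, a minimal Rescorla-Wagner simulation of the outcome-density setting is sketched below (not the authors' implementation): cue and outcome are independent, yet a high outcome base rate inflates the cue's associative strength early in training.

```python
# Rescorla-Wagner simulation of biased contingency detection with a context cue.
import numpy as np

def rescorla_wagner(cues, outcomes, alpha=0.1, lam=1.0):
    """Trial-by-trial associative strengths for [cue, context]."""
    w = np.zeros(2)
    history = []
    for c, o in zip(cues, outcomes):
        x = np.array([c, 1.0])        # the experimental context is always present
        error = lam * o - w @ x       # prediction error on this trial
        w = w + alpha * error * x     # only present stimuli are updated
        history.append(w)
    return np.array(history)

rng = np.random.default_rng(3)
n_trials = 1000
cue = (rng.random(n_trials) < 0.5).astype(float)      # P(cue) = .5
outcome = (rng.random(n_trials) < 0.8).astype(float)  # P(outcome) = .8, independent of the cue

w = rescorla_wagner(cue, outcome)
# True contingency (Delta-P) is zero; the transient positive cue strength
# mirrors the outcome-density bias before it slowly decays toward zero.
print(f"cue strength after   50 trials: {w[49, 0]:+.2f}")
print(f"cue strength after 1000 trials: {w[-1, 0]:+.2f}")
```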

  14. Universality hypothesis breakdown at one-loop order

    Carvalho, P. R. S.

    2018-05-01

    We probe the universality hypothesis by analytically computing the at least two-loop corrections to the critical exponents for q-deformed O(N) self-interacting λϕ⁴ scalar field theories through six distinct and independent field-theoretic renormalization group methods and ε-expansion techniques. We show that the effect of q-deformation on the one-loop corrections to the q-deformed critical exponents is null, so the universality hypothesis breaks down at this loop order. Such an effect emerges only at the two-loop and higher levels, and the validity of the universality hypothesis is restored. The q-deformed critical exponents obtained through the six methods are the same and, furthermore, reduce to their nondeformed values in the appropriate limit.

  15. Unicorns do exist: a tutorial on "proving" the null hypothesis.

    Streiner, David L

    2003-12-01

    Introductory statistics classes teach us that we can never prove the null hypothesis; all we can do is reject or fail to reject it. However, there are times when it is necessary to try to prove the nonexistence of a difference between groups. This most often happens within the context of comparing a new treatment against an established one and showing that the new intervention is not inferior to the standard. This article first outlines the logic of "noninferiority" testing by differentiating between the null hypothesis (that which we are trying to nullify) and the "nil" hypothesis (there is no difference), reversing the role of the null and alternate hypotheses, and defining an interval within which groups are said to be equivalent. We then work through an example and show how to calculate sample sizes for noninferiority studies.
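
The confidence-interval logic of noninferiority can be sketched in a few lines (made-up summary data): pre-specify a margin, then declare noninferiority only if the interval for the new-minus-standard difference lies entirely above it.

```python
# Noninferiority via a confidence interval for a difference in response rates.
import numpy as np
from scipy import stats

margin = -0.10                     # largest acceptable deficit in response rate
n_new, n_std = 200, 200
resp_new, resp_std = 152, 154      # responders in each arm (invented)

p_new, p_std = resp_new / n_new, resp_std / n_std
diff = p_new - p_std
se = np.sqrt(p_new * (1 - p_new) / n_new + p_std * (1 - p_std) / n_std)
lower, upper = diff + stats.norm.ppf([0.025, 0.975]) * se

print(f"difference {diff:+.3f}, 95% CI ({lower:+.3f}, {upper:+.3f})")
print("non-inferior at the 10-point margin:", lower > margin)
```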

  16. Almost-Quantum Correlations Violate the No-Restriction Hypothesis.

    Sainz, Ana Belén; Guryanova, Yelena; Acín, Antonio; Navascués, Miguel

    2018-05-18

    To identify which principles characterize quantum correlations, it is essential to understand in which sense this set of correlations differs from that of almost-quantum correlations. We solve this problem by invoking the so-called no-restriction hypothesis, an explicit and natural axiom in many reconstructions of quantum theory stating that the set of possible measurements is the dual of the set of states. We prove that, contrary to quantum correlations, no generalized probabilistic theory satisfying the no-restriction hypothesis is able to reproduce the set of almost-quantum correlations. Therefore, any theory whose correlations are exactly, or very close to, the almost-quantum correlations necessarily requires a rule limiting the possible measurements. Our results suggest that the no-restriction hypothesis may play a fundamental role in singling out the set of quantum correlations among other nonsignaling ones.

  17. Motor synergies and the equilibrium-point hypothesis.

    Latash, Mark L

    2010-07-01

    The article offers a way to unite three recent developments in the field of motor control and coordination: (1) The notion of synergies is introduced based on the principle of motor abundance; (2) The uncontrolled manifold hypothesis is described as offering a computational framework to identify and quantify synergies; and (3) The equilibrium-point hypothesis is described for a single muscle, single joint, and multijoint systems. Merging these concepts into a single coherent scheme requires focusing on control variables rather than performance variables. The principle of minimal final action is formulated as the guiding principle within the referent configuration hypothesis. Motor actions are associated with setting two types of variables by a controller, those that ultimately define average performance patterns and those that define associated synergies. Predictions of the suggested scheme are reviewed, such as the phenomenon of anticipatory synergy adjustments, quick actions without changes in synergies, atypical synergies, and changes in synergies with practice. A few models are briefly reviewed.

  18. Differentiating Spatial Memory from Spatial Transformations

    Street, Whitney N.; Wang, Ranxiao Frances

    2014-01-01

    The perspective-taking task is one of the most common paradigms used to study the nature of spatial memory, and better performance for certain orientations is generally interpreted as evidence of spatial representations using these reference directions. However, performance advantages can also result from the relative ease in certain…

  19. Commissioning and proof of functionality of the OPERA precision tracker, especially of the time measuring system; Inbetriebnahme und Funktionsnachweis des OPERA Precision Trackers insbesondere des Zeitmesssystems

    Janutta, Benjamin

    2008-10-15

    The commissioning and the proof of functionality of the Precision Tracker of the OPERA experiment are the subject of this thesis, with the timing system of the precision tracker as the major concern. First, the time resolution of the timing electronics was characterized; additionally, general running parameters were studied. Afterwards the installation and commissioning were carried out. The precision tracker is supposed to determine the momentum of through-going muons with an accuracy of Δp/p < 0.25 as well as the sign of their charge. The commissioning is finished by now and it was shown that the data acquisition system runs very reliably, with only 1.5% showing a slightly higher number of hits. The nominal spatial track resolution of σ < 600 μm was also reached. (orig.)

  20. The frequentist implications of optional stopping on Bayesian hypothesis tests.

    Sanborn, Adam N; Hills, Thomas T

    2014-04-01

    Null hypothesis significance testing (NHST) is the most commonly used statistical methodology in psychology. The probability of achieving a value as extreme or more extreme than the statistic obtained from the data is evaluated, and if it is low enough, the null hypothesis is rejected. However, because common experimental practice often clashes with the assumptions underlying NHST, these calculated probabilities are often incorrect. Most commonly, experimenters use tests that assume that sample sizes are fixed in advance of data collection but then use the data to determine when to stop; in the limit, experimenters can use data monitoring to guarantee that the null hypothesis will be rejected. Bayesian hypothesis testing (BHT) provides a solution to these ills because the stopping rule used is irrelevant to the calculation of a Bayes factor. In addition, there are strong mathematical guarantees on the frequentist properties of BHT that are comforting for researchers concerned that stopping rules could influence the Bayes factors produced. Here, we show that these guaranteed bounds have limited scope and often do not apply in psychological research. Specifically, we quantitatively demonstrate the impact of optional stopping on the resulting Bayes factors in two common situations: (1) when the truth is a combination of the hypotheses, such as in a heterogeneous population, and (2) when a hypothesis is composite-taking multiple parameter values-such as the alternative hypothesis in a t-test. We found that, for these situations, while the Bayesian interpretation remains correct regardless of the stopping rule used, the choice of stopping rule can, in some situations, greatly increase the chance of experimenters finding evidence in the direction they desire. We suggest ways to control these frequentist implications of stopping rules on BHT.
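
A small simulation in the spirit of the situations discussed (illustrative only, not the paper's code): a Bayes factor, here crudely approximated from BIC, is monitored after every few observations under a true null, and data collection stops as soon as it crosses a criterion.

```python
# Optional stopping with a Bayes-factor criterion under a true null effect.
import numpy as np

def bf10_bic(x, y):
    """BIC-approximated Bayes factor for a difference between two group means."""
    n = len(x) + len(y)
    pooled = np.concatenate([x, y])
    rss0 = np.sum((pooled - pooled.mean()) ** 2)                      # null: one mean
    rss1 = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)  # alternative: two means
    bic0 = n * np.log(rss0 / n) + 1 * np.log(n)
    bic1 = n * np.log(rss1 / n) + 2 * np.log(n)
    return np.exp((bic0 - bic1) / 2)

rng = np.random.default_rng(4)
stopped_early = 0
for _ in range(1000):
    x, y = rng.normal(0, 1, 100), rng.normal(0, 1, 100)  # no true group difference
    for n_obs in range(10, 101, 5):                      # peek every 5 observations per group
        if bf10_bic(x[:n_obs], y[:n_obs]) > 3:
            stopped_early += 1
            break

print(f"runs that stop with BF10 > 3 under the null: {stopped_early / 1000:.1%}")
```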

  1. Toward accurate and precise estimates of lion density.

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2017-08-01

    Reliable estimates of animal density are fundamental to understanding ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation because wildlife authorities rely on estimates to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores, such as lions (Panthera leo). Although abundance indices for lions may produce poor inferences, they continue to be used to estimate density and inform management and policy. We used sighting data from a 3-month survey and adapted a Bayesian spatially explicit capture-recapture (SECR) model to estimate spatial lion density in the Maasai Mara National Reserve and surrounding conservancies in Kenya. Our unstructured spatial capture-recapture sampling design incorporated search effort to explicitly estimate detection probability and density on a fine spatial scale, making our approach robust in the context of varying detection probabilities. Overall posterior mean lion density was estimated to be 17.08 (posterior SD 1.310) lions >1 year old/100 km², and the sex ratio was estimated at 2.2 females to 1 male. Our modeling framework and narrow posterior SD demonstrate that SECR methods can produce statistically rigorous and precise estimates of population parameters, and we argue that they should be favored over less reliable abundance indices. Furthermore, our approach is flexible enough to incorporate different data types, which enables robust population estimates over relatively short survey periods in a variety of systems. Trend analyses are essential to guide conservation decisions but are frequently based on surveys of differing reliability. We therefore call for a unified framework to assess lion numbers in key populations to improve management and
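
The core ingredient of a spatially explicit capture-recapture model is a detection function linking detection probability to the distance between an animal's activity centre and a detector; a half-normal form is the common default. The sketch below is a generic illustration, not the authors' Bayesian model.

```python
# Half-normal SECR detection function: p(d) = g0 * exp(-d^2 / (2 * sigma^2)).
import numpy as np

def p_detect(distance_km, g0=0.3, sigma_km=2.0):
    """Detection probability at a given distance from an activity centre."""
    return g0 * np.exp(-distance_km**2 / (2 * sigma_km**2))

for d in (0.0, 1.0, 3.0, 6.0):
    print(f"detection probability at {d:3.1f} km: {p_detect(d):.3f}")
```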

  2. Cosmological signatures of anisotropic spatial curvature

    Pereira, Thiago S.; Marugán, Guillermo A. Mena; Carneiro, Saulo

    2015-01-01

    If one is willing to give up the cherished hypothesis of spatial isotropy, many interesting cosmological models can be developed beyond the simple anisotropically expanding scenarios. One interesting possibility is presented by shear-free models in which the anisotropy emerges at the level of the curvature of the homogeneous spatial sections, whereas the expansion is dictated by a single scale factor. We show that such models represent viable alternatives to describe the large-scale structure of the inflationary universe, leading to a kinematically equivalent Sachs-Wolfe effect. Through the definition of a complete set of spatial eigenfunctions we compute the two-point correlation function of scalar perturbations in these models. In addition, we show how such scenarios would modify the spectrum of the CMB assuming that the observations take place in a small patch of a universe with anisotropic curvature

  3. Cosmological signatures of anisotropic spatial curvature

    Pereira, Thiago S. [Departamento de Física, Universidade Estadual de Londrina, 86057-970, Londrina – PR (Brazil); Marugán, Guillermo A. Mena [Instituto de Estructura de la Materia, IEM-CSIC, Serrano 121, 28006, Madrid (Spain); Carneiro, Saulo, E-mail: tspereira@uel.br, E-mail: mena@iem.cfmac.csic.es, E-mail: saulo.carneiro@pq.cnpq.br [Instituto de Física, Universidade Federal da Bahia, 40210-340, Salvador – BA (Brazil)

    2015-07-01

    If one is willing to give up the cherished hypothesis of spatial isotropy, many interesting cosmological models can be developed beyond the simple anisotropically expanding scenarios. One interesting possibility is presented by shear-free models in which the anisotropy emerges at the level of the curvature of the homogeneous spatial sections, whereas the expansion is dictated by a single scale factor. We show that such models represent viable alternatives to describe the large-scale structure of the inflationary universe, leading to a kinematically equivalent Sachs-Wolfe effect. Through the definition of a complete set of spatial eigenfunctions we compute the two-point correlation function of scalar perturbations in these models. In addition, we show how such scenarios would modify the spectrum of the CMB assuming that the observations take place in a small patch of a universe with anisotropic curvature.

  4. Thermalization without eigenstate thermalization hypothesis after a quantum quench.

    Mori, Takashi; Shiraishi, Naoto

    2017-08-01

    Nonequilibrium dynamics of a nonintegrable system without the eigenstate thermalization hypothesis is studied. It is shown that, in the thermodynamic limit, this model thermalizes after an arbitrary quantum quench at finite temperature, although it does not satisfy the eigenstate thermalization hypothesis. In contrast, when the system size is finite and the temperature is low enough, the system may not thermalize. In this case, the steady state is well described by the generalized Gibbs ensemble constructed by using highly nonlocal conserved quantities. We also show that this model exhibits prethermalization, in which the prethermalized state is characterized by nonthermal energy eigenstates.

  5. Tunguska, 1908: the gas pouch and soil fluidization hypothesis

    Nistor, I.

    2012-01-01

    The Siberian taiga explosion of 30 June 1908 remains one of the great mysteries of the 20th century: millions of trees felled over an area of 2200 km² without trace of a crater or meteorite fragments. A hundred years of failed searches have followed, resulting in as many flawed hypotheses, none of which could offer satisfactory explanations: meteorite, comet, UFO, etc. In the author's opinion, the cause is that the energy the explorers looked for was simply not there! The author's hypothesis is that a meteoroid encountered a gas pouch in the atmosphere, producing a devastating explosion, its effects being amplified by soil fluidization.

  6. The equilibrium-point hypothesis--past, present and future.

    Feldman, Anatol G; Levin, Mindy F

    2009-01-01

    This chapter is a brief account of the fundamentals of the equilibrium-point hypothesis, more adequately called the threshold control theory (TCT). It also compares the TCT with other approaches to motor control. The basic notions of the TCT are reviewed with a major focus on solutions to the problems of multi-muscle and multi-degrees of freedom redundancy. The TCT incorporates cognitive aspects by explaining how neurons recognize that internal (neural) and external (environmental) events match each other. These aspects, as well as how motor learning occurs, are subjects of further development of the TCT hypothesis.

  7. The cosmic censorship hypothesis and the positive energy conjecture

    Jang, P.S.; Wald, R.W.

    1979-01-01

    The position so far is summarized. Penrose derived an inequality; if a data set were found to violate it, then the assumptions used to derive the inequality must be false. In this case it could provide a counterexample to the cosmic censorship hypothesis. The authors have shown elsewhere that a positive energy argument of Geroch can be modified to rule out a violation of Penrose's inequality with any time-symmetric initial data set whose apparent horizon consists of a single component. This increases confidence in the hypothesis and also indicates that there may be a close relationship between this conjecture and the positive energy conjecture. (UK)

  8. Eat dirt and avoid atopy: the hygiene hypothesis revisited.

    Patki, Anil

    2007-01-01

    The explosive rise in the incidence of atopic diseases in the Western developed countries can be explained on the basis of the so-called "hygiene hypothesis". In short, it attributes the rising incidence of atopic dermatitis to reduced exposure to various childhood infections and bacterial endotoxins. Reduced exposure to dirt in the clean environment results in a skewed development of the immune system which results in an abnormal allergic response to various environmental allergens which are otherwise innocuous. This article reviews the historical aspects, epidemiological and immunological basis of the hygiene hypothesis and implications for Indian conditions.

  9. The environmental convergence hypothesis: Carbon dioxide emissions according to the source of energy

    Herrerias, M.J.

    2013-01-01

    The aim of this paper is to investigate the environmental convergence hypothesis in carbon dioxide emissions for a large group of developed and developing countries from 1980 to 2009. The novel aspect of this work is that we distinguish among carbon dioxide emissions according to the source of energy (coal, natural gas and petroleum) instead of considering the aggregate measure of per capita carbon dioxide emissions, where notable interest is given to the regional dimension due to the application of new club convergence tests. This allows us to determine the convergence behaviour of emissions in a more precise way and to detect it according to the source of energy used, thereby helping to address the environmental targets. More specifically, the convergence hypothesis is examined with a pair-wise test and another one is used to test for the existence of club convergence. Our results from using the pair-wise test indicate that carbon dioxide emissions for each type of energy diverge. However, club convergence is found for a large group of countries, although some still display divergence. These findings point to the need to apply specific environmental policies to each club detected, since specific countries converge to different clubs. - Highlights: • The environmental convergence hypothesis is investigated across countries. • We perform a pair-wise test and a club convergence test. • Results from the first of these two tests suggest that carbon dioxide emissions are diverging. • However, we find that carbon dioxide emissions are converging within groups of countries. • Active environmental policies are required

  10. RDF SKETCH MAPS - KNOWLEDGE COMPLEXITY REDUCTION FOR PRECISION MEDICINE ANALYTICS.

    Thanintorn, Nattapon; Wang, Juexin; Ersoy, Ilker; Al-Taie, Zainab; Jiang, Yuexu; Wang, Duolin; Verma, Megha; Joshi, Trupti; Hammer, Richard; Xu, Dong; Shin, Dmitriy

    2016-01-01

    Sketch Map of the top 30% paths retained important information about signaling cascades leading to activation of proto-oncogene BRAF, which is usually associated with a different cancer, melanoma. Recent reports of successful treatments of HCL patients by the BRAF-targeted drug vemurafenib support the validity of the RDF Sketch Maps findings. We therefore believe that RDF Sketch Maps will be invaluable for hypothesis generation for precision diagnostics and therapeutics as well as drug repurposing studies.

  11. Spatial Data Management

    Mamoulis, Nikos

    2011-01-01

    Spatial database management deals with the storage, indexing, and querying of data with spatial features, such as location and geometric extent. Many applications require the efficient management of spatial data, including Geographic Information Systems, Computer Aided Design, and Location Based Services. The goal of this book is to provide the reader with an overview of spatial data management technology, with an emphasis on indexing and search techniques. It first introduces spatial data models and queries and discusses the main issues of extending a database system to support spatial data.
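
As a small, concrete example of the indexing-and-search theme (assuming the third-party `rtree` package, Python bindings to libspatialindex), an R-tree can be built over object bounding boxes and queried with a search window.

```python
# Window query against an R-tree spatial index.
from rtree import index

idx = index.Index()
objects = {
    0: (103.85, 1.28, 103.86, 1.29),  # bounding boxes: (minx, miny, maxx, maxy)
    1: (103.90, 1.30, 103.91, 1.31),
    2: (103.70, 1.34, 103.71, 1.35),
}
for obj_id, bbox in objects.items():
    idx.insert(obj_id, bbox)

window = (103.80, 1.25, 103.95, 1.32)   # search rectangle
print(sorted(idx.intersection(window))) # -> [0, 1]
```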

  12. Semantic Features, Perceptual Expectations, and Frequency as Factors in the Learning of Polar Spatial Adjective Concepts.

    Dunckley, Candida J. Lutes; Radtke, Robert C.

    Two semantic theories of word learning, a perceptual complexity hypothesis (H. Clark, 1970) and a quantitative complexity hypothesis (E. Clark, 1972) were tested by teaching 24 preschoolers and 16 college students CVC labels for five polar spatial adjective concepts having single word representations in English, and for three having no direct…

  13. Conducting Precision Medicine Research with African Americans.

    Halbert, Chanita Hughes; McDonald, Jasmine; Vadaparampil, Susan; Rice, LaShanta; Jefferson, Melanie

    2016-01-01

    Precision medicine is an approach to detecting, treating, and managing disease that is based on individual variation in genetic, environmental, and lifestyle factors. Precision medicine is expected to reduce health disparities, but this will be possible only if studies have adequate representation of racial minorities. It is critical to anticipate the rates at which individuals from diverse populations are likely to participate in precision medicine studies as research initiatives are being developed. We evaluated the likelihood of participating in a clinical study for precision medicine. Observational study conducted between October 2010 and February 2011 in a national sample of African Americans. Intentions to participate in a government sponsored study that involves providing a biospecimen and generates data that could be shared with other researchers to conduct future studies. One third of respondents would participate in a clinical study for precision medicine. Only gender had a significant independent association with participation intentions. Men had a 1.86 (95% CI = 1.11, 3.12, p = 0.02) increased likelihood of participating in a precision medicine study compared to women in the model that included overall barriers and facilitators. In the model with specific participation barriers, distrust was associated with a reduced likelihood of participating in the research described in the vignette (OR = 0.57, 95% CI = 0.34, 0.96, p = 0.04). African Americans may have low enrollment in PMI research. As PMI research is implemented, extensive efforts will be needed to ensure adequate representation. Additional research is needed to identify optimal ways of ethically describing precision medicine studies to ensure sufficient recruitment of racial minorities.

  14. Spatially Controlled Relay Beamforming

    Kalogerias, Dionysios

    This thesis is about the fusion of optimal stochastic motion control and physical layer communications. Distributed, networked communication systems, such as relay beamforming networks (e.g., Amplify & Forward (AF)), are typically designed without explicitly considering how the positions of the respective nodes might affect the quality of the communication. Optimum placement of network nodes, which could potentially improve the quality of the communication, is not typically considered. However, in most practical settings in physical layer communications, such as relay beamforming, the Channel State Information (CSI) observed by each node per channel use, although it might be (modeled as) random, is both spatially and temporally correlated. It is, therefore, reasonable to ask if and how the performance of the system could be improved by (predictively) controlling the positions of the network nodes (e.g., the relays), based on causal side (CSI) information, and exploiting the spatiotemporal dependencies of the wireless medium. In this work, we address this problem in the context of AF relay beamforming networks. This novel, cyber-physical system approach to relay beamforming is termed "Spatially Controlled Relay Beamforming". First, we discuss wireless channel modeling in a rigorous Bayesian framework. Experimentally accurate and, at the same time, technically precise channel modeling is absolutely essential for designing and analyzing spatially controlled communication systems. In this work, we are interested in two distinct spatiotemporal statistical models for describing the behavior of the log-scale magnitude of the wireless channel: 1. Stationary Gaussian Fields: In this case, the channel is assumed to evolve as a stationary, Gaussian stochastic field in continuous space and discrete time (say, for instance, time slots). Under such assumptions, spatial and temporal statistical interactions are determined by a set of time and space invariant

  15. New Hypothesis and Theory about Functions of Sleep and Dreams

    Nikola N. Ilanković

    2014-03-01

    Conclusion: IEP-P1 could be a new biological marker for distinguishing sleep organization in different psychotic states and other states of altered consciousness. The statistical models developed here could be the basis for new hypotheses and theories about the functions of sleep and dreams.

  16. Revisiting Hudson’s (1992) OO = O2 hypothesis

    Shibuya, Yoshikata; Jensen, Kim Ebensgaard

    2018-01-01

    In an important paper on the English “double-object”, or ditransitive, construction, Richard Hudson proposes a hypothesis that conflates the ditransitive direct object, or O2, and the monotransitive direct object, or OO, into the same syntactic functional category. While making important departures...

  17. Hypothesis, Prediction, and Conclusion: Using Nature of Science Terminology Correctly

    Eastwell, Peter

    2012-01-01

    This paper defines the terms "hypothesis," "prediction," and "conclusion" and shows how to use the terms correctly in scientific investigations in both the school and science education research contexts. The scientific method, or hypothetico-deductive (HD) approach, is described and it is argued that an understanding of the scientific method,…

  18. A sequential hypothesis test based on a generalized Azuma inequality

    Reijsbergen, D.P.; Scheinhardt, Willem R.W.; de Boer, Pieter-Tjerk

    We present a new power-one sequential hypothesis test based on a bound for the probability that a bounded zero-mean martingale ever crosses a curve of the form $a(n+k)^b$. The proof of the bound is of independent interest.

  19. Planned Hypothesis Tests Are Not Necessarily Exempt from Multiplicity Adjustment

    Frane, Andrew V.

    2015-01-01

    Scientific research often involves testing more than one hypothesis at a time, which can inflate the probability that a Type I error (false discovery) will occur. To prevent this Type I error inflation, adjustments can be made to the testing procedure that compensate for the number of tests. Yet many researchers believe that such adjustments are…
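
A short example of one such adjustment (Holm's step-down procedure, via statsmodels; the p-values are invented) shows how a family of planned tests can be corrected so the family-wise error rate stays at the nominal level.

```python
# Family-wise error control for a set of planned hypothesis tests.
from statsmodels.stats.multitest import multipletests

p_values = [0.012, 0.049, 0.031, 0.180]  # invented raw p-values
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="holm")

for p, p_adj, r in zip(p_values, p_adjusted, reject):
    print(f"raw p = {p:.3f}  Holm-adjusted p = {p_adj:.3f}  reject H0: {r}")
```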

  20. A default Bayesian hypothesis test for correlations and partial correlations

    Wetzels, R.; Wagenmakers, E.J.

    2012-01-01

    We propose a default Bayesian hypothesis test for the presence of a correlation or a partial correlation. The test is a direct application of Bayesian techniques for variable selection in regression models. The test is easy to apply and yields practical advantages that the standard frequentist tests