WorldWideScience

Sample records for spatial precision hypothesis

  1. Synchronization and phonological skills: precise auditory timing hypothesis (PATH)

    Directory of Open Access Journals (Sweden)

    Adam eTierney

    2014-11-01

    Full Text Available Phonological skills are enhanced by music training, but the mechanisms enabling this cross-domain enhancement remain unknown. To explain this cross-domain transfer, we propose a precise auditory timing hypothesis (PATH) whereby entrainment practice is the core mechanism underlying enhanced phonological abilities in musicians. Both rhythmic synchronization and language skills such as consonant discrimination, detection of word and phrase boundaries, and conversational turn-taking rely on the perception of extremely fine-grained timing details in sound. Auditory-motor timing is an acoustic feature which meets all five of the pre-conditions necessary for cross-domain enhancement to occur (Patel 2011, 2012, 2014). There is overlap between the neural networks that process timing in the context of both music and language. Entrainment to music demands more precise timing sensitivity than does language processing. Moreover, auditory-motor timing integration captures the emotion of the trainee, is repeatedly practiced, and demands focused attention. The precise auditory timing hypothesis predicts that musical training emphasizing entrainment will be particularly effective in enhancing phonological skills.

  2. Variability in the Precision of Children’s Spatial Working Memory

    Directory of Open Access Journals (Sweden)

    Elena M. Galeano Weber

    2018-02-01

    Full Text Available Cognitive modeling studies in adults have established that visual working memory (WM) capacity depends on representational precision, as well as on its variability from moment to moment. By contrast, visuospatial WM performance in children has typically been indexed by response accuracy, a binary measure that provides little information about the precision with which items are stored. Here, we aimed to identify whether and how children's WM performance depends on spatial precision and its variability over time in real-world contexts. Using smartphones, 110 Grade 3 and Grade 4 students performed a spatial WM updating task three times a day, in school and at home, for four weeks. Measures of spatial precision (i.e., the Euclidean distance between presented and reported locations) were used in hierarchical modeling to estimate the variability of spatial precision across different time scales. Results demonstrated considerable within-person variability in spatial precision across items within trials, from trial to trial, from occasion to occasion within days, and from day to day. In particular, item-to-item variability increased systematically with memory load and decreased with higher grade. Further, children with higher precision variability across items scored lower on measures of fluid intelligence. These findings emphasize the important role of transient changes in spatial precision for the development of WM.
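    The precision measure used above can be sketched numerically. A minimal illustration (all coordinate values invented, not the study's data), assuming locations are recorded as (x, y) positions:

```python
import numpy as np

# Hypothetical data: presented vs. reported (x, y) coordinates for one
# child across three items (values invented purely for illustration).
presented = np.array([[100.0, 200.0], [150.0, 80.0], [60.0, 140.0]])
reported = np.array([[104.0, 197.0], [158.0, 86.0], [60.0, 150.0]])

# Spatial precision per item: Euclidean distance between presented and
# reported locations (smaller distance = more precise memory).
errors = np.linalg.norm(presented - reported, axis=1)

# Within-person variability in precision: the spread of these error
# distances from item to item (the quantity the study models
# hierarchically across items, trials, occasions, and days).
mean_error = errors.mean()
variability = errors.std(ddof=1)
print(mean_error, variability)
```

    In the actual study this item-level variability would feed into a hierarchical model with levels for trials, occasions, and days.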

  3. High spatial precision nano-imaging of polarization-sensitive plasmonic particles

    Science.gov (United States)

    Liu, Yunbo; Wang, Yipei; Lee, Somin Eunice

    2018-02-01

    Precise polarimetric imaging of polarization-sensitive nanoparticles is essential for resolving their accurate spatial positions beyond the diffraction limit. However, conventional technologies currently suffer from beam deviation errors which cannot be corrected beyond the diffraction limit. To overcome this issue, we experimentally demonstrate a spatially stable nano-imaging system for polarization-sensitive nanoparticles. In this study, we show that by integrating a voltage-tunable imaging variable polarizer with optical microscopy, we are able to suppress beam deviation errors. We expect that this nano-imaging system should allow for acquisition of accurate positional and polarization information from individual nanoparticles in applications where real-time, high precision spatial information is required.

  4. Precise Mapping Of A Spatially Distributed Radioactive Source

    International Nuclear Information System (INIS)

    Beck, A.; Caras, I.; Piestum, S.; Sheli, E.; Melamud, Y.; Berant, S.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    Spatial distribution measurement of radioactive sources is a routine task in the nuclear industry. The precision of each measurement depends upon the specific application; however, the technological edge of this precision is driven by the production of standards for calibration. Within this definition, the most demanding field is the calibration of standards for medical equipment. In this paper, a semi-empirical method for controlling the measurement precision is demonstrated, using a relatively simple laboratory apparatus. The spatial distribution of the source radioactivity is measured as part of the quality assurance tests during the production of flood sources. These sources are further used in the calibration of medical gamma cameras. A typical flood source is a 40 × 60 cm² plate with an activity of 10 mCi (or more) of the ⁵⁷Co isotope. The measurement set-up is based on a single NaI(Tl) scintillator with a photomultiplier tube, moving on an X-Y table which scans the flood source. In this application the source is required to have a uniform activity distribution over its surface.
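    The uniformity check behind such a QA scan can be sketched as follows. This is an illustrative toy (simulated Poisson counts on a hypothetical 6 × 9 scan grid, not the authors' apparatus), using a NEMA-style integral uniformity figure of merit:

```python
import numpy as np

# Simulate a scan of a flood source: at each X-Y dwell position the
# detector records Poisson-distributed counts around a common rate.
rng = np.random.default_rng(0)
true_rate = 1000.0                             # counts per dwell position
counts = rng.poisson(true_rate, size=(6, 9))   # hypothetical 6 x 9 scan grid

# Integral uniformity in the NEMA style: 100 * (max - min) / (max + min).
# A perfectly uniform source would give 0; statistical noise alone
# produces a small nonzero value.
integral_uniformity = 100.0 * (counts.max() - counts.min()) / (counts.max() + counts.min())
print(integral_uniformity)
```

    A real QA procedure would also correct for detector efficiency and dwell-time variations before computing such a metric.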

  5. Biodiversity, productivity, and the spatial insurance hypothesis revisited

    Science.gov (United States)

    Shanafelt, David W.; Dieckmann, Ulf; Jonas, Matthias; Franklin, Oskar; Loreau, Michel; Perrings, Charles

    2015-01-01

    Accelerating rates of biodiversity loss have led ecologists to explore the effects of species richness on ecosystem functioning and the flow of ecosystem services. One explanation of the relationship between biodiversity and ecosystem functioning lies in the spatial insurance hypothesis, which centers on the idea that productivity and stability increase with biodiversity in a temporally varying, spatially heterogeneous environment. However, there has been little work on the impact of dispersal where environmental risks are more or less spatially correlated, or where dispersal rates are variable. In this paper, we extend the original Loreau model to consider stochastic temporal variation in resource availability, which we refer to as “environmental risk,” and heterogeneity in species dispersal rates. We find that asynchronies across communities and species provide community-level stabilizing effects on productivity, despite varying levels of species richness. Although intermediate dispersal rates play a role in mitigating risk, they are less effective in insuring productivity against global (metacommunity-level) than local (individual community-level) risks. These results are particularly interesting given the emergence of global sources of risk such as climate change or the closer integration of world markets. Our results offer deeper insights into the Loreau model and new perspectives on the effectiveness of spatial insurance in the face of environmental risks. PMID:26100182
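    The stabilizing effect of asynchrony described above can be illustrated with a toy calculation (two hypothetical patches, not the paper's metacommunity model): when patch-level fluctuations are out of phase, the coefficient of variation of aggregate productivity shrinks.

```python
import numpy as np

# Two patches with the same mean productivity and the same fluctuation
# magnitude; the only difference is whether fluctuations are in phase
# (synchronous) or out of phase (asynchronous).
rng = np.random.default_rng(1)
base = rng.normal(0.0, 1.0, size=200)          # shared fluctuation series

sync_patches = np.array([10 + base, 10 + base])    # in phase
async_patches = np.array([10 + base, 10 - base])   # out of phase

def cv(x):
    """Coefficient of variation: instability of aggregate productivity."""
    return x.std() / x.mean()

cv_sync = cv(sync_patches.sum(axis=0))
cv_async = cv(async_patches.sum(axis=0))
print(cv_sync, cv_async)   # asynchrony drives the aggregate CV toward zero
```

    This is the portfolio-effect intuition behind spatial insurance: dispersal matters precisely because it lets asynchronous patches compensate for one another.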

  6. Convergence Hypothesis: Evidence from Panel Unit Root Test with Spatial Dependence

    Directory of Open Access Journals (Sweden)

    Lezheng Liu

    2006-10-01

    Full Text Available In this paper we test the convergence hypothesis by using a revised 4-step procedure for panel unit root testing suggested by Evans and Karras (1996). We use data on output for 24 OECD countries over a 40-year period. Whether the convergence, if any, is conditional or absolute is also examined. Following a proposition by Baltagi, Bresson, and Pirotte (2005), we incorporate a spatial autoregressive error into a fixed-effect panel model to account not only for the heterogeneous panel structure, but also for spatial dependence, which might reduce the statistical power of conventional panel unit root tests. Our empirical results indicate that output is converging among OECD countries. However, convergence is characterized as conditional. The results also report a relatively lower speed of convergence compared to conventional panel studies.
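    The convergence logic can be sketched in a deliberately simplified form (simulated data; this is not the paper's 4-step Evans-Karras procedure or its spatial correction): output converges if deviations of each economy from the cross-section mean are mean-reverting, i.e. the slope of their change on their lagged level is negative.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 40, 24                      # 40 years, 24 economies (as in the study)

# Simulate mean-reverting log-output deviations: AR(1) with coefficient
# 0.6, so the true slope of d(dev) on lagged dev is 0.6 - 1 = -0.4.
dev = np.zeros((T, N))
for t in range(1, T):
    dev[t] = 0.6 * dev[t - 1] + rng.normal(0.0, 0.1, N)
dev = dev - dev.mean(axis=1, keepdims=True)   # deviations from cross-section mean

# Pooled no-intercept OLS of the change on the lagged level.
y = np.diff(dev, axis=0).ravel()
x = dev[:-1].ravel()
rho = (x @ y) / (x @ x)
print(rho)                         # negative slope -> evidence of convergence
```

    The paper's actual test additionally handles heterogeneous dynamics across economies and spatially correlated errors, which this sketch omits.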

  7. Multiple Vehicle Cooperative Localization with Spatial Registration Based on a Probability Hypothesis Density Filter

    Directory of Open Access Journals (Sweden)

    Feihu Zhang

    2014-01-01

    Full Text Available This paper studies the problem of multiple vehicle cooperative localization with spatial registration in the formulation of the probability hypothesis density (PHD) filter. Assuming vehicles are equipped with proprioceptive and exteroceptive sensors (with biases) to cooperatively localize positions, a simultaneous solution for joint spatial registration and state estimation is proposed. For this, we rely on the sequential Monte Carlo implementation of PHD filtering. Compared to other methods, the concept of multiple vehicle cooperative localization with spatial registration is here first formulated under random finite set theory. In addition, the proposed solution also addresses the challenges of multiple vehicle cooperative localization, e.g., the communication bandwidth issue and data association uncertainty. Simulation results demonstrate its reliability and feasibility in large-scale environments.
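    The sequential Monte Carlo PHD update at the heart of such a filter can be sketched in one dimension (a toy with invented parameters, not the paper's cooperative-localization solution). The PHD is carried by weighted particles, and the sum of the weights estimates the expected number of targets:

```python
import numpy as np

rng = np.random.default_rng(3)
p_d, clutter = 0.9, 0.01            # detection probability, clutter density
sigma = 1.0                         # measurement noise std (assumed)

# Particle representation of the predicted PHD: 500 particles whose
# weights sum to the prior expected number of vehicles (here 2).
particles = rng.normal(5.0, 2.0, size=500)
weights = np.full(500, 2.0 / 500)

def gauss(z, x):
    """Measurement likelihood g(z | x), Gaussian with std sigma."""
    return np.exp(-0.5 * ((z - x) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Standard PHD measurement update:
#   w_i' = (1 - p_d) w_i + sum_z  p_d g(z|x_i) w_i / (clutter + sum_j p_d g(z|x_j) w_j)
measurements = [4.8, 5.3]
updated = (1.0 - p_d) * weights      # missed-detection term
for z in measurements:
    g = p_d * gauss(z, particles) * weights
    updated = updated + g / (clutter + g.sum())

weights = updated
print(weights.sum())                 # estimated expected number of vehicles
```

    The full filter would follow this with resampling and, in the paper's setting, would augment the state with sensor-bias terms for spatial registration.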

  8. The direct perception hypothesis: perceiving the intention of another’s action hinders its precise imitation

    Science.gov (United States)

    Froese, Tom; Leavens, David A.

    2014-01-01

    We argue that imitation is a learning response to unintelligible actions, especially to social conventions. Various strands of evidence are converging on this conclusion, but further progress has been hampered by an outdated theory of perceptual experience. Comparative psychology continues to be premised on the doctrine that humans and non-human primates only perceive others’ physical “surface behavior,” while mental states are perceptually inaccessible. However, a growing consensus in social cognition research accepts the direct perception hypothesis: primarily we see what others aim to do; we do not infer it from their motions. Indeed, physical details are overlooked – unless the action is unintelligible. On this basis we hypothesize that apes’ propensity to copy the goal of an action, rather than its precise means, is largely dependent on its perceived intelligibility. Conversely, children copy means more often than adults and apes because, uniquely, much adult human behavior is completely unintelligible to unenculturated observers due to the pervasiveness of arbitrary social conventions, as exemplified by customs, rituals, and languages. We expect the propensity to imitate to be inversely correlated with the familiarity of cultural practices, as indexed by age and/or socio-cultural competence. The direct perception hypothesis thereby helps to parsimoniously explain the most important findings of imitation research, including children’s over-imitation and other species-typical and age-related variations. PMID:24600413

  9. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    Science.gov (United States)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to an adequate assessment of its mechanical effects on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of differing precision and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. A spatial prior multivariate probability density function (PDF) and a likelihood PDF were built from the CPT positions, the borehole experiments, and the potential value at the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve at the prediction point was calculated within the Bayesian inverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. Differences between single CPT samplings under a normal distribution and the simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization of a stratum and identify limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision
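    The Bayesian interpolation step can be sketched in stripped-down form (a toy with assumed Gaussian prior and likelihood on a discretized grid, not the authors' Bayesian maximum entropy implementation): the posterior density for Es at a prediction point is proportional to a spatial prior (from boreholes) times a CPT-derived likelihood.

```python
import numpy as np

def normal_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

# Discretized grid of candidate Es values (MPa); the range and the two
# densities below are assumptions made purely for illustration.
es_grid = np.linspace(2.0, 12.0, 501)
dx = es_grid[1] - es_grid[0]

prior = normal_pdf(es_grid, 7.0, 2.0)        # borehole-based spatial prior (assumed)
likelihood = normal_pdf(es_grid, 6.0, 1.0)   # CPT-derived likelihood (assumed)

# Posterior = prior * likelihood up to a constant; normalize numerically.
posterior = prior * likelihood
posterior /= posterior.sum() * dx

es_map = es_grid[np.argmax(posterior)]       # posterior mode as point estimate
print(es_map)
```

    The actual method replaces both Gaussians with maximum-entropy densities built from the multi-precision data, but the grid-based posterior computation follows the same pattern.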

  10. Generating a taxonomy of spatially cued attention for visual discrimination: Effects of judgment precision and set size on attention

    Science.gov (United States)

    Hetley, Richard; Dosher, Barbara Anne; Lu, Zhong-Lin

    2014-01-01

    Attention precues improve the performance of perceptual tasks in many but not all circumstances. These spatial attention effects may depend upon display set size or workload, and have been variously attributed to external noise filtering, stimulus enhancement, contrast gain, or response gain, or to uncertainty or other decision effects. In this study, we document systematically different effects of spatial attention in low- and high-precision judgments, with and without external noise, and in different set sizes in order to contribute to the development of a taxonomy of spatial attention. An elaborated perceptual template model (ePTM) provides an integrated account of a complex set of effects of spatial attention with just two attention factors: a set-size dependent exclusion or filtering of external noise and a narrowing of the perceptual template to focus on the signal stimulus. These results are related to the previous literature by classifying the judgment precision and presence of external noise masks in those experiments, suggesting a taxonomy of spatially cued attention in discrimination accuracy. PMID:24939234

  11. Generating a taxonomy of spatially cued attention for visual discrimination: effects of judgment precision and set size on attention.

    Science.gov (United States)

    Hetley, Richard; Dosher, Barbara Anne; Lu, Zhong-Lin

    2014-11-01

    Attention precues improve the performance of perceptual tasks in many but not all circumstances. These spatial attention effects may depend upon display set size or workload, and have been variously attributed to external noise filtering, stimulus enhancement, contrast gain, or response gain, or to uncertainty or other decision effects. In this study, we document systematically different effects of spatial attention in low- and high-precision judgments, with and without external noise, and in different set sizes in order to contribute to the development of a taxonomy of spatial attention. An elaborated perceptual template model (ePTM) provides an integrated account of a complex set of effects of spatial attention with just two attention factors: a set-size dependent exclusion or filtering of external noise and a narrowing of the perceptual template to focus on the signal stimulus. These results are related to the previous literature by classifying the judgment precision and presence of external noise masks in those experiments, suggesting a taxonomy of spatially cued attention in discrimination accuracy.

  12. Human short-term spatial memory: precision predicts capacity.

    Science.gov (United States)

    Banta Lavenex, Pamela; Boujon, Valérie; Ndarugendamwo, Angélique; Lavenex, Pierre

    2015-03-01

    Here, we aimed to determine the capacity of human short-term memory for allocentric spatial information in a real-world setting. Young adults were tested on their ability to learn, on a trial-unique basis, and remember over a 1-min interval the location(s) of 1, 3, 5, or 7 illuminating pads, among 23 pads distributed in a 4 m × 4 m arena surrounded by curtains on three sides. Participants had to walk to and touch the pads with their foot to illuminate the goal locations. In contrast to the predictions from classical slot models of working memory capacity limited to a fixed number of items, i.e., Miller's magical number 7 or Cowan's magical number 4, we found that the number of locations visited to find the goals was consistently about 1.6 times the number of goals, whereas the number of correct choices before erring and the number of errorless trials varied with memory load, even when memory load was below the hypothetical memory capacity. In contrast to resource models of visual working memory, we found no evidence that memory resources were evenly distributed among unlimited numbers of items to be remembered. Instead, we found that memory for even one individual location was imprecise, and that memory performance for one location could be used to predict memory performance for multiple locations. Our findings are consistent with a theoretical model suggesting that the precision of the memory for individual locations might determine the capacity of human short-term memory for spatial information. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Testing the Environmental Kuznets Curve Hypothesis for Biodiversity Risk in the US: A Spatial Econometric Approach

    Directory of Open Access Journals (Sweden)

    Robert P. Berrens

    2011-11-01

    Full Text Available This study investigates whether the environmental Kuznets curve (EKC) relationship is supported for a measure of biodiversity risk and economic development across the United States (US). Using state-level data for all 48 contiguous states, biodiversity risk is measured using a Modified Index (MODEX). This index is an adaptation of a comprehensive National Biodiversity Risk Assessment Index. The MODEX differs from other measures in that it takes into account the impact of human activities and conservation measures. The econometric approach includes corrections for spatial autocorrelation effects, which are present in the data. Model estimation results do not support the EKC hypothesis for biodiversity risk in the US. This finding is robust over ordinary least squares, spatial error, and spatial lag models, where the latter is shown to be the preferred model. Results from the spatial lag regression show that a 1% increase in human population density is associated with about a 0.19% increase in biodiversity risk. Spatial dependence in this case study explains 30% of the variation, as risk in one state spills over into adjoining states. From a policy perspective, this latter result supports the need for coordinated efforts at state and federal levels to address the problem of biodiversity loss.
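    The spatial-dependence diagnostic that motivates spatial error and spatial lag models can be sketched with Moran's I (a toy with four hypothetical states on a line and invented risk scores, not the study's data):

```python
import numpy as np

# Row-standardized contiguity matrix for 4 hypothetical states arranged
# in a line (state 1 borders 2, 2 borders 3, 3 borders 4).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W /= W.sum(axis=1, keepdims=True)

x = np.array([2.0, 2.5, 4.0, 4.5])        # invented biodiversity-risk scores
z = x - x.mean()

# Moran's I = (n / S0) * (z' W z) / (z' z), where S0 is the sum of all
# weights; values above its expectation indicate positive spatial
# autocorrelation, i.e. neighboring states have similar risk.
moran_i = (len(x) / W.sum()) * (z @ W @ z) / (z @ z)
print(moran_i)
```

    A clearly positive statistic like this one is the kind of evidence that leads to preferring a spatial lag or spatial error specification over plain OLS.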

  14. A test of the reward-value hypothesis.

    Science.gov (United States)

    Smith, Alexandra E; Dalecki, Stefan J; Crystal, Jonathon D

    2017-03-01

    Rats retain source memory (memory for the origin of information) over a retention interval of at least 1 week, whereas their spatial working memory (radial maze locations) decays within approximately 1 day. We have argued that different forgetting functions dissociate memory systems. However, the two tasks, in our previous work, used different reward values. The source memory task used multiple pellets of a preferred food flavor (chocolate), whereas the spatial working memory task provided access to a single pellet of standard chow-flavored food at each location. Thus, according to the reward-value hypothesis, enhanced performance in the source memory task stems from enhanced encoding/memory of a preferred reward. We tested the reward-value hypothesis by using a standard 8-arm radial maze task to compare spatial working memory accuracy of rats rewarded with either multiple chocolate or chow pellets at each location using a between-subjects design. The reward-value hypothesis predicts superior accuracy for high-valued rewards. We documented equivalent spatial memory accuracy for high- and low-value rewards. Importantly, a 24-h retention interval produced equivalent spatial working memory accuracy for both flavors. These data are inconsistent with the reward-value hypothesis and suggest that reward value does not explain our earlier findings that source memory survives unusually long retention intervals.

  15. Spatial Distribution and Sampling Plans With Fixed Level of Precision for Citrus Aphids (Hom., Aphididae) on Two Orange Species.

    Science.gov (United States)

    Kafeshani, Farzaneh Alizadeh; Rajabpour, Ali; Aghajanzadeh, Sirous; Gholamian, Esmaeil; Farkhari, Mohammad

    2018-04-02

    Aphis spiraecola Patch, Aphis gossypii Glover, and Toxoptera aurantii Boyer de Fonscolombe are three important aphid pests of citrus orchards. In this study, spatial distributions of the aphids on two orange species, Satsuma mandarin and Thomson navel, were evaluated using Taylor's power law and Iwao's patchiness regression. In addition, a fixed-precision sequential sampling plan was developed for each species on each host plant using Green's model at precision levels of 0.25 and 0.1. The results revealed that the spatial distribution parameters, and therefore the sampling plan, differed significantly with aphid and host plant species. Taylor's power law provided a better fit to the data than Iwao's patchiness regression. Except for T. aurantii on Thomson navel orange, the spatial distribution patterns of the aphids were aggregative on both citrus species; T. aurantii showed a regular dispersion pattern on Thomson navel orange. The optimum sample size varied from 30 to 2,061 shoots on Satsuma mandarin and from 1 to 1,622 shoots on Thomson navel orange, depending on aphid species and desired precision level. Calculated stop lines of the aphid species on Satsuma mandarin and Thomson navel orange ranged from 0.48 to 19 and from 0.19 to 80.4 aphids per 24 shoots, respectively, according to aphid species and desired precision level. The performance of the sampling plan was validated by resampling analysis using the Resampling for Validation of Sampling Plans (RVSP) software. This sampling program is useful for IPM programs targeting these aphids in citrus orchards.
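    The two ingredients of such a plan, a Taylor's power law fit and Green's fixed-precision stop line, can be sketched as follows (invented counts, not the study's data; the stop-line formula follows Green 1970):

```python
import numpy as np

# Invented mean/variance pairs of aphids per shoot across sampling dates.
means = np.array([0.8, 2.5, 6.0, 15.0])
variances = np.array([1.1, 6.0, 25.0, 120.0])

# Taylor's power law: s^2 = a * m^b, fitted as a line in log-log space.
b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
a = np.exp(log_a)

def green_stop_line(n, D, a, b):
    """Cumulative count T_n at which sampling may stop at precision D
    (Green 1970): T_n = n**((b-1)/(b-2)) * (D**2 / a)**(1/(b-2))."""
    return n ** ((b - 1.0) / (b - 2.0)) * (D ** 2 / a) ** (1.0 / (b - 2.0))

# Stop line at the study's coarser precision level, after 24 shoots.
print(a, b, green_stop_line(24, 0.25, a, b))
```

    In the field, sampling stops once the cumulative aphid count crosses the stop line for the current number of shoots inspected.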

  16. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    Science.gov (United States)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in both ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as the species' spatial distribution and unbiased estimates of abundance and precision, so that management strategies can be defined (e.g., fishing quotas, temporal closure areas, or marine protected areas, MPAs). Furthermore, estimating the precision of global abundance at different sampling intensities can be used to optimise survey design. Geostatistics provide a priori unbiased estimates of the spatial structure, global abundance, and precision for autocorrelated data. However, their application to non-Gaussian data introduces difficulties into the analysis and can compromise robustness and unbiasedness. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates, and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.

  17. A test of the reward-contrast hypothesis.

    Science.gov (United States)

    Dalecki, Stefan J; Panoz-Brown, Danielle E; Crystal, Jonathon D

    2017-12-01

    Source memory, a facet of episodic memory, is memory for the origin of information. Whereas source memory in rats is sustained for at least a week, spatial memory degrades after approximately a day. Different forgetting functions may suggest that two memory systems (source memory and spatial memory) are dissociated. However, in previous work, the two tasks used baiting conditions consisting of chocolate and chow flavors; notably, the source memory task used the relatively preferred flavor. Thus, according to the reward-contrast hypothesis, when chocolate and chow were presented within the same context (i.e., within a single radial maze trial), the chocolate location was more memorable than the chow location because of contrast. We tested the reward-contrast hypothesis using baiting configurations designed to produce reward contrast. The reward-contrast hypothesis predicts that under these conditions, spatial memory will survive a 24-h retention interval. We documented elimination of spatial memory performance after a 24-h retention interval using a reward-contrast baiting pattern. These data suggest that reward contrast does not explain our earlier findings that source memory survives unusually long retention intervals. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. The precision of spatial selection into the focus of attention in working memory.

    Science.gov (United States)

    Souza, Alessandra S; Thalmann, Mirko; Oberauer, Klaus

    2018-04-23

    Attention helps manage the information held in visual working memory (vWM). Perceptual attention selects the stimuli to be represented in vWM, whereas internal attention prioritizes information already in vWM. In the present study we assessed the spatial precision of perceptual and internal attention in vWM. Participants encoded eight colored dots for a local-recognition test. To manipulate attention, a cue indicated the item most likely to be tested (~65% validity). The cue appeared either before the onset of the memory array (precue) or during the retention interval (retrocue). The precue guides perceptual attention to gate encoding into vWM, whereas the retrocue guides internal attention to prioritize the cued item within vWM. If attentional selection is spatially imprecise, attention should be preferentially allocated to the cued location, with a gradual drop-off of attention over space to nearby uncued locations. In this case, memory for uncued locations should vary as a function of their distance from the cued location. As compared to a no-cue condition, memory was better for validly cued items but worse for uncued items. The spatial distance between the uncued and cued locations modulated the cuing costs: items close in space to the cued location were insulated from cuing costs. The extent of this spatial proximity effect was larger for precues than for retrocues, mostly because the benefits of attention were larger for precues. These results point to similar selection principles for perceptual and internal attention, and to a critical role of spatial distance in the selection of visual representations.

  19. Spatial Precision in Magnetic Resonance Imaging–Guided Radiation Therapy: The Role of Geometric Distortion

    Energy Technology Data Exchange (ETDEWEB)

    Weygand, Joseph, E-mail: jw2899@columbia.edu [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Fuller, Clifton David [The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Ibbott, Geoffrey S. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Mohamed, Abdallah S.R. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Department of Clinical Oncology and Nuclear Medicine, Alexandria University, Alexandria (Egypt); Ding, Yao [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Yang, Jinzhong [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States); Hwang, Ken-Pin [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Wang, Jihong [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); The University of Texas Graduate School of Biomedical Sciences at Houston, Houston, Texas (United States)

    2016-07-15

    Because magnetic resonance imaging–guided radiation therapy (MRIgRT) offers exquisite soft tissue contrast and the ability to image tissues in arbitrary planes, the interest in this technology has increased dramatically in recent years. However, intrinsic geometric distortion stemming from both the system hardware and the magnetic properties of the patient affects MR images and compromises the spatial integrity of MRI-based radiation treatment planning, given that for real-time MRIgRT, precision within 2 mm is desired. In this article, we discuss the causes of geometric distortion, describe some well-known distortion correction algorithms, and review geometric distortion measurements from 12 studies, while taking into account relevant imaging parameters. Eleven of the studies reported phantom measurements quantifying system-dependent geometric distortion, while 2 studies reported simulation data quantifying magnetic susceptibility–induced geometric distortion. Of the 11 studies investigating system-dependent geometric distortion, 5 reported maximum measurements less than 2 mm. The simulation studies demonstrated that magnetic susceptibility–induced distortion is typically smaller than system-dependent distortion but still nonnegligible, with maximum distortion ranging from 2.1 to 2.6 mm at a field strength of 1.5 T. As expected, anatomic landmarks containing interfaces between air and soft tissue had the largest distortions. The evidence indicates that geometric distortion reduces the spatial integrity of MRI-based radiation treatment planning and likely diminishes the efficacy of MRIgRT. Better phantom measurement techniques and more effective distortion correction algorithms are needed to achieve the desired spatial precision.
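    The phantom-based distortion measurement described above can be sketched as follows (invented marker coordinates, not any of the reviewed studies' data): known marker positions in a grid phantom are compared with their apparent positions in the MR image, and the maximum displacement is reported against the 2 mm tolerance.

```python
import numpy as np

# Known marker positions in a hypothetical grid phantom (mm) and their
# apparent positions in the MR image (values invented for illustration).
known = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
imaged = np.array([[0.3, -0.2], [50.9, 0.4], [-0.5, 50.8], [51.2, 51.0]])

# Geometric distortion per marker: Euclidean displacement between the
# imaged and true positions; the figure of merit is the maximum.
displacement = np.linalg.norm(imaged - known, axis=1)
max_distortion = displacement.max()
print(max_distortion)            # compare against the 2 mm MRIgRT tolerance
```

    In practice such measurements are repeated across the imaging volume, since distortion grows with distance from the magnet isocenter.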

  1. The role of spatial memory and frames of reference in the precision of angular path integration.

    Science.gov (United States)

    Arthur, Joeanna C; Philbeck, John W; Kleene, Nicholas J; Chichka, David

    2012-09-01

    Angular path integration refers to the ability to maintain an estimate of self-location after a rotational displacement by integrating internally-generated (idiothetic) self-motion signals over time. Previous work has found that non-sensory inputs, namely spatial memory, can play a powerful role in angular path integration (Arthur et al., 2007, 2009). Here we investigated the conditions under which spatial memory facilitates angular path integration. We hypothesized that the benefit of spatial memory is particularly likely in spatial updating tasks in which one's self-location estimate is referenced to external space. To test this idea, we administered passive, non-visual body rotations (ranging 40°-140°) about the yaw axis and asked participants to use verbal reports or open-loop manual pointing to indicate the magnitude of the rotation. Prior to some trials, previews of the surrounding environment were given. We found that when participants adopted an egocentric frame of reference, the previously-observed benefit of previews on within-subject response precision was not manifested, regardless of whether remembered spatial frameworks were derived from vision or spatial language. We conclude that the powerful effect of spatial memory is dependent on one's frame of reference during self-motion updating. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Spatial distribution of soil moisture in precision farming using integrated soil scanning and field telemetry data

    Science.gov (United States)

    Kalopesas, Charalampos; Galanis, George; Kalopesa, Eleni; Katsogiannos, Fotis; Kalafatis, Panagiotis; Bilas, George; Patakas, Aggelos; Zalidis, George

    2015-04-01

    Mapping the spatial variation of soil moisture content is vital for precision agriculture techniques. The aim of this study was to examine the correlation of soil moisture and electrical conductivity (EC) data obtained through scanning techniques with field telemetry data, and to spatially separate the field into discrete irrigation management zones. Using the Veris MSP3 platform, geo-referenced electrical conductivity and organic matter preliminary maps were produced in a pilot kiwifruit field in Chrysoupoli, Kavala. Data from 15 stratified sampling points were used to produce the corresponding soil maps. Fusion of the Veris-produced maps (OM, pH, ECa) resulted in the delineation of the field into three zones of specific management interest. An appropriate pedotransfer function was used to estimate a capacity soil indicator, the saturated volumetric water content (θs), for each zone, and the relationship between ECs and ECa was established for each zone. The uniformity of the three management zones was validated by measuring specific electrical conductivity (ECs) along a transect in each zone and computing the corresponding semivariograms for ECs within each zone. Near real-time data produced by a telemetric network consisting of soil moisture and electrical conductivity sensors were used to integrate the temporal component of the specific management zones, enabling the calculation of time-specific volumetric water contents on a 10 minute interval, an intensity soil indicator necessary to differentiate the irrigation strategies spatially for each zone. This study emphasizes the benefits yielded by fusing near real-time telemetric data with soil scanning data and spatial interpolation techniques, enhancing the precision and validity of the desired results. Furthermore, the use of telemetric data in combination with modern database management and geospatial software leads to timely produced operational results
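The zone-validation step above relies on semivariograms of ECs. As an illustration of the underlying computation, here is a minimal sketch of the classical (Matheron) semivariogram estimator; the transect coordinates and readings are hypothetical, not data from the study:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical (Matheron) semivariogram estimate at the given lag distances."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    # All pairwise separation distances and squared value differences.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)          # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for h in lags:
        mask = np.abs(d - h) <= tol       # pairs whose separation falls in this lag bin
        gamma.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical ECs readings along a 1-D transect (positions in metres).
coords = [[x, 0.0] for x in range(10)]
values = [1.0, 1.2, 0.9, 1.1, 1.4, 1.3, 1.6, 1.5, 1.8, 1.7]
gamma = empirical_semivariogram(coords, values, lags=[1.0, 2.0, 3.0], tol=0.5)
```

A semivariogram that stays flat and low within a zone, as computed here, is the kind of evidence the study uses to argue each management zone is internally uniform.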

  4. The Threshold Hypothesis Applied to Spatial Skill and Mathematics

    Science.gov (United States)

    Freer, Daniel

    2017-01-01

    This cross-sectional study assessed the relation between spatial skills and mathematics in 854 participants across kindergarten, third grade, and sixth grade. Specifically, the study probed for a threshold for spatial skills when performing mathematics, above which spatial scores and mathematics scores would be significantly less related. This…

  5. Precision oncology: origins, optimism, and potential.

    Science.gov (United States)

    Prasad, Vinay; Fojo, Tito; Brada, Michael

    2016-02-01

    Imatinib, the first and arguably the best targeted therapy, became the springboard for developing drugs aimed at molecular targets deemed crucial to tumours. As this development unfolded, a revolution in the speed and cost of genetic sequencing occurred. The result--an armamentarium of drugs and an array of molecular targets--set the stage for precision oncology, a hypothesis that cancer treatment could be markedly improved if therapies were guided by a tumour's genomic alterations. Drawing lessons from the biological basis of cancer and recent empirical investigations, we take a more measured view of precision oncology's promise. Ultimately, the promise is not our concern, but the threshold at which we declare success. We review reports of precision oncology alongside those of precision diagnostics and novel radiotherapy approaches. Although confirmatory evidence is scarce, these interventions have been widely endorsed. We conclude that the current path will probably not be successful or, at a minimum, will have to undergo substantive adjustments before it can be successful. For the sake of patients with cancer, we hope one form of precision oncology will deliver on its promise. However, until confirmatory studies are completed, precision oncology remains unproven, and as such, a hypothesis in need of rigorous testing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Spatially explicit genetic structure in the freshwater sponge Ephydatia fluviatilis (Linnaeus, 1759) within the framework of the monopolisation hypothesis

    Directory of Open Access Journals (Sweden)

    Livia Lucentini

    2013-02-01

    Full Text Available An apparent paradox is known for crustaceans, rotifers and bryozoans living in inland small water bodies: a potential for wide distribution due to the presence of resting stages is coupled with marked genetic differences between nearby water bodies, with enclave distributions masking clear phylogeographic patterns. According to the monopolisation hypothesis, this is due to the accumulation of resting stages, monopolising each water body. Freshwater sponges could represent a useful system to assess the generality of the monopolisation hypothesis: these organisms (i) live in the same habitats as crustaceans, rotifers and bryozoans, (ii) produce resting stages that can accumulate, and (iii) have indeed a wide distribution. Currently, no studies on spatially explicit genetic differentiation in freshwater sponges are available. The aim of the present study is to provide additional empirical evidence in support of the generality of the scenario for small aquatic animals with resting stages by analysing genetic diversity at different spatial scales for an additional model system, the freshwater sponge Ephydatia fluviatilis (Linnaeus, 1759). We expected that this system's genetic variability would follow enclave distributions, that no clear phylogeographical patterns would be present, and that nearby unconnected water bodies would show markedly different populations for this new model too. We analysed the ribosomal internal transcribed spacer regions 5.8S-ITS2-28S, the D3 domain of the 28S subunit, the mitochondrial cytochrome c oxidase I gene (COI) and ten specific microsatellite markers of nine Italian and one Hungarian populations. Mitochondrial and nuclear sequences showed no or very low genetic polymorphism, whereas high levels of differentiation among populations and a significant polymorphism were observed using microsatellites. Microsatellite loci also showed a high proportion of private alleles for each population and an overall correlation between geographic and genetic

  7. Close but no cigar: Spatial precision deficits following medial temporal lobe lesions provide novel insight into theoretical models of navigation and memory.

    Science.gov (United States)

    Kolarik, Branden S; Baer, Trevor; Shahlaie, Kiarash; Yonelinas, Andrew P; Ekstrom, Arne D

    2018-01-01

    Increasing evidence suggests that the human hippocampus contributes to a range of different behaviors, including episodic memory, language, short-term memory, and navigation. A novel theoretical framework, the Precision and Binding Model, accounts for these phenomena by describing a role for the hippocampus in high-resolution, complex binding. Other theories like Cognitive Map Theory, in contrast, predict a specific role for the hippocampus in allocentric navigation, while Declarative Memory Theory predicts a specific role in delay-dependent conscious memory. Navigation provides a unique venue for testing these predictions, with past results from research with humans providing inconsistent findings regarding the role of the human hippocampus in spatial navigation. Here, we tested five patients with lesions primarily restricted to the hippocampus or extending out into the surrounding medial temporal lobe cortex on a virtual water maze task. Consistent with the Precision and Binding Model, we found partially intact allocentric memory in all patients, with impairments in the spatial precision of their searches for a hidden target. We found similar impairments at both immediate and delayed testing. Our findings are consistent with the Precision and Binding Model of hippocampal function, arguing for its role across domains in high-resolution, complex binding. Remembering goal locations in one's environment is a critical skill for survival. How this information is represented in the brain is still not fully understood, but is believed to rely in some capacity on structures in the medial temporal lobe. Contradictory findings from studies of both humans and animals have been difficult to reconcile with regard to the role of the MTL, specifically the hippocampus. By assessing impairments observed during navigation to a goal in patients with medial temporal lobe damage we can better understand the role these structures play in such behavior. Utilizing virtual reality

  8. Correlated cryo-fluorescence and cryo-electron microscopy with high spatial precision and improved sensitivity

    International Nuclear Information System (INIS)

    Schorb, Martin; Briggs, John A.G.

    2014-01-01

    Performing fluorescence microscopy and electron microscopy on the same sample allows fluorescent signals to be used to identify and locate features of interest for subsequent imaging by electron microscopy. To carry out such correlative microscopy on vitrified samples appropriate for structural cryo-electron microscopy it is necessary to perform fluorescence microscopy at liquid-nitrogen temperatures. Here we describe an adaptation of a cryo-light microscopy stage to permit use of high-numerical aperture objectives. This allows high-sensitivity and high-resolution fluorescence microscopy of vitrified samples. We describe and apply a correlative cryo-fluorescence and cryo-electron microscopy workflow together with a fiducial bead-based image correlation procedure. This procedure allows us to locate fluorescent bacteriophages in cryo-electron microscopy images with an accuracy on the order of 50 nm, based on their fluorescent signal. It will allow the user to precisely and unambiguously identify and locate objects and events for subsequent high-resolution structural study, based on fluorescent signals. - Highlights: • Workflow for correlated cryo-fluorescence and cryo-electron microscopy. • Cryo-fluorescence microscopy setup incorporating a high numerical aperture objective. • Fluorescent signals located in cryo-electron micrographs with 50 nm spatial precision
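The fiducial bead-based correlation procedure amounts to fitting a coordinate transform between the two imaging frames from matched bead positions, then using it to predict where a fluorescent object should appear in the electron micrograph. A hedged sketch of that idea using a least-squares 2-D affine fit; the bead coordinates are invented, and the published workflow's actual transform model and software may differ:

```python
import numpy as np

def fit_affine_2d(src, dst):
    """Least-squares 2-D affine transform mapping src points onto dst points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    # Augment with a constant column so the translation is estimated too.
    A = np.hstack([src, np.ones((len(src), 1))])
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)   # shape (3, 2)
    return coeffs

def apply_affine_2d(coeffs, pts):
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ coeffs

# Hypothetical fiducial bead positions: cryo-FM frame (um) vs. cryo-EM frame (nm).
fm = [[1.0, 1.0], [4.0, 1.5], [2.0, 5.0], [5.0, 4.0]]
em = [[1050.0, 980.0], [4030.0, 1490.0], [2020.0, 5010.0], [5010.0, 3990.0]]

T = fit_affine_2d(fm, em)
# Predict where a fluorescent spot seen at FM (3.0, 3.0) should appear in the EM image.
pred = apply_affine_2d(T, [[3.0, 3.0]])[0]
```

With more beads than the six affine parameters, the fit residuals give an internal estimate of the correlation accuracy, analogous to the ~50 nm figure quoted in the abstract.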

  10. Precision Viticulture: is it relevant to manage the vineyard according to the within-field spatial variability of the environment?

    Science.gov (United States)

    Tisseyre, Bruno

    2015-04-01

    For more than 15 years, research projects have been conducted in the precision viticulture (PV) area around the world. These projects have provided new insights into within-field variability in viticulture. Indeed, access to high spatial resolution data (remote sensing, embedded sensors, etc.) changes the knowledge we have of vineyard fields. In particular, the field, until now considered a homogeneous management unit, actually presents high spatial variability in terms of yield, vigour and quality. This knowledge will lead to (and is already causing) changes in how the vineyard and the quality of the harvest are managed at the within-field scale. From experimental results obtained in various countries of the world, the goal of the presentation is to provide figures on: - the spatial variability of the main parameters (yield, vigour, quality), and how this variability is organized spatially, - the temporal stability of the observed spatial variability and the potential link with environmental parameters like soil, topography, soil water availability, etc., - information sources available at a high spatial resolution conventionally used in precision agriculture likely to highlight this spatial variability (multi-spectral images, soil electrical conductivity, etc.) and the limitations that these information sources are likely to present in viticulture. Several strategies are currently being developed to take into account within-field variability in viticulture. They are based on the development of specific equipment, sensors, actuators and site-specific strategies with the aim of adapting vineyard operations at the within-field level. These strategies will be presented briefly in two ways: - site-specific operations (fertilization, pruning, thinning, irrigation, etc.) in order to counteract the effects of the environment and to obtain a final product with a controlled and consistent wine quality, - differential harvesting with the

  11. A Precision-Positioning Method for a High-Acceleration Low-Load Mechanism Based on Optimal Spatial and Temporal Distribution of Inertial Energy

    Directory of Open Access Journals (Sweden)

    Xin Chen

    2015-09-01

    Full Text Available High-speed and precision positioning are fundamental requirements for high-acceleration low-load mechanisms in integrated circuit (IC) packaging equipment. In this paper, we derive the transient nonlinear dynamic-response equations of high-acceleration mechanisms, which reveal that stiffness, frequency, damping, and driving frequency are the primary factors. Therefore, we propose a new structural optimization and velocity-planning method for the precision positioning of a high-acceleration mechanism based on optimal spatial and temporal distribution of inertial energy. For structural optimization, we first reviewed the commonly used flexible multibody dynamic optimization based on the equivalent static loads method (ESLM), and then selected a modified ESLM for optimal spatial distribution of inertial energy; hence, not only the stiffness but also the inertia and frequency of the real modal shapes are considered. For velocity planning, we developed a new velocity-planning method based on nonlinear dynamic-response optimization with varying motion conditions. Our method was verified on a high-acceleration die bonder. The amplitude of residual vibration could be decreased by more than 20% via structural optimization, and the positioning time could be reduced by more than 40% via asymmetric variable velocity planning. This method provides effective theoretical support for the precision positioning of high-acceleration low-load mechanisms.
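As a generic illustration of asymmetric velocity planning for a positioning move, the sketch below generates an asymmetric trapezoidal velocity profile in which deceleration is gentler than acceleration, one simple way to trade cycle time against residual vibration. This is a hypothetical example, not the authors' nonlinear dynamic-response optimization; all parameter values are invented:

```python
def asymmetric_trapezoid(v_max, a_acc, a_dec, distance, dt=1e-4):
    """Velocity samples (one per dt) for an asymmetric trapezoidal move.

    A gentler deceleration (a_dec < a_acc) reduces the excitation at the
    end of the move, where residual vibration matters for placement.
    """
    t_acc, t_dec = v_max / a_acc, v_max / a_dec
    d_ramps = 0.5 * v_max * (t_acc + t_dec)
    if d_ramps > distance:
        raise ValueError("v_max unreachable over this distance")
    t_cruise = (distance - d_ramps) / v_max
    profile, t = [], 0.0
    while True:
        if t < t_acc:                               # acceleration ramp
            v = a_acc * t
        elif t < t_acc + t_cruise:                  # constant-velocity cruise
            v = v_max
        elif t < t_acc + t_cruise + t_dec:          # gentler deceleration ramp
            v = v_max - a_dec * (t - t_acc - t_cruise)
        else:
            break
        profile.append(v)
        t += dt
    return profile

# Hypothetical move: 100 mm at up to 1 m/s, braking at 40% of the launch acceleration.
prof = asymmetric_trapezoid(v_max=1.0, a_acc=50.0, a_dec=20.0, distance=0.1)
```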

  12. Why would musical training benefit the neural encoding of speech? The OPERA hypothesis.

    Directory of Open Access Journals (Sweden)

    Aniruddh D. Patel

    2011-06-01

    Full Text Available Mounting evidence suggests that musical training benefits the neural encoding of speech. This paper offers a hypothesis specifying why such benefits occur. The OPERA hypothesis proposes that such benefits are driven by adaptive plasticity in speech-processing networks, and that this plasticity occurs when five conditions are met. These are: (1) Overlap: there is anatomical overlap in the brain networks that process an acoustic feature used in both music and speech (e.g., waveform periodicity, amplitude envelope), (2) Precision: music places higher demands on these shared networks than does speech, in terms of the precision of processing, (3) Emotion: the musical activities that engage this network elicit strong positive emotion, (4) Repetition: the musical activities that engage this network are frequently repeated, and (5) Attention: the musical activities that engage this network are associated with focused attention. According to the OPERA hypothesis, when these conditions are met neural plasticity drives the networks in question to function with higher precision than needed for ordinary speech communication. Yet since speech shares these networks with music, speech processing benefits. The OPERA hypothesis is used to account for the observed superior subcortical encoding of speech in musically trained individuals, and to suggest mechanisms by which musical training might improve linguistic reading abilities.

  14. Comparing Spatial Predictions

    KAUST Repository

    Hering, Amanda S.

    2011-11-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis is that of no difference, and a spatial loss differential is created based on the observed data, the two sets of predictions, and the loss function chosen by the researcher. The test assumes only isotropy and short-range spatial dependence of the loss differential but does allow it to be non-Gaussian, non-zero-mean, and spatially correlated. Constant and nonconstant spatial trends in the loss differential are treated in two separate cases. Monte Carlo simulations illustrate the size and power properties of this test, and an example based on daily average wind speeds in Oklahoma is used for illustration. Supplemental results are available online. © 2011 American Statistical Association and the American Society for Quality.
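The core object of the test is the spatial loss differential between the two sets of predictions. The sketch below illustrates its structure with a naive i.i.d. variance estimate; the actual Hering and Genton test replaces that variance with one that accounts for short-range spatial dependence, so this is illustrative only, and all data values are hypothetical:

```python
import math

def loss_diff_test(obs, pred_a, pred_b, loss=lambda y, p: (y - p) ** 2):
    """Naive test of H0: mean spatial loss differential = 0.

    Returns (mean differential, z statistic). The i.i.d. variance used
    here ignores spatial correlation; it only shows the statistic's shape.
    """
    d = [loss(y, a) - loss(y, b) for y, a, b in zip(obs, pred_a, pred_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    z = mean / math.sqrt(var / n)
    return mean, z

obs    = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
pred_a = [1.1, 2.1, 2.9, 4.2, 5.1, 5.9]   # model A: small errors
pred_b = [1.9, 1.2, 3.8, 3.1, 5.9, 6.8]   # model B: larger errors
mean_d, z = loss_diff_test(obs, pred_a, pred_b)
# mean_d < 0 favours model A under squared-error loss.
```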

  15. Precision Cosmology

    Science.gov (United States)

    Jones, Bernard J. T.

    2017-04-01

    Preface; Notation and conventions; Part I. 100 Years of Cosmology: 1. Emerging cosmology; 2. The cosmic expansion; 3. The cosmic microwave background; 4. Recent cosmology; Part II. Newtonian Cosmology: 5. Newtonian cosmology; 6. Dark energy cosmological models; 7. The early universe; 8. The inhomogeneous universe; 9. The inflationary universe; Part III. Relativistic Cosmology: 10. Minkowski space; 11. The energy momentum tensor; 12. General relativity; 13. Space-time geometry and calculus; 14. The Einstein field equations; 15. Solutions of the Einstein equations; 16. The Robertson-Walker solution; 17. Congruences, curvature and Raychaudhuri; 18. Observing and measuring the universe; Part IV. The Physics of Matter and Radiation: 19. Physics of the CMB radiation; 20. Recombination of the primeval plasma; 21. CMB polarisation; 22. CMB anisotropy; Part V. Precision Tools for Precision Cosmology: 23. Likelihood; 24. Frequentist hypothesis testing; 25. Statistical inference: Bayesian; 26. CMB data processing; 27. Parametrising the universe; 28. Precision cosmology; 29. Epilogue; Appendix A. SI, CGS and Planck units; Appendix B. Magnitudes and distances; Appendix C. Representing vectors and tensors; Appendix D. The electromagnetic field; Appendix E. Statistical distributions; Appendix F. Functions on a sphere; Appendix G. Acknowledgements; References; Index.

  16. Correction of Spatial Bias in Oligonucleotide Array Data

    Directory of Open Access Journals (Sweden)

    Philippe Serhal

    2013-01-01

    Full Text Available Background. Oligonucleotide microarrays allow for high-throughput gene expression profiling assays. The technology relies on the fundamental assumption that observed hybridization signal intensities (HSIs) for each intended target, on average, correlate with their target’s true concentration in the sample. However, systematic, nonbiological variation from several sources undermines this hypothesis. Background hybridization signal has been previously identified as one such important source, one manifestation of which appears in the form of spatial autocorrelation. Results. We propose an algorithm, pyn, for the elimination of spatial autocorrelation in HSIs, exploiting the duality of desirable mutual information shared by probes in a common probe set and undesirable mutual information shared by spatially proximate probes. We show that this correction procedure reduces spatial autocorrelation in HSIs; increases HSI reproducibility across replicate arrays; increases differentially expressed gene detection power; and performs better than previously published methods. Conclusions. The proposed algorithm increases both precision and accuracy, while requiring virtually no changes to users’ current analysis pipelines: the correction consists merely of a transformation of raw HSIs (e.g., CEL files for Affymetrix arrays). A free, open-source implementation is provided as an R package, compatible with standard Bioconductor tools. The approach may also be tailored to other platform types and other sources of bias.
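The spatial autocorrelation that motivates this correction can be quantified with a standard diagnostic such as Moran's I on the array's intensity grid. The sketch below is a generic diagnostic, not the pyn algorithm itself; the grids are synthetic:

```python
import numpy as np

def morans_i(grid):
    """Moran's I for a 2-D array with rook (4-neighbour) adjacency.

    Values near 0 suggest no spatial autocorrelation; positive values
    indicate the spatial clustering of intensities that a correction
    like pyn aims to remove.
    """
    z = grid - grid.mean()
    num, w_sum = 0.0, 0
    rows, cols = grid.shape
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((1, 0), (0, 1)):       # right and down neighbours
                ni, nj = i + di, j + dj
                if ni < rows and nj < cols:
                    num += 2 * z[i, j] * z[ni, nj]  # count each pair in both directions
                    w_sum += 2
    n = grid.size
    return (n / w_sum) * (num / (z ** 2).sum())

rng = np.random.default_rng(0)
noise = rng.normal(size=(20, 20))                        # spatially independent signal
trend = np.add.outer(np.arange(20.0), np.arange(20.0))   # strong spatial structure
i_noise = morans_i(noise)
i_trend = morans_i(noise + trend)
```

Comparing the statistic before and after a correction gives a quick check of whether the spatial component was actually removed.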

  18. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.

    2014-12-15

    This paper considers multi-agent sequential hypothesis testing and presents a framework for strategic learning in sequential games with explicit consideration of both temporal and spatial coordination. The associated Bayes risk functions explicitly incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well-defined value functions with respect to (a) the belief states for the case of conditional independent private noisy measurements that are also assumed to be independent identically distributed over time, and (b) the information states for the case of correlated private noisy measurements. A sequential investment game of strategic coordination and delay is also discussed as an application of the proposed strategic learning rules.
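The single-agent core underlying this framework is sequential hypothesis testing in the style of Wald's SPRT, on top of which the paper layers measurement, delay, and disagreement costs. A minimal SPRT sketch for Bernoulli observations, using the standard Wald threshold approximations (all parameter values are illustrative):

```python
import math

def sprt(samples, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for Bernoulli observations.

    H0: success probability p0; H1: success probability p1.
    Returns (decision, number of samples used).
    """
    upper = math.log((1 - beta) / alpha)     # accept H1 when log-LR exceeds this
    lower = math.log(beta / (1 - alpha))     # accept H0 when log-LR falls below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

decision, n_used = sprt([1, 0, 1, 1, 1, 1, 1])   # mostly successes: evidence for H1
```

The appeal of the sequential form, in both the single- and multi-agent settings, is that it stops as soon as the accumulated evidence crosses a threshold rather than after a fixed sample size.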

  19. Transformations and representations supporting spatial perspective taking

    Science.gov (United States)

    Yu, Alfred B.; Zacks, Jeffrey M.

    2018-01-01

    Spatial perspective taking is the ability to reason about spatial relations relative to another’s viewpoint. Here, we propose a mechanistic hypothesis that relates mental representations of one’s viewpoint to the transformations used for spatial perspective taking. We test this hypothesis using a novel behavioral paradigm that assays patterns of response time and variation in those patterns across people. The results support the hypothesis that people maintain a schematic representation of the space around their body, update that representation to take another’s perspective, and thereby to reason about the space around their body. This is a powerful computational mechanism that can support imitation, coordination of behavior, and observational learning. PMID:29545731

  20. Measurement of turbulent spatial structure and kinetic energy spectrum by exact temporal-to-spatial mapping

    DEFF Research Database (Denmark)

    Buchhave, Preben; Velte, Clara Marika

    2017-01-01

    We present a method for converting a time record of turbulent velocity measured at a point in a flow to a spatial velocity record consisting of consecutive convection elements. The spatial record allows computation of dynamic statistical moments such as turbulent kinetic wavenumber spectra and spatial structure functions in a way that completely bypasses the need for Taylor’s hypothesis, and thereby avoids the distortions caused by Taylor’s hypothesis. The spatial statistics agree with the classical counterparts, such as the total kinetic energy spectrum, at least for spatial extents up to the Taylor microscale. The method is first confirmed to produce the correct statistics using computer simulations and later applied to measurements in some of the most difficult regions of a round turbulent jet—the non-equilibrium developing region and the outermost parts of the developed jet. The requirements for applying the method
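The essence of such a temporal-to-spatial mapping is that each velocity sample is converted to a convected distance using the instantaneous velocity, rather than a single mean convection velocity as Taylor's hypothesis assumes. A toy sketch of the contrast (the sample values are hypothetical, and the published method's convection-element bookkeeping and statistics are more elaborate):

```python
def to_spatial_record(u, dt):
    """Cumulative convected distance for each velocity sample.

    Each sample is assumed to convect past the probe at its own
    instantaneous speed |u_i| for one sampling interval dt.
    """
    x, pos = [], 0.0
    for ui in u:
        pos += abs(ui) * dt
        x.append(pos)
    return x

u = [10.0, 12.0, 8.0, 11.0]   # hypothetical velocity samples (m/s)
dt = 0.001                     # sampling interval (s)

x_exact = to_spatial_record(u, dt)
# Taylor's hypothesis: a frozen field convecting at the mean speed.
x_taylor = [sum(u) / len(u) * dt * (i + 1) for i in range(len(u))]
```

The two records share the same total extent but place the samples at different positions; it is exactly these local differences that distort Taylor-based spectra when the turbulence intensity is high.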

  1. French Meteor Network for High Precision Orbits of Meteoroids

    Science.gov (United States)

    Atreya, P.; Vaubaillon, J.; Colas, F.; Bouley, S.; Gaillard, B.; Sauli, I.; Kwon, M. K.

    2011-01-01

    There is a lack of precise meteoroid orbits from video observations, as most meteor stations use off-the-shelf CCD cameras. Few meteoroid orbits with precise semi-major axes are available, obtained using the film photographic method. Precise orbits are necessary to compute the dust flux in the Earth's vicinity, and to estimate the ejection time of the meteoroids accurately by comparing them with the theoretical evolution model. We investigate the use of large CCD sensors to observe multi-station meteors and to compute precise orbits of these meteoroids. The ideal spatial and temporal resolution needed to reach an accuracy similar to that of photographic plates is discussed. Various problems posed by the use of large CCDs, such as increasing the spatial and the temporal resolution at the same time and computational problems in finding the meteor position, are illustrated.

  2. On Using Taylor's Hypothesis for Three-Dimensional Mixing Layers

    Science.gov (United States)

    LeBoeuf, Richard L.; Mehta, Rabindra D.

    1995-01-01

    In the present study, errors in using Taylor's hypothesis to transform measurements obtained in a temporal (or phase) frame onto a spatial one were evaluated. For the first time, phase-averaged ('real') spanwise and streamwise vorticity data measured on a three-dimensional grid were compared directly to those obtained using Taylor's hypothesis. The results show that even the qualitative features of the spanwise and streamwise vorticity distributions given by the two techniques can be very different. This is particularly true in the region of the spanwise roller pairing. The phase-averaged spanwise and streamwise peak vorticity levels given by Taylor's hypothesis are typically lower (by up to 40%) compared to the real measurements.

  3. The Stoichiometric Divisome: A Hypothesis

    Directory of Open Access Journals (Sweden)

    Waldemar Vollmer

    2015-05-01

    Dividing Escherichia coli cells simultaneously constrict the inner membrane, peptidoglycan layer and outer membrane to synthesize the new poles of the daughter cells. For this, more than 30 proteins localize to mid-cell where they form a large, ring-like assembly, the divisome, facilitating division. Although the precise function of most divisome proteins is unknown, it became apparent in recent years that dynamic protein-protein interactions are essential for divisome assembly and function. However, little is known about the nature of the interactions involved and the stoichiometry of the proteins within the divisome. A recent study (Li et al., 2014) used ribosome profiling to measure the absolute protein synthesis rates in E. coli. Interestingly, they observed that most proteins which participate in known multiprotein complexes are synthesized in proportion to their stoichiometry. Based on this principle we present a hypothesis for the stoichiometry of the core of the divisome, taking into account known protein-protein interactions. From this hypothesis we infer a possible mechanism for PG synthesis during division.

  4. The demand for environmental quality and the environmental Kuznets Curve hypothesis

    International Nuclear Information System (INIS)

    Khanna, Neha; Plassmann, Florenz

    2004-01-01

    Household demand for better environmental quality is the key factor in the long-term global applicability of the Environmental Kuznets Curve (EKC) hypothesis. We argue that, for given consumer preferences, the threshold income level at which the EKC turns downwards or the equilibrium income elasticity changes sign from positive to negative depends on the ability to spatially separate production and consumption. We test our hypothesis by estimating the equilibrium income elasticities of five pollutants, using 1990 data for the United States. We find that the change in sign occurs at lower income levels for pollutants for which spatial separation is relatively easy as compared to pollutants for which spatial separation is difficult. Our results suggest that even high-income households in the United States have not yet reached the income level at which their demand for better environmental quality is high enough to cause the income-pollution relationship to turn downwards for all the pollutants that we analyzed.

  5. The demand for environmental quality and the environmental Kuznets Curve hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Khanna, Neha [Department of Economics and Environmental Studies Program, Binghamton, University (LT 1004), P.O. Box 6000, Binghamton, NY 13902-6000 (United States); Plassmann, Florenz [Department of Economics, Binghamton University (LT 904), P.O. Box 6000, Binghamton, NY 13902-6000 (United States)

    2004-12-01

    Household demand for better environmental quality is the key factor in the long-term global applicability of the Environmental Kuznets Curve (EKC) hypothesis. We argue that, for given consumer preferences, the threshold income level at which the EKC turns downwards or the equilibrium income elasticity changes sign from positive to negative depends on the ability to spatially separate production and consumption. We test our hypothesis by estimating the equilibrium income elasticities of five pollutants, using 1990 data for the United States. We find that the change in sign occurs at lower income levels for pollutants for which spatial separation is relatively easy as compared to pollutants for which spatial separation is difficult. Our results suggest that even high-income households in the United States have not yet reached the income level at which their demand for better environmental quality is high enough to cause the income-pollution relationship to turn downwards for all the pollutants that we analyzed.

  6. [Dilemma of the null hypothesis in experimental tests of ecological hypotheses].

    Science.gov (United States)

    Li, Ji

    2016-06-01

    Experimental testing is one of the major ways of testing ecological hypotheses, though there are many arguments over the role of the null hypothesis. Quinn and Dunham (1983) analyzed the hypothesis-deduction model of Platt (1964) and concluded that there is no null hypothesis in ecology that can be strictly tested by experiments. Fisher's falsificationism and Neyman-Pearson (N-P) non-decisivity prevent a statistical null hypothesis from being strictly tested. Moreover, since the null hypothesis H0 (α=1, β=0) and alternative hypothesis H1' (α'=1, β'=0) in ecological processes differ from those of classical physics, the ecological null hypothesis likewise cannot be strictly tested experimentally. These dilemmas of the null hypothesis can be relieved via reduction of the P value, careful selection of the null hypothesis, non-centralization of the non-null hypothesis, and the two-tailed test. However, statistical null hypothesis significance testing (NHST) should not be equated with the test of causal logic in an ecological hypothesis. Hence, findings and conclusions of methodological studies and experimental tests based on NHST are not always logically reliable.
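
    Two of the remedies listed above—lowering the P-value threshold and using a two-tailed test—can be illustrated with a toy computation (our own example, not from the paper). The two-tailed p-value for a standard-normal test statistic is twice the one-tailed tail probability:

```python
import math

def one_tailed_p(z):
    """Upper-tail probability P(Z >= z) for a standard normal statistic."""
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_tailed_p(z):
    """Two-tailed p-value: probability of a statistic at least as extreme."""
    return 2.0 * one_tailed_p(abs(z))

z = 1.96  # hypothetical test statistic
# two_tailed_p(1.96) is ~0.05: borderline at the conventional 5% level,
# but far from significant under a stricter threshold such as 0.005.
```

    Lowering the significance threshold (e.g., from 0.05 to 0.005) thus renders the same borderline statistic non-significant, which is one way the dilemma described above is mitigated in practice.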

  7. High-precision positioning of radar scatterers

    NARCIS (Netherlands)

    Dheenathayalan, P.; Small, D.; Schubert, A.; Hanssen, R.F.

    2016-01-01

    Remote sensing radar satellites cover wide areas and provide spatially dense measurements, with millions of scatterers. Knowledge of the precise position of each radar scatterer is essential to identify the corresponding object and interpret the estimated deformation. The absolute position accuracy

  8. Sexual Orientation-Related Differences in Virtual Spatial Navigation and Spatial Search Strategies.

    Science.gov (United States)

    Rahman, Qazi; Sharp, Jonathan; McVeigh, Meadhbh; Ho, Man-Ling

    2017-07-01

    Spatial abilities are generally hypothesized to differ between men and women, and people with different sexual orientations. According to the cross-sex shift hypothesis, gay men are hypothesized to perform in the direction of heterosexual women and lesbian women in the direction of heterosexual men on cognitive tests. This study investigated sexual orientation differences in spatial navigation and strategy during a virtual Morris water maze task (VMWM). Forty-four heterosexual men, 43 heterosexual women, 39 gay men, and 34 lesbian/bisexual women (aged 18-54 years) navigated a desktop VMWM and completed measures of intelligence, handedness, and childhood gender nonconformity (CGN). We quantified spatial learning (hidden platform trials), probe trial performance, and cued navigation (visible platform trials). Spatial strategies during hidden and probe trials were classified into visual scanning, landmark use, thigmotaxis/circling, and enfilading. In general, heterosexual men scored better than women and gay men on some spatial learning and probe trial measures and used more visual scan strategies. However, some differences disappeared after controlling for age and estimated IQ (e.g., in visual scanning heterosexual men differed from women but not gay men). Heterosexual women did not differ from lesbian/bisexual women. For both sexes, visual scanning predicted probe trial performance. More feminine CGN scores were associated with lower performance among men and greater performance among women on specific spatial learning or probe trial measures. These results provide mixed evidence for the cross-sex shift hypothesis of sexual orientation-related differences in spatial cognition.

  9. Geo-registration of Unprofessional and Weakly-related Image and Precision Evaluation

    Directory of Open Access Journals (Sweden)

    LIU Yingzhen

    2015-09-01

    3D geo-spatial models built from unprofessional, weakly related images are a significant source of geo-spatial information. Such images cannot yield useful geo-spatial information until they are geo-registered with accurate geo-spatial orientation and location. In this paper, we present an automatic geo-registration method using coordinates acquired by a real-time GPS module. We calculate 2D and 3D spatial transformation parameters based on the spatial similarity between the image locations in the geo-spatial coordinate system and in the 3D reconstruction coordinate system. Because of the poor precision of GPS information, and especially the instability of elevation measurements, we use the RANSAC algorithm to reject outliers. In the experiment, we compare the geo-registered image positions to their differential GPS coordinates. The errors of translation, rotation and scaling are evaluated quantitatively and the causes of poor results are analyzed. The experiment demonstrates that this geo-registration method can obtain a precise result given enough images.
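
    The RANSAC outlier-rejection step described above can be sketched as follows. This is a generic RANSAC estimator for a 2D similarity transform on synthetic data, under our own assumptions (2-point minimal samples, a fixed residual tolerance), not the paper's implementation.

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares 2D similarity transform dst ~= scale * R @ p + t (Umeyama-style)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    s, d = src - mu_s, dst - mu_d
    cov = d.T @ s / len(src)
    U, S, Vt = np.linalg.svd(cov)
    R = U @ Vt
    if np.linalg.det(R) < 0:          # guard against reflections
        U[:, -1] *= -1
        R = U @ Vt
    scale = S.sum() * len(src) / (s ** 2).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

def ransac_similarity(src, dst, iters=200, tol=1.0, seed=0):
    """Sample minimal 2-point models, keep the largest inlier set, refit on it."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), size=2, replace=False)
        scale, R, t = fit_similarity(src[idx], dst[idx])
        resid = np.linalg.norm(dst - (scale * src @ R.T + t), axis=1)
        inliers = resid < tol
        if inliers.sum() > best.sum():
            best = inliers
    return fit_similarity(src[best], dst[best]), best

# Synthetic reconstruction positions and "GPS" fixes with two gross outliers.
rng = np.random.default_rng(1)
src = rng.uniform(0.0, 100.0, size=(20, 2))
true_R = np.array([[0.0, -1.0], [1.0, 0.0]])          # 90-degree rotation
dst = 2.0 * src @ true_R.T + np.array([5.0, -3.0])    # scale 2, translation (5, -3)
dst[:2] += 500.0                                       # two bad fixes, e.g. elevation blunders
(scale, R, t), inliers = ransac_similarity(src, dst)
```

    The final model is refit on the consensus set only, so the two gross outliers do not contaminate the recovered scale, rotation and translation.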

  10. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
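
    As a toy illustration of the evidence-weighting idea in the abstract (our own scoring rule, not the patented method), each indicator either supports (+1) or refutes (-1) the hypothesis, and a weight expresses the strength of the evidence's association with it:

```python
# Toy evidence-weighting sketch: indicator names, support/refute signs and
# association weights are all invented for illustration.
evidence = [
    ("indicator A supports", +1, 0.8),
    ("indicator B refutes",  -1, 0.3),
    ("indicator C supports", +1, 0.5),
]

total_weight = sum(w for _, _, w in evidence)
score = sum(sign * w for _, sign, w in evidence)          # net weighted support
confidence = (score + total_weight) / (2 * total_weight)  # mapped to [0, 1]
```

    A confidence above 0.5 indicates that the weighted evidence on balance supports the hypothesis, which mirrors the "information regarding the accuracy of the hypothesis" output described above.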

  11. Usefulness of Models in Precision Nutrient Management

    DEFF Research Database (Denmark)

    Plauborg, Finn; Manevski, Kiril; Zhenjiang, Zhou

    Modern agriculture increasingly applies new methods and technologies to increase production and nutrient use efficiencies and at the same time reduce leaching of nutrients and greenhouse gas emissions. GPS-based ECa-measurement equipment (ER or EM instrumentation) is used to spatially characterise … and mineral composition. Mapping of crop status and the spatial-temporal variability within fields with red-infrared reflection is used to support decisions on split fertilisation and more precise dosing. The interpretation and use of these various data in precise nutrient management is not straightforward … of mineralisation. However, whether the crop would benefit from this depended to a large extent on soil hydraulic conductivity within the range of natural variation when testing the model. In addition, the initialisation of the distribution of soil total carbon and nitrogen into conceptual model compartments …

  12. The large numbers hypothesis and a relativistic theory of gravitation

    International Nuclear Information System (INIS)

    Lau, Y.K.; Prokhovnik, S.J.

    1986-01-01

    A way to reconcile Dirac's large numbers hypothesis and Einstein's theory of gravitation was recently suggested by Lau (1985). It is characterized by the conjecture of a time-dependent cosmological term and gravitational term in Einstein's field equations. Motivated by this conjecture and the large numbers hypothesis, we formulate here a scalar-tensor theory in terms of an action principle. The cosmological term is required to be spatially dependent as well as time dependent in general. The theory developed is applied to a cosmological model compatible with the large numbers hypothesis. The time-dependent form of the cosmological term and the scalar potential are then deduced. A possible explanation of the smallness of the cosmological term is also given, and the possible significance of the scalar field is speculated upon.

  13. Gelatin-based laser direct-write technique for the precise spatial patterning of cells.

    Science.gov (United States)

    Schiele, Nathan R; Chrisey, Douglas B; Corr, David T

    2011-03-01

    Laser direct-writing provides a method to pattern living cells in vitro, to study various cell-cell interactions, and to build cellular constructs. However, the materials typically used may limit its long-term application. By utilizing gelatin coatings on the print ribbon and growth surface, we developed a new approach for laser cell printing that overcomes the limitations of Matrigel™. Gelatin is free of growth factors and extraneous matrix components that may interfere with cellular processes under investigation. Gelatin-based laser direct-write was able to successfully pattern human dermal fibroblasts with high post-transfer viability (91% ± 3%) and no observed double-strand DNA damage. As seen with atomic force microscopy, gelatin offers a unique benefit in that it is present temporarily to allow cell transfer, but melts and is removed with incubation to reveal the desired application-specific growth surface. This provides unobstructed cellular growth after printing. Monitoring cell location after transfer, we show that melting and removal of gelatin does not affect cellular placement; cells maintained registry within 5.6 ± 2.5 μm of the initial pattern. This study demonstrates the effectiveness of gelatin in laser direct-writing to create spatially precise cell patterns with the potential for applications in tissue engineering, stem cell, and cancer research.

  14. Is Social Categorization Spatially Organized in a "Mental Line"? Empirical Evidences for Spatial Bias in Intergroup Differentiation.

    Science.gov (United States)

    Presaghi, Fabio; Rullo, Marika

    2018-01-01

    Social categorization is the differentiation between the self and others and between one's own group and other groups and it is such a natural and spontaneous process that often we are not aware of it. The way in which the brain organizes social categorization remains an unresolved issue. We present three experiments investigating the hypothesis that social categories are mentally ordered from left to right on an ingroup-outgroup continuum when membership is salient. To substantiate our hypothesis, we consider empirical evidence from two areas of psychology: research on differences in processing of ingroups and outgroups and research on the effects of spatial biases on processing of quantitative information (e.g., time; numbers) which appears to be arranged from left to right on a small-large continuum, an effect known as the spatial-numerical association of response codes (SNARC). In Experiments 1 and 2 we tested the hypothesis that when membership of a social category is activated, people implicitly locate ingroup categories to the left of a mental line whereas outgroup categories are located on the far right of the same mental line. This spatial organization persists even when stimuli are presented on one of the two sides of the screen and their (explicit) position is spatially incompatible with the implicit mental spatial organization of social categories (Experiment 3). Overall the results indicate that ingroups and outgroups are processed differently. The results are discussed with respect to social categorization theory, spatial agency bias, i.e., the effect observed in Western cultures whereby the agent of an action is mentally represented on the left and the recipient on the right, and the SNARC effect.

  15. Strong spatial genetic structure in five tropical Piper species: should the Baker–Fedorov hypothesis be revived for tropical shrubs?

    Science.gov (United States)

    Lasso, E; Dalling, J W; Bermingham, E

    2011-01-01

    Fifty years ago, Baker and Fedorov proposed that the high species diversity of tropical forests could arise from the combined effects of inbreeding and genetic drift leading to population differentiation and eventually to sympatric speciation. Decades of research, however have failed to support the Baker–Fedorov hypothesis (BFH), and it has now been discarded in favor of a paradigm where most trees are self-incompatible or strongly outcrossing, and where long-distance pollen dispersal prevents population drift. Here, we propose that several hyper-diverse genera of tropical herbs and shrubs, including Piper (>1,000 species), may provide an exception. Species in this genus often have aggregated, high-density populations with self-compatible breeding systems; characteristics which the BFH would predict lead to high local genetic differentiation. We test this prediction for five Piper species on Barro Colorado Island, Panama, using Amplified Fragment Length Polymorphism (AFLP) markers. All species showed strong genetic structure at both fine- and large-spatial scales. Over short distances (200–750 m) populations showed significant genetic differentiation (Fst 0.11–0.46, P < 0.05), with values of spatial genetic structure that exceed those reported for other tropical tree species (Sp = 0.03–0.136). This genetic structure probably results from the combined effects of limited seed and pollen dispersal, clonal spread, and selfing. These processes are likely to have facilitated the diversification of populations in response to local natural selection or genetic drift and may explain the remarkable diversity of this rich genus. PMID:22393518

  16. Accuracy and precision of oscillometric blood pressure in standing conscious horses

    DEFF Research Database (Denmark)

    Olsen, Emil; Pedersen, Tilde Louise Skovgaard; Robinson, Rebecca

    2016-01-01

    HYPOTHESIS/OBJECTIVE: To evaluate the accuracy and precision of systolic arterial pressure (SAP), diastolic arterial pressure (DAP), and mean arterial pressure (MAP) obtained with an oscillometric NIBP device in conscious horses from a teaching and research herd, compared to invasively measured arterial blood pressure … administration. Agreement analysis with replicate measures was utilized to calculate bias (accuracy) and standard deviation (SD) of bias (precision). RESULTS: A total of 252 pairs of invasive arterial BP and NIBP measurements were analyzed. Compared to the direct BP measures, the NIBP MAP had an accuracy of -4 mm Hg and a precision of 10 mm Hg. SAP had an accuracy of -8 mm Hg and a precision of 17 mm Hg, and DAP had an accuracy of -7 mm Hg and a precision of 14 mm Hg. CONCLUSIONS AND CLINICAL RELEVANCE: MAP from the evaluated NIBP monitor is accurate and precise in the adult horse across a range of BP …
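
    The bias/precision computation described—mean difference as accuracy, SD of the differences as precision—can be sketched on made-up paired readings. A real agreement analysis with replicate measures would additionally adjust the SD for repeated measurements per horse; this sketch omits that step.

```python
import statistics

# Made-up paired pressures in mm Hg (not study data): direct arterial
# readings versus simultaneous oscillometric NIBP readings.
invasive      = [92, 101, 88, 110, 95, 99, 105, 90]
oscillometric = [88,  99, 82, 104, 92, 96, 100, 85]

diffs = [o - i for o, i in zip(oscillometric, invasive)]
bias = statistics.mean(diffs)        # accuracy: mean device-minus-reference
precision = statistics.stdev(diffs)  # precision: SD of the differences
```

    A negative bias, as here, means the device reads lower than the reference on average, matching the sign convention of the accuracies reported above.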

  17. Modulation of the Object/Background Interaction by Spatial Frequency

    Directory of Open Access Journals (Sweden)

    Yanju Ren

    2011-05-01

    With regard to the relationship between object and background perception in natural scene images, a functional isolation hypothesis and an interactive hypothesis have been proposed. Building on previous studies, the present study investigated the role of spatial frequency in the relationship between object and background perception in natural scene images. In three experiments, participants reported the object, the background, or both after seeing each picture for 500 ms followed by a mask. The authors found that (a) backgrounds were identified more accurately when they contained a consistent rather than an inconsistent object, independently of spatial frequency; (b) objects were identified more accurately in a consistent than an inconsistent background under the condition of low spatial frequencies but not high spatial frequencies; and (c) spatial frequency modulation remained when both objects and backgrounds were reported simultaneously. The authors conclude that object/background interaction is partially dependent on spatial frequency.

  18. Seeing via miniature eye movements: A dynamic hypothesis for vision

    Directory of Open Access Journals (Sweden)

    Ehud Ahissar

    2012-11-01

    During natural viewing, the eyes are never still. Even during fixation, miniature movements of the eyes move the retinal image across tens of foveal photoreceptors. Most theories of vision implicitly assume that the visual system ignores these movements and somehow overcomes the resulting smearing. However, evidence has accumulated to indicate that fixational eye movements cannot be ignored by the visual system if fine spatial details are to be resolved. We argue that the only way the visual system can achieve its high resolution given its fixational movements is by seeing via these movements. Seeing via eye movements also eliminates the instability of the image, which would otherwise be induced by them. Here we present a hypothesis for vision in which coarse details are spatially encoded in gaze-related coordinates and fine spatial details are temporally encoded in relative retinal coordinates. The temporal encoding presented here achieves its highest resolution by encoding along the elongated axes of simple-cell receptive fields, not across these axes as suggested by spatial models of vision. According to our hypothesis, fine details of shape are encoded by inter-receptor temporal phases, texture by instantaneous intra-burst rates of individual receptors, and motion by inter-burst temporal frequencies. We further describe the ability of the visual system to read out the encoded information and recode it internally. We show how the reading out of retinal signals can be facilitated by neuronal phase-locked loops (NPLLs), which lock to the retinal jitter; this locking enables recoding of motion information and temporal framing of shape and texture processing. A possible implementation of this locking-and-recoding process by specific thalamocortical loops is suggested. Overall, it is suggested that high-acuity vision is based primarily on temporal mechanisms of the sort presented here and low-acuity vision is based primarily on spatial mechanisms.

  19. Simple Syllabic Calls Accompany Discrete Behavior Patterns in Captive Pteronotus parnellii: An Illustration of the Motivation-Structure Hypothesis

    Directory of Open Access Journals (Sweden)

    Matthew J. Clement

    2012-01-01

    Mustached bats, Pteronotus parnellii, are highly social and vocal. Individuals of this species roost in tight clusters, and emit an acoustically rich repertoire of calls whose behavioral significance is largely unknown. We recorded their social and vocal behaviors within a colony housed under semi-natural conditions. We also quantified the spatial spread of each bat’s roosting location and discovered that this was relatively fixed and roughly confined to an individual’s body width. The spatial precision in roosting was accompanied by an equally remarkable match between specific vocalizations and well-timed, discrete, identifiable postures/behaviors, as revealed by logistic regression analysis. The bodily behaviors included crouching, marking, yawning, nipping, flicking, fighting, kissing, inspecting, and fly-bys. Two echolocation-like calls were used to maintain spacing in the colony, two noisy broadband calls were emitted during fights, two tonal calls conveyed fear, and another tonal call signaled appeasement. Overall, the results establish that mustached bats exhibit complex social interactions common to other social mammals. The correspondence of relatively low frequency and noisy, broadband calls with aggression, and of tonal, high frequency calls with fear supports Morton’s Motivation-Structure hypothesis, and establishes a link between motivation and the acoustic structure of social calls emitted by mustached bats.

  20. Influence of local topography on precision irrigation management

    Science.gov (United States)

    Precision irrigation management is currently accomplished using spatial information about soil properties through soil series maps or electrical conductivity (EC) measurements. Crop yield, however, is consistently influenced by local topography, both in rain-fed and irrigated environments. Utilizing ...

  1. A Modified Version of Taylor’s Hypothesis for Solar Probe Plus Observations

    Science.gov (United States)

    Klein, Kristopher G.; Perez, Jean C.; Verscharen, Daniel; Mallet, Alfred; Chandran, Benjamin D. G.

    2015-03-01

    The Solar Probe Plus (SPP) spacecraft will explore the near-Sun environment, reaching heliocentric distances less than 10 R⊙. Near Earth, spacecraft measurements of fluctuating velocities and magnetic fields taken in the time domain are translated into information about the spatial structure of the solar wind via Taylor’s “frozen turbulence” hypothesis. Near the perihelion of SPP, however, the solar-wind speed is comparable to the Alfvén speed, and Taylor’s hypothesis in its usual form does not apply. In this paper, we show that under certain assumptions, a modified version of Taylor’s hypothesis can be recovered in the near-Sun region. We consider only the transverse, non-compressive component of the fluctuations at length scales exceeding the proton gyroradius, and we describe these fluctuations using an approximate theoretical framework developed by Heinemann and Olbert. We show that fluctuations propagating away from the Sun in the plasma frame obey a relation analogous to Taylor’s hypothesis when V_sc,⊥ ≫ z⁻ and z⁺ ≫ z⁻, where V_sc,⊥ is the component of the spacecraft velocity perpendicular to the mean magnetic field and z⁺ (z⁻) is the Elsasser variable corresponding to transverse, non-compressive fluctuations propagating away from (toward) the Sun in the plasma frame. Observations and simulations suggest that, in the near-Sun solar wind, the above inequalities are satisfied and z⁺ fluctuations account for most of the fluctuation energy. The modified form of Taylor’s hypothesis that we derive may thus make it possible to characterize the spatial structure of the energetically dominant component of the turbulence encountered by SPP.
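
    In simplified terms, when the perpendicular spacecraft speed and the outward Elsasser amplitude both dominate the inward amplitude, a measured time lag τ maps to a perpendicular spatial lag ℓ ≈ V_sc,⊥ τ, in analogy with Taylor's x = U t. The numbers below are illustrative placeholders, not SPP mission values, and the factor-of-five margin used to read "≫" numerically is our own choice:

```python
# Illustrative check of the modified-Taylor conditions and the resulting
# time-to-space mapping. All values are invented placeholders.
V_sc_perp = 180.0   # km/s, spacecraft speed perpendicular to the mean field
z_plus    = 300.0   # km/s, outward-propagating Elsasser amplitude
z_minus   = 20.0    # km/s, inward-propagating Elsasser amplitude

MARGIN = 5.0        # arbitrary numerical margin standing in for ">>"
conditions_hold = (V_sc_perp > MARGIN * z_minus) and (z_plus > MARGIN * z_minus)

tau = 10.0                   # s, time lag in the measured record
ell = V_sc_perp * tau        # km, inferred perpendicular spatial lag
```

    When the conditions fail (e.g., comparable z⁺ and z⁻ amplitudes), the simple lag conversion is no longer justified, which is the regime where the usual Taylor's hypothesis also breaks down.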

  2. Is Social Categorization Spatially Organized in a “Mental Line”? Empirical Evidences for Spatial Bias in Intergroup Differentiation

    Directory of Open Access Journals (Sweden)

    Fabio Presaghi

    2018-02-01

    Social categorization is the differentiation between the self and others and between one’s own group and other groups and it is such a natural and spontaneous process that often we are not aware of it. The way in which the brain organizes social categorization remains an unresolved issue. We present three experiments investigating the hypothesis that social categories are mentally ordered from left to right on an ingroup–outgroup continuum when membership is salient. To substantiate our hypothesis, we consider empirical evidence from two areas of psychology: research on differences in processing of ingroups and outgroups and research on the effects of spatial biases on processing of quantitative information (e.g., time; numbers), which appears to be arranged from left to right on a small–large continuum, an effect known as the spatial-numerical association of response codes (SNARC). In Experiments 1 and 2 we tested the hypothesis that when membership of a social category is activated, people implicitly locate ingroup categories to the left of a mental line whereas outgroup categories are located on the far right of the same mental line. This spatial organization persists even when stimuli are presented on one of the two sides of the screen and their (explicit) position is spatially incompatible with the implicit mental spatial organization of social categories (Experiment 3). Overall the results indicate that ingroups and outgroups are processed differently. The results are discussed with respect to social categorization theory, spatial agency bias, i.e., the effect observed in Western cultures whereby the agent of an action is mentally represented on the left and the recipient on the right, and the SNARC effect.

  3. Using Big Data Analytics to Advance Precision Radiation Oncology.

    Science.gov (United States)

    McNutt, Todd R; Benedict, Stanley H; Low, Daniel A; Moore, Kevin; Shpitser, Ilya; Jiang, Wei; Lakshminarayanan, Pranav; Cheng, Zhi; Han, Peijin; Hui, Xuan; Nakatsugawa, Minoru; Lee, Junghoon; Moore, Joseph A; Robertson, Scott P; Shah, Veeraj; Taylor, Russ; Quon, Harry; Wong, John; DeWeese, Theodore

    2018-06-01

    Big clinical data analytics as a primary component of precision medicine is discussed, identifying where these emerging tools fit in the spectrum of genomics and radiomics research. A learning health system (LHS) is conceptualized that uses clinically acquired data with machine learning to advance the initiatives of precision medicine. The LHS is comprehensive and can be used for clinical decision support, discovery, and hypothesis derivation. These developing uses can positively impact the ultimate management and therapeutic course for patients. The conceptual model for each use of clinical data, however, is different, and an overview of the implications is discussed. With advancements in technologies and culture to improve the efficiency, accuracy, and breadth of measurements of the patient condition, the concept of an LHS may be realized in precision radiation therapy. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Precision Mass Measurement of Argon Isotopes

    CERN Multimedia

    Lunney, D

    2002-01-01

    IS388: A precision mass measurement of the neutron-deficient isotopes ³²,³³,³⁴Ar is proposed. Mass values of these isotopes are of importance for: a) a stringent test of the Isobaric-Multiplet-Mass-Equation, b) a verification of the correctness of calculated charge-dependent corrections as used in super-allowed β-decay studies aiming at a test of the CVC hypothesis, and c) the determination of the kinematics in electron-neutrino correlation experiments searching for scalar currents in weak interaction. The measurements will be carried out with the ISOLTRAP Penning trap mass spectrometer.

  5. Isotopic Resonance Hypothesis: Experimental Verification by Escherichia coli Growth Measurements

    Science.gov (United States)

    Xie, Xueshu; Zubarev, Roman A.

    2015-03-01

    Isotopic composition of reactants affects the rates of chemical and biochemical reactions. As a rule, enrichment in heavy stable isotopes leads to progressively slower reactions. But the recent isotopic resonance hypothesis suggests that the dependence of the reaction rate upon the enrichment degree is not monotonic. Instead, at some "resonance" isotopic compositions the kinetics accelerate, while at "off-resonance" compositions the same reactions progress more slowly. To test the predictions of this hypothesis for the elements C, H, N and O, we designed a precise (standard error ±0.05%) experiment that measures the parameters of bacterial growth in minimal media of varying isotopic composition. A number of predicted resonance conditions were tested, with significant enhancements in kinetics discovered at these conditions. The combined statistics strongly support the validity of the isotopic resonance phenomenon, with implications for biotechnology, medicine, chemistry and other areas.

  6. Precision requirements for space-based X(CO2) data

    International Nuclear Information System (INIS)

    Miller, C.E.; Crisp, D.; Miller, C.E.; Salawitch, J.; Sander, S.P.; Sen, B.; Toon, C.; DeCola, P.L.; Olsen, S.C.; Randerson, J.T.; Michalak, A.M.; Alkhaled, A.; Michalak, A.M.; Rayner, P.; Jacob, D.J.; Suntharalingam, P.; Wofsy, S.C.; Jacob, D.J.; Suntharalingam, P.; Wofsy, S.C.; Jones, D.B.A.; Denning, A.S.; Nicholls, M.E.; O'Brien, D.; Doney, S.C.; Pawson, S.; Pawson, S.; Connor, B.J.; Fung, I.Y.; Tans, P.; Wennberg, P.O.; Yung, Y.L.; Law, R.M.

    2007-01-01

    Precision requirements are determined for space-based column-averaged CO2 dry air mole fraction X(CO2) data. These requirements result from an assessment of spatial and temporal gradients in X(CO2), the relationship between X(CO2) precision and surface CO2 flux uncertainties inferred from inversions of the X(CO2) data, and the effects of X(CO2) biases on the fidelity of CO2 flux inversions. Observational system simulation experiments and synthesis inversion modeling demonstrate that the Orbiting Carbon Observatory mission design and sampling strategy provide the means to achieve these X(CO2) data precision requirements. (authors)

  7. Modelling firm heterogeneity with spatial 'trends'

    Energy Technology Data Exchange (ETDEWEB)

    Sarmiento, C. [North Dakota State University, Fargo, ND (United States). Dept. of Agricultural Business & Applied Economics]

    2004-04-15

    The hypothesis underlying this article is that firm heterogeneity can be captured by spatial characteristics of the firm (similar to the inclusion of a time trend in time series models). The hypothesis is examined in the context of modelling electric generation by coal powered plants in the presence of firm heterogeneity.

  8. High-Precision Half-Life Measurement for the Superallowed β+ Emitter 26mAl

    Science.gov (United States)

    Finlay, P.; Ettenauer, S.; Ball, G. C.; Leslie, J. R.; Svensson, C. E.; Andreoiu, C.; Austin, R. A. E.; Bandyopadhyay, D.; Cross, D. S.; Demand, G.; Djongolov, M.; Garrett, P. E.; Green, K. L.; Grinyer, G. F.; Hackman, G.; Leach, K. G.; Pearson, C. J.; Phillips, A. A.; Sumithrarachchi, C. S.; Triambak, S.; Williams, S. J.

    2011-01-01

    A high-precision half-life measurement for the superallowed β+ emitter 26mAl was performed at the TRIUMF-ISAC radioactive ion beam facility, yielding T1/2 = 6346.54 ± 0.46(stat) ± 0.60(syst) ms, consistent with, but 2.5 times more precise than, the previous world average. The 26mAl half-life and ft value, 3037.53(61) s, are now the most precisely determined for any superallowed β decay. Combined with recent theoretical corrections for isospin-symmetry-breaking and radiative effects, the corrected Ft value for 26mAl, 3073.0(12) s, sets a new benchmark for the high-precision superallowed Fermi β-decay studies used to test the conserved vector current hypothesis and determine the Vud element of the Cabibbo-Kobayashi-Maskawa quark mixing matrix.

  9. Advanced methods and algorithm for high precision astronomical imaging

    International Nuclear Information System (INIS)

    Ngole-Mboula, Fred-Maurice

    2016-01-01

    One of the biggest challenges of modern cosmology is to gain more precise knowledge of the nature of dark energy and dark matter. Fortunately, dark matter can be traced directly through its gravitational effect on galaxy shapes. The European Space Agency's Euclid mission will provide data for precisely this purpose. A critical step in analyzing these data will be to accurately model the instrument's Point Spread Function (PSF), which is the focus of this thesis. We developed non-parametric methods to reliably estimate the PSFs across an instrument's field of view, based on images of unresolved stars and accounting for noise, undersampling, and the PSFs' spatial variability. At the core of these contributions are modern mathematical tools and concepts such as sparsity. An important extension of this work will be to account for the PSFs' wavelength dependency. (author) [fr]

  10. Poultry, pig and the risk of BSE following the feed ban in France--a spatial analysis.

    Science.gov (United States)

    Abrial, David; Calavas, Didier; Jarrige, Nathalie; Ducrot, Christian

    2005-01-01

    A spatial analysis was carried out in order to analyse why the risk of Bovine Spongiform Encephalopathy (BSE) was spatially heterogeneous in France during the period following the ban on feeding meat and bone meal to cattle. The hypothesis of cross-contamination between cattle feedstuff and monogastric feedstuff, which was strongly suggested by previous investigations, was assessed, under the assumption that the higher the pig or poultry density in a given area, the higher the risk of cross-contamination and cattle infection might be. The data concerned the 467 BSE cases born in France after the ban of meat and bone meal (July 1990) and detected between July 1st, 2001 and December 31st, 2003, when the surveillance system was optimal and not spatially biased. The disease mapping models were built with Bayesian graphical modelling methods and based on a Poisson distribution with spatial smoothing (a hierarchical approach) and covariates. The parameters were estimated by a Markov chain Monte Carlo simulation method. The main result was that poultry density did not significantly influence the risk of BSE, whereas pig density was significantly associated with an increase in the risk of 2.4% per 10 000 pigs. The areas with a significant pig effect were located in regions with a high pig density as well as a high ratio of pigs to cattle. Despite the absence of a global effect of poultry density on the BSE risk, some areas had a significant poultry effect, and the risk was better explained in some others when considering both pig and poultry densities. These findings were in agreement with the hypothesis of cross-contamination, which could take place at the feedstuff factory, during the shipment of feed or on the farm. Further studies are needed to explore more precisely how the cross-contamination happened.
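For intuition, the reported 2.4% risk increase per 10 000 pigs compounds multiplicatively under the log-linear Poisson model used in such disease mapping. A minimal illustrative calculation (not the study's data):

```python
import numpy as np

# Effect reported in the abstract: +2.4% BSE risk per 10 000 pigs.
# In a log-linear Poisson model this is a coefficient per 10 000-pig unit.
beta = np.log(1.024)

# Implied relative risk for an area with 50 000 pigs versus none
rr = np.exp(beta * 5)
print(round(rr, 3))  # → 1.126
```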

  11. Precision half-life measurement of 11C: The most precise mirror transition Ft value

    Science.gov (United States)

    Valverde, A. A.; Brodeur, M.; Ahn, T.; Allen, J.; Bardayan, D. W.; Becchetti, F. D.; Blankstein, D.; Brown, G.; Burdette, D. P.; Frentz, B.; Gilardy, G.; Hall, M. R.; King, S.; Kolata, J. J.; Long, J.; Macon, K. T.; Nelson, A.; O'Malley, P. D.; Skulski, M.; Strauss, S. Y.; Vande Kolk, B.

    2018-03-01

    Background: The precise determination of the Ft value in T = 1/2 mixed mirror decays is an important avenue for testing the standard model of the electroweak interaction through the determination of Vud in nuclear β decays. 11C is an interesting case, as its low mass and small QEC value make it particularly sensitive to violations of the conserved vector current hypothesis. The present dominant source of uncertainty in the 11C Ft value is the half-life. Purpose: A high-precision measurement of the 11C half-life was performed, and a new world average half-life was calculated. Method: 11C was created by transfer reactions and separated using the TwinSol facility at the Nuclear Science Laboratory at the University of Notre Dame. It was then implanted into a tantalum foil, and β counting was used to determine the half-life. Results: The new half-life, t1/2 = 1220.27(26) s, is consistent with the previous values but significantly more precise. A new world average was calculated, t1/2(world) = 1220.41(32) s, and a new estimate for the Gamow-Teller to Fermi mixing ratio ρ is presented along with standard model correlation parameters. Conclusions: The new 11C world average half-life allows the calculation of an Ft(mirror) value that is now the most precise for all superallowed mixed mirror transitions. This gives a strong impetus for an experimental determination of ρ, to allow the determination of Vud from this decay.
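Extracting a half-life from β-counting data can be illustrated with a toy log-linear fit on synthetic, noise-free counts (a generic sketch; the actual analysis must handle backgrounds, dead time, and systematics that this omits):

```python
import numpy as np

def fit_half_life(t, counts):
    """Estimate a half-life from decay-counting data via a log-linear
    least-squares fit: ln N(t) = ln N0 - (ln 2 / t_half) * t."""
    slope, _ = np.polyfit(t, np.log(counts), 1)
    return -np.log(2) / slope

# Synthetic, noise-free 11C-like decay curve (true half-life 1220.27 s)
t = np.linspace(0, 6000, 61)
counts = 1e6 * 0.5 ** (t / 1220.27)
print(round(fit_half_life(t, counts), 2))  # → 1220.27
```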

  12. Large numbers hypothesis. II - Electromagnetic radiation

    Science.gov (United States)

    Adams, P. J.

    1983-01-01

    This paper develops the theory of electromagnetic radiation in the units covariant formalism incorporating Dirac's large numbers hypothesis (LNH). A direct field-to-particle technique is used to obtain the photon propagation equation, which explicitly involves the photon replication rate. This replication rate is fixed uniquely by requiring that the form of a free-photon distribution function be preserved, as required by the 2.7 K cosmic radiation. One finds that with this particular photon replication rate the units covariant formalism developed in Paper I actually predicts that the ratio of photon number to proton number in the universe varies as t^(1/4), precisely in accord with LNH. The cosmological red-shift law is also derived and shown to differ considerably from the standard form νR = const.

  13. Problems in implementation of the spatial plan of the Republic of Srpska until 2015: Quantitative analysis

    Directory of Open Access Journals (Sweden)

    Bijelić Branislav

    2017-01-01

    The implementation of spatial plans in the Republic of Srpska is certainly the weakest phase of the spatial planning process in this entity. This is particularly evident in the case of the Spatial Plan of the Republic of Srpska until 2015, the highest strategic spatial planning document in the Republic of Srpska. More precisely, the implementation of spatial plans is defined as the carrying out of spatial planning documents, i.e. the planning propositions defined in those plans. For the purpose of this paper, a quantitative analysis of the implementation of the planning propositions envisioned by this document was carried out. The difference between what was planned and what was implemented at the end of the planning period (an ex-post evaluation of planning decisions) is presented in this paper. A weighting factor is defined for each thematic field and planning proposition, where the main criterion for determining the weighting factor is the share of the planning proposition and thematic field in the estimated total costs of the plan (the financial criterion). The paper also tackles the implementation of the Spatial Plan of Bosnia and Herzegovina for the period 1981-2000, as well as of the Spatial Plan of the Republic of Srpska 1996-2001 - Phased Plan for the period 1996-2001, the previous strategic spatial planning documents of the highest rank covering the area of the Republic of Srpska. The research results have proven the primary hypothesis of the paper: that the level of implementation of the Spatial Plan of the Republic of Srpska until 2015 is less than 10%.
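The cost-weighted aggregation described above can be sketched in a few lines. The field names, weights, and implementation levels below are illustrative placeholders, not the plan's actual figures:

```python
# Hypothetical weighting of thematic fields by their share of the plan's
# estimated total costs (the article's financial criterion).
fields = {
    "infrastructure": {"weight": 0.50, "implemented": 0.08},
    "energy":         {"weight": 0.30, "implemented": 0.10},
    "environment":    {"weight": 0.20, "implemented": 0.12},
}

# Overall implementation level = cost-weighted mean of per-field levels
total = sum(f["weight"] * f["implemented"] for f in fields.values())
print(f"{total:.1%}")  # → 9.4%
```

With these placeholder numbers the weighted level lands below 10%, consistent with the paper's finding.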

  14. Spatial analysis of NDVI readings with difference sampling density

    Science.gov (United States)

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  15. Inferring spatial clouds statistics from limited field-of-view, zenith observations

    Energy Technology Data Exchange (ETDEWEB)

    Sun, C.H.; Thorne, L.R. [Sandia National Labs., Livermore, CA (United States)]

    1996-04-01

    Many of the Cloud and Radiation Testbed (CART) measurements produce a time series of zenith observations, but spatial averages are often the desired data product. One possible approach to deriving spatial averages from temporal averages is to invoke Taylor's hypothesis where and when it is valid. Taylor's hypothesis states that when the turbulence is small compared with the mean flow, the covariance in time is related to the covariance in space by the speed of the mean flow. For cloud fields, Taylor's hypothesis would apply when the "local" turbulence is small compared with the advective flow (mean wind). The objective of this study is to determine under what conditions Taylor's hypothesis holds or does not hold true for broken cloud fields.
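Taylor's hypothesis itself reduces to a simple conversion between temporal and spatial lags, sketched below (function name and numbers are illustrative):

```python
def temporal_to_spatial_lag(tau_seconds, mean_wind_mps):
    """Taylor's frozen-turbulence hypothesis: a temporal lag tau in a
    zenith-pointing time series maps to a spatial lag r = U * tau,
    valid only while turbulent fluctuations are small relative to U."""
    return mean_wind_mps * tau_seconds

# A 10-minute autocorrelation time under 8 m/s advection corresponds
# to a spatial decorrelation scale of about 4.8 km.
print(temporal_to_spatial_lag(600.0, 8.0))  # → 4800.0
```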

  16. IQ as moderator of terminal decline in perceptual and motor speed, spatial, and verbal ability: Testing the cognitive reserve hypothesis in a population-based sample followed from age 70 until death.

    Science.gov (United States)

    Thorvaldsson, Valgeir; Skoog, Ingmar; Johansson, Boo

    2017-03-01

    Terminal decline (TD) refers to acceleration in within-person cognitive decline prior to death. The cognitive reserve hypothesis postulates that individuals with higher IQ are able to better tolerate age-related increase in brain pathologies. On average, they will exhibit a later onset of TD, but once they start to decline, their trajectory is steeper relative to those with lower IQ. We tested these predictions using data from initially nondemented individuals (n = 179) in the H70-study repeatedly measured at ages 70, 75, 79, 81, 85, 88, 90, 92, 95, 97, 99, and 100, or until death, on cognitive tests of perceptual-and-motor-speed and spatial and verbal ability. We quantified IQ using the Raven's Coloured Progressive Matrices (RCPM) test administrated at age 70. We fitted random change point TD models to the data, within a Bayesian framework, conditioned on IQ, age of death, education, and sex. In line with predictions, we found that 1 additional standard deviation on the IQ scale was associated with a delay in onset of TD by 1.87 (95% highest density interval [HDI; 0.20, 4.08]) years on speed, 1.96 (95% HDI [0.15, 3.54]) years on verbal ability, but only 0.88 (95% HDI [-0.93, 3.49]) year on spatial ability. Higher IQ was associated with steeper rate of decline within the TD phase on measures of speed and verbal ability, whereas results on spatial ability were nonconclusive. Our findings provide partial support for the cognitive reserve hypothesis and demonstrate that IQ can be a significant moderator of cognitive change trajectories in old age. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
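The random change point trajectory model behind such analyses can be approximated, very roughly, by a broken-stick least-squares fit with a grid search over candidate change points (synthetic data below; the study itself fits Bayesian random change point models, not this simplification):

```python
import numpy as np

def fit_change_point(age, y, grid):
    """Grid-search least-squares fit of a broken-stick trajectory:
    slope b1 before the change point cp, slope b1 + b2 after it.
    A crude frequentist stand-in for Bayesian random change point models."""
    best = None
    for cp in grid:
        # Design matrix: intercept, linear age term, post-change-point term
        X = np.column_stack([np.ones_like(age), age, np.clip(age - cp, 0, None)])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        sse = np.sum((y - X @ beta) ** 2)
        if best is None or sse < best[0]:
            best = (sse, cp, beta)
    return best[1], best[2]

rng = np.random.default_rng(0)
age = np.linspace(70, 100, 61)
# True trajectory: gentle decline, then steep terminal decline from age 92
true = 30 - 0.1 * (age - 70) - 0.8 * np.clip(age - 92, 0, None)
cp, beta = fit_change_point(age, true + rng.normal(0, 0.3, age.size), np.arange(85, 99))
print(cp)  # recovered onset of terminal decline, near the true value of 92
```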

  17. Support for distinct subcomponents of spatial working memory: a double dissociation between spatial-simultaneous and spatial-sequential performance in unilateral neglect.

    Science.gov (United States)

    Wansard, Murielle; Bartolomeo, Paolo; Bastin, Christine; Segovia, Fermín; Gillet, Sophie; Duret, Christophe; Meulemans, Thierry

    2015-01-01

    Over the last decade, many studies have demonstrated that visuospatial working memory (VSWM) can be divided into separate subsystems dedicated to the retention of visual patterns and their serial order. Impaired VSWM has been suggested to exacerbate left visual neglect in right-brain-damaged individuals. The aim of this study was to investigate the segregation between spatial-sequential and spatial-simultaneous working memory in individuals with neglect. We demonstrated that patterns of results on these VSWM tasks can be dissociated. Spatial-simultaneous and sequential aspects of VSWM can be selectively impaired in unilateral neglect. Our results support the hypothesis of multiple VSWM subsystems, which should be taken into account to better understand neglect-related deficits.

  18. Hypothesis Designs for Three-Hypothesis Test Problems

    OpenAIRE

    Yan Li; Xiaolong Pu

    2010-01-01

    As a helpful guide for applications, the alternative hypotheses of the three-hypothesis test problems are designed under the required error probabilities and average sample number in this paper. The asymptotic formulas and the proposed numerical quadrature formulas are adopted, respectively, to obtain the hypothesis designs and the corresponding sequential test schemes under the Koopman-Darmois distributions. The example of the normal mean test shows that our methods are qu...

  19. Spatial working memory for locations specified by vision and audition: testing the amodality hypothesis.

    Science.gov (United States)

    Loomis, Jack M; Klatzky, Roberta L; McHugh, Brendan; Giudice, Nicholas A

    2012-08-01

    Spatial working memory can maintain representations from vision, hearing, and touch, representations referred to here as spatial images. The present experiment addressed whether spatial images from vision and hearing that are simultaneously present within working memory retain modality-specific tags or are amodal. Observers were presented with short sequences of targets varying in angular direction, with the targets in a given sequence being all auditory, all visual, or a sequential mixture of the two. On two thirds of the trials, one of the locations was repeated, and observers had to respond as quickly as possible when detecting this repetition. Ancillary detection and localization tasks confirmed that the visual and auditory targets were perceptually comparable. Response latencies in the working memory task showed small but reliable costs in performance on trials involving a sequential mixture of auditory and visual targets, as compared with trials of pure vision or pure audition. These deficits were statistically reliable only for trials on which the modalities of the matching location switched from the penultimate to the final target in the sequence, indicating a switching cost. The switching cost for the pair in immediate succession means that the spatial images representing the target locations retain features of the visual or auditory representations from which they were derived. However, there was no reliable evidence of a performance cost for mixed modalities in the matching pair when the second of the two did not immediately follow the first, suggesting that more enduring spatial images in working memory may be amodal.

  20. Time takes space: selective effects of multitasking on concurrent spatial processing.

    Science.gov (United States)

    Mäntylä, Timo; Coni, Valentina; Kubik, Veit; Todorov, Ivo; Del Missier, Fabio

    2017-08-01

    Many everyday activities require coordination and monitoring of complex relations of future goals and deadlines. Cognitive offloading may provide an efficient strategy for reducing control demands by representing future goals and deadlines as a pattern of spatial relations. We tested the hypothesis that multiple-task monitoring involves time-to-space transformational processes, and that these spatial effects are selective with greater demands on coordinate (metric) than categorical (nonmetric) spatial relation processing. Participants completed a multitasking session in which they monitored four series of deadlines, running on different time scales, while making concurrent coordinate or categorical spatial judgments. We expected and found that multitasking taxes concurrent coordinate, but not categorical, spatial processing. Furthermore, males showed a better multitasking performance than females. These findings provide novel experimental evidence for the hypothesis that efficient multitasking involves metric relational processing.

  1. Spatial Contiguity and Incidental Learning in Multimedia Environments

    Science.gov (United States)

    Paek, Seungoh; Hoffman, Daniel L.; Saravanos, Antonios

    2017-01-01

    Drawing on dual-process theories of cognitive function, the degree to which spatial contiguity influences incidental learning outcomes was examined. It was hypothesized that spatial contiguity would mediate what was learned even in the absence of an explicit learning goal. To test this hypothesis, 149 adults completed a multimedia-related task…

  2. An evaluation for spatial resolution, using a single target on a medical image

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Sung [Dept. of Radiotechnology, Cheju Halla University, Cheju (Korea, Republic of)]

    2016-12-15

    Hitherto, spatial resolution has commonly been evaluated using test patterns or phantoms built with specific distances (from close to far) between two objects (double targets). The shortcoming of this evaluation method is that the measurable resolution is restricted to the target distances of the phantoms built for the test. To solve this problem, this study proposes and verifies a new method to efficiently test spatial resolution with a single target. I used the PSF and JND to propose a way of measuring spatial resolution, and then carried out experiments with commonly used phantoms to verify the evaluation hypothesis inferred from this method. To analyse the hypothesis, I used the LabVIEW program and extracted a line of pixels from a digital image. The result was identical to the spatial-resolution hypothesis inferred from a single target. The findings of the experiment prove that a single target alone is enough to relatively evaluate spatial resolution on a digital image. In other words, the limit of the traditional spatial-resolution evaluation method, based on double targets, can be overcome by the new method using a single target.
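One way a single point-like target can yield a resolution estimate is to treat its line profile as a sampled PSF and derive a FWHM from it. The sketch below uses an intensity-weighted moment width (an assumption for illustration; the article's actual PSF/JND procedure is not specified in the abstract):

```python
import numpy as np

def fwhm_from_profile(x, profile):
    """Estimate spatial resolution (FWHM) from the line profile of a
    single point-like target, treating the profile as a sampled PSF.
    Uses intensity-weighted moments; FWHM = 2*sqrt(2*ln 2)*sigma
    for a Gaussian PSF."""
    w = profile / profile.sum()
    mu = np.sum(w * x)
    sigma = np.sqrt(np.sum(w * (x - mu) ** 2))
    return 2 * np.sqrt(2 * np.log(2)) * sigma

x = np.arange(-20, 21, dtype=float)         # pixel coordinates
psf = np.exp(-x**2 / (2 * 3.0**2))          # Gaussian PSF, sigma = 3 px
print(round(fwhm_from_profile(x, psf), 2))  # ~ 7.06 px (2.355 * 3)
```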

  3. Predicting recovery from acid rain using the micro-spatial heterogeneity of soil columns downhill the infiltration zone of beech stemflow: introduction of a hypothesis.

    Science.gov (United States)

    Berger, Torsten W; Muras, Alexander

    Release of stored sulfur may delay the recovery of soil pH from acid rain. It is hypothesized that analyzing the micro-spatial heterogeneity of soil columns downhill of a beech stem enables predictions of soil recovery as a function of historic acid loads and time. We demonstrated, in a very simplified approach, how these two factors may be untangled from each other using synthetic data. Thereafter, we evaluated the stated hypothesis based upon chemical soil data with increasing distance from the stems of beech trees. It is predicted that the top soil will recover from acid deposition, as already recorded in the infiltration zone of stemflow near the base of the stem. However, in the areas between trees, and especially in deeper soil horizons, recovery may be highly delayed.

  4. Life Origination Hydrate Hypothesis (LOH-Hypothesis)

    Directory of Open Access Journals (Sweden)

    Victor Ostrovskii

    2012-01-01

    The paper develops the Life Origination Hydrate Hypothesis (LOH-hypothesis), according to which living-matter simplest elements (LMSEs), which are N-bases, riboses, nucleosides, nucleotides, DNA- and RNA-like molecules, amino acids, and proto-cells, repeatedly originated on the basis of thermodynamically controlled, natural, and inevitable processes governed by universal physical and chemical laws from CH4, niters, and phosphates under the Earth's surface or seabed within the crystal cavities of the honeycomb methane-hydrate structure at low temperatures; the chemical processes passed slowly through all successive chemical steps in the direction determined by a gradual decrease in the Gibbs free energy of the reacting systems. The hypothesis formulation method is based on the thermodynamic directedness of natural movement and consists of an attempt to mentally backtrack on the progression of nature and thus reveal the principal milestones along its route. The changes in Gibbs free energy are estimated for different steps of the living-matter origination process; special attention is paid to the processes of proto-cell formation. Just the occurrence of the gas-hydrate periodic honeycomb matrix, filled with LMSEs almost completely in its final state, accounts for the size limitation in the DNA functional groups and the nonrandom location of N-bases in the DNA chains. The slowness of the low-temperature chemical transformations and their "thermodynamic front" guide the gross process of living-matter origination and its successive steps. It is shown that the hypothesis is thermodynamically justified and testable and that many observed natural phenomena count in its favor.

  5. Drift chambers for a large-area, high-precision muon spectrometer

    International Nuclear Information System (INIS)

    Alberini, C.; Bari, G.; Cara Romeo, G.; Cifarelli, L.; Del Papa, C.; Iacobucci, G.; Laurenti, G.; Maccarrone, G.; Massam, T.; Motta, F.; Nania, R.; Perotto, E.; Prisco, G.; Willutsky, M.; Basile, M.; Contin, A.; Palmonari, F.; Sartorelli, G.

    1987-01-01

    We have tested two prototypes of a high-precision drift chamber for a magnetic muon spectrometer. Results of the tests are presented, with special emphasis on efficiency and spatial resolution as a function of particle rate. (orig.)

  6. Mapping spatial patterns with morphological image processing

    Science.gov (United States)

    Peter Vogt; Kurt H. Riitters; Christine Estreguil; Jacek Kozak; Timothy G. Wade; James D. Wickham

    2006-01-01

    We use morphological image processing for classifying spatial patterns at the pixel level on binary land-cover maps. Land-cover pattern is classified as 'perforated,' 'edge,' 'patch,' and 'core' with higher spatial precision and thematic accuracy compared to a previous approach based on image convolution, while retaining the...

  7. Spatial synchrony in cisco recruitment

    Science.gov (United States)

    Myers, Jared T.; Yule, Daniel L.; Jones, Michael L.; Ahrenstorff, Tyler D.; Hrabik, Thomas R.; Claramunt, Randall M.; Ebener, Mark P.; Berglund, Eric K.

    2015-01-01

    We examined the spatial scale of recruitment variability for disparate cisco (Coregonus artedi) populations in the Great Lakes (n = 8) and Minnesota inland lakes (n = 4). We found that the scale of synchrony was approximately 400 km when all available data were utilized; much greater than the 50-km scale suggested for freshwater fish populations in an earlier global analysis. The presence of recruitment synchrony between Great Lakes and inland lake cisco populations supports the hypothesis that synchronicity is driven by climate and not dispersal. We also found synchrony in larval densities among three Lake Superior populations separated by 25–275 km, which further supports the hypothesis that broad-scale climatic factors are the cause of spatial synchrony. Among several candidate climate variables measured during the period of larval cisco emergence, maximum wind speeds exhibited the most similar spatial scale of synchrony to that observed for cisco. Other factors, such as average water temperatures, exhibited synchrony on broader spatial scales, which suggests they could also be contributing to recruitment synchrony. Our results provide evidence that abiotic factors can induce synchronous patterns of recruitment for populations of cisco inhabiting waters across a broad geographic range, and show that broad-scale synchrony of recruitment can occur in freshwater fish populations as well as those from marine systems.
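The notion of a spatial scale of synchrony can be sketched by correlating recruitment time series pairwise and pairing each correlation with the distance between populations (synthetic data below; real analyses use modified correlograms with bootstrapped confidence bands):

```python
import numpy as np

def pairwise_synchrony(series, coords):
    """Pearson correlation of recruitment time series for every pair of
    populations, paired with the distance between them. A minimal sketch
    of how a scale of synchrony is probed."""
    out = []
    n = len(series)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.corrcoef(series[i], series[j])[0, 1]
            d = abs(coords[i] - coords[j])   # 1-D positions, km
            out.append((d, r))
    return out

rng = np.random.default_rng(1)
climate = rng.normal(size=30)                # shared year effects (Moran effect)
series = [climate + rng.normal(0, 0.5, 30) for _ in range(4)]
coords = [0.0, 25.0, 150.0, 400.0]
for d, r in pairwise_synchrony(series, coords):
    print(f"{d:6.0f} km  r = {r:.2f}")
```

Because every synthetic population shares the same "climate" signal, correlations stay high even at 400 km, mimicking climate-driven rather than dispersal-driven synchrony.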

  8. Brain morphology of the threespine stickleback (Gasterosteus aculeatus) varies inconsistently with respect to habitat complexity: A test of the Clever Foraging Hypothesis.

    Science.gov (United States)

    Ahmed, Newaz I; Thompson, Cole; Bolnick, Daniel I; Stuart, Yoel E

    2017-05-01

    The Clever Foraging Hypothesis asserts that organisms living in a more spatially complex environment will have a greater neurological capacity for cognitive processes related to spatial memory, navigation, and foraging. Because the telencephalon is often associated with spatial memory and navigation tasks, this hypothesis predicts a positive association between telencephalon size and environmental complexity. The association between habitat complexity and brain size has been supported by comparative studies across multiple species but has not been widely studied at the within-species level. We tested for covariation between environmental complexity and neuroanatomy of threespine stickleback (Gasterosteus aculeatus) collected from 15 pairs of lakes and their parapatric streams on Vancouver Island. In most pairs, neuroanatomy differed between the adjoining lake and stream populations. However, the magnitude and direction of this difference were inconsistent between watersheds and did not covary strongly with measures of within-site environmental heterogeneity. Overall, we find weak support for the Clever Foraging Hypothesis in our study.

  9. A simulation of driven reconnection by a high precision MHD code

    International Nuclear Information System (INIS)

    Kusano, Kanya; Ouchi, Yasuo; Hayashi, Takaya; Horiuchi, Ritoku; Watanabe, Kunihiko; Sato, Tetsuya.

    1988-01-01

    A high-precision MHD code, which has fourth-order accuracy in both the spatial and time steps, is developed and applied to simulation studies of two-dimensional driven reconnection. It is confirmed that the numerical dissipation of this new scheme is much less than that of the two-step Lax-Wendroff scheme. The effect of plasma compressibility on the reconnection dynamics is investigated by means of this high-precision code. (author)
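The accuracy gain of a fourth-order scheme can be illustrated with the standard fourth-order central-difference first-derivative stencil (a generic sketch; the paper's actual discretization is not given in the abstract):

```python
import numpy as np

def ddx4(f, dx):
    """Fourth-order central difference on a periodic grid:
    f'(x) ~ (-f[i+2] + 8 f[i+1] - 8 f[i-1] + f[i-2]) / (12 dx).
    Higher-order stencils like this reduce the numerical error and
    dissipation that plague low-order schemes such as two-step
    Lax-Wendroff."""
    return (-np.roll(f, -2) + 8 * np.roll(f, -1)
            - 8 * np.roll(f, 1) + np.roll(f, 2)) / (12 * dx)

n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
err = np.max(np.abs(ddx4(np.sin(x), x[1] - x[0]) - np.cos(x)))
print(err < 1e-4)  # → True
```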

  10. Fine-grained versus categorical: Pupil size differentiates between strategies for spatial working memory performance.

    Science.gov (United States)

    Starc, Martina; Anticevic, Alan; Repovš, Grega

    2017-05-01

    Pupillometry provides an accessible option to track working memory processes with high temporal resolution. Several studies showed that pupil size increases with the number of items held in working memory; however, no study has explored whether pupil size also reflects the quality of working memory representations. To address this question, we used a spatial working memory task to investigate the relationship of pupil size with spatial precision of responses and indicators of reliance on generalized spatial categories. We asked 30 participants (15 female, aged 19-31) to remember the position of targets presented at various locations along a hidden radial grid. After a delay, participants indicated the remembered location with a high-precision joystick, providing a parametric measure of trial-to-trial accuracy. We recorded participants' pupil dilations continuously during task performance. Results showed a significant relation between pupil dilation during preparation/early encoding and the precision of responses, possibly reflecting the attentional resources devoted to memory encoding. In contrast, pupil dilation at late maintenance and response predicted larger shifts of responses toward prototypical locations, possibly reflecting larger reliance on categorical representation. On an intraindividual level, smaller pupil dilations during encoding predicted larger dilations during late maintenance and response. On an interindividual level, participants relying more on categorical representation also produced larger precision errors. The results confirm the link between pupil size and the quality of spatial working memory representation. They suggest compensatory strategies of spatial working memory performance: loss of precise spatial representation likely increases reliance on generalized spatial categories. © 2017 Society for Psychophysiological Research.

  11. Trap array configuration influences estimates and precision of black bear density and abundance.

    Directory of Open Access Journals (Sweden)

    Clay M Wilton

    Full Text Available Spatial capture-recapture (SCR) models have advanced our ability to estimate population density for wide-ranging animals by explicitly incorporating individual movement. Though these models are more robust to various spatial sampling designs, few studies have empirically tested different large-scale trap configurations using SCR models. We investigated how the extent of trap coverage and trap spacing affects the precision and accuracy of SCR parameters, implementing models using the R package secr. We tested two trapping scenarios, one spatially extensive and one intensive, using black bear (Ursus americanus) DNA data from hair snare arrays in south-central Missouri, USA. We also examined the influence that adding a second, lower barbed-wire strand to snares had on the quantity and spatial distribution of detections. We simulated trapping data to test bias in density estimates of each configuration under a range of density and detection parameter values. Field data showed that using multiple arrays with intensive snare coverage produced more detections of more individuals than extensive coverage. Consequently, density and detection parameters were more precise for the intensive design. Density was estimated as 1.7 bears per 100 km² and was 5.5 times greater than that under extensive sampling. Abundance was 279 (95% CI = 193-406) bears in the 16,812 km² study area. Excluding detections from the lower strand resulted in the loss of 35 detections, 14 unique bears, and the largest recorded movement between snares. All simulations showed low bias for density under both configurations. Results demonstrated that in low-density populations with a non-uniform distribution of population density, optimizing the tradeoff among snare spacing, coverage, and sample size is of critical importance to estimating parameters with high precision and accuracy. With limited resources, allocating available traps to multiple arrays with intensive trap spacing increased the amount of

  12. Conditional sampling technique to test the applicability of the Taylor hypothesis for the large-scale coherent structures

    Science.gov (United States)

    Hussain, A. K. M. F.

    1980-01-01

    Comparisons of the distributions of large-scale structures in turbulent flow with distributions based on time-dependent signals from stationary probes and the Taylor hypothesis are presented. The study investigated an area in the near field of a 7.62 cm circular air jet at a Re of 32,000, in which coherent structures were induced through small-amplitude controlled excitation and stable vortex pairing in the jet column mode. Hot-wire and X-wire anemometry were employed to establish phase-averaged spatial distributions of longitudinal and lateral velocities, coherent Reynolds stress and vorticity, background turbulent intensities, streamlines and pseudo-stream functions. The Taylor hypothesis was used to calculate spatial distributions of the phase-averaged properties, with results indicating that use of the local time-average velocity or streamwise velocity produces large distortions.

  13. Precisely Tailored DNA Nanostructures and their Theranostic Applications.

    Science.gov (United States)

    Zhu, Bing; Wang, Lihua; Li, Jiang; Fan, Chunhai

    2017-12-01

    A critical challenge in nanotechnology is the limited precision and controllability of structural parameters, which raises concerns about uniformity, reproducibility and performance. Self-assembled DNA nanostructures, a newly emerged type of nano-biomaterial, possess low-nanometer precision, excellent programmability and addressability. They can precisely arrange various molecules and materials to form spatially ordered complexes with unambiguous physical or chemical properties. Because of these features, DNA nanostructures have shown great promise in numerous biomedical theranostic applications. In this account, we briefly review the history of and advances in the construction of DNA nanoarchitectures and superstructures with accurate structural parameters. We focus on recent progress in exploiting these DNA nanostructures as platforms for quantitative biosensing, intracellular diagnosis, imaging, and smart drug delivery. We also discuss key challenges in practical applications. © 2017 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Pressure from particle image velocimetry for convective flows: a Taylor’s hypothesis approach

    International Nuclear Information System (INIS)

    De Kat, R; Ganapathisubramani, B

    2013-01-01

    Taylor’s hypothesis is often applied in turbulent flow analysis to map temporal information into spatial information. Recent efforts in deriving pressure from particle image velocimetry (PIV) have proposed multiple approaches, each with its own strengths and weaknesses. Application of Taylor’s hypothesis allows us to counter the weaknesses of the Eulerian approach described by de Kat and van Oudheusden (2012 Exp. Fluids 52 1089–106). Two different approaches of using Taylor’s hypothesis in determining planar pressure are investigated: one where pressure is determined from volumetric PIV data and one where pressure is determined from time-resolved stereoscopic PIV data. A performance assessment on synthetic data shows that application of Taylor’s hypothesis can improve determination of pressure from PIV data significantly compared with a time-resolved volumetric approach. The technique is then applied to time-resolved PIV data taken in a cross-flow plane of a turbulent jet (Ganapathisubramani et al 2007 Exp. Fluids 42 923–39). Results appear to indicate that pressure can indeed be obtained from PIV data in turbulent convective flows using the Taylor’s hypothesis approach, where there are no other methods to determine pressure. The role of convection velocity in determination of pressure is also discussed. (paper)
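
    As a minimal numerical sketch of the mapping this abstract relies on (illustrative values, not the paper's data or method): Taylor's hypothesis treats turbulence as frozen while it convects at a velocity Uc, so spatial gradients can be recovered from a single-point time series via du/dx ≈ -(1/Uc) du/dt.

```python
import numpy as np

# Frozen wave u(x, t) = sin(k*(x - Uc*t)) seen by a stationary probe at x = 0.
# Taylor's hypothesis:  du/dx ≈ -(1/Uc) * du/dt.  All parameters illustrative.
Uc = 5.0                                    # convection velocity (m/s)
k = 2.0                                     # wavenumber (rad/m)
t = np.linspace(0.0, 2.0, 4001)             # probe time series
u_probe = np.sin(k * (0.0 - Uc * t))

dudt = np.gradient(u_probe, t)              # temporal derivative at the probe
dudx_taylor = -dudt / Uc                    # spatial derivative via Taylor
dudx_true = k * np.cos(k * (0.0 - Uc * t))  # analytic spatial derivative

print("max error:", np.max(np.abs(dudx_taylor - dudx_true)))
```

    For a perfectly frozen field the reconstruction error is set only by the finite-difference scheme; in real turbulence the choice of convection velocity introduces the distortions the abstract discusses.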

  15. Spatial memory and integration processes in congenital blindness.

    Science.gov (United States)

    Vecchi, Tomaso; Tinti, Carla; Cornoldi, Cesare

    2004-12-22

    The paper tests the hypothesis that the difficulties blind people meet in spatial processing are due to the simultaneous treatment of independent spatial representations. Results showed that lack of vision does not impede the ability to process and transform mental images; however, blind people are significantly poorer at recalling more than one spatial pattern at a time than at recalling the corresponding material integrated into a single pattern. It is concluded that the simultaneous maintenance of different spatial information is affected by congenital blindness, while cognitive processes that may involve sequential manipulation are not.

  16. Multineuronal Spike Sequences Repeat with Millisecond Precision

    Directory of Open Access Journals (Sweden)

    Koki eMatsumoto

    2013-06-01

    Full Text Available Cortical microcircuits are nonrandomly wired networks of neurons. As a natural consequence, the spikes emitted by microcircuits are also nonrandomly patterned in time and space. One prominent form of spike organization is the repetition of fixed patterns of spike series across multiple neurons. However, several questions remain unresolved, including how precisely spike sequences repeat, how the sequences are spatially organized, how many neurons participate in sequences, and how different sequences are functionally linked. To address these questions, we monitored spontaneous spikes of hippocampal CA3 neurons ex vivo using a high-speed functional multineuron calcium imaging technique that allowed us to monitor spikes with millisecond resolution and to record the locations of spiking and nonspiking neurons. Multineuronal spike sequences were overrepresented in spontaneous activity compared to the statistical chance level. Approximately 75% of neurons participated in at least one sequence during our observation period. The participants were sparsely dispersed and did not show a specific spatial organization. The number of sequences relative to the chance level decreased when larger time frames were used to detect sequences; thus, sequences were precise at the millisecond level. Sequences often shared common spikes with other sequences; parts of sequences were subsequently relayed by following sequences, generating complex chains of multiple sequences.
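
    The detection idea can be sketched in a few lines (hypothetical spike data, not from the study): discretize spike times into millisecond bins and count how often the same ordered multi-neuron pattern recurs above what random triplets produce.

```python
from collections import Counter
import random

random.seed(2)

def sequence_signature(events, bin_ms=1.0):
    """events: list of (t_ms, neuron_id); returns a hashable signature of
    the sequence relative to its first spike, at bin_ms resolution."""
    events = sorted(events)
    t0 = events[0][0]
    return tuple((round((t - t0) / bin_ms), nid) for t, nid in events)

# Build a spike train embedding one fixed 3-neuron motif (ids/times invented)
motif = [(0.0, 4), (2.0, 7), (5.0, 1)]
recordings = []
for rep in range(5):                      # 5 repeats of the motif
    start = rep * 100.0
    recordings.append([(start + t, nid) for t, nid in motif])
for _ in range(20):                       # plus random, non-repeating triplets
    start = random.uniform(0, 500)
    recordings.append([(start + random.uniform(0, 10), random.randrange(10))
                       for _ in range(3)])

counts = Counter(sequence_signature(ev) for ev in recordings)
sig, n_repeats = counts.most_common(1)[0]
print(sig, n_repeats)
```

    Widening bin_ms plays the role of the "larger time frames" in the abstract: coarser bins merge distinct patterns and change the repeat counts relative to chance.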

  17. Contacting nanowires and nanotubes with atomic precision for electronic transport

    KAUST Repository

    Qin, Shengyong; Hellstrom, Sondra; Bao, Zhenan; Boyanov, Boyan; Li, An-Ping

    2012-01-01

    Making contacts to nanostructures with atomic precision is an important process in the bottom-up fabrication and characterization of electronic nanodevices. Existing contacting techniques use top-down lithography and chemical etching, but lack atomic precision and introduce the possibility of contamination. Here, we report that a field-induced emission process can be used to make local contacts onto individual nanowires and nanotubes with atomic spatial precision. Gold nano-islands are deposited precisely onto nanostructures using a scanning tunneling microscope tip, which provides a clean and controllable method to ensure contacts that are both electrically conductive and mechanically reliable. To demonstrate the wide applicability of the technique, nano-contacts are fabricated on silicide atomic wires, carbon nanotubes, and copper nanowires. Electrical transport measurements are performed in situ by utilizing the nanocontacts to bridge the nanostructures to the transport probes. © 2012 American Institute of Physics.

  18. Data-driven inference for the spatial scan statistic.

    Science.gov (United States)

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

    Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. The statistical significance of a cluster is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not made evenly across all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is proposed, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed case map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, since it bears on the correctness of the decision based on this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
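
    The size-conditioned inference can be sketched numerically. Below is a toy one-dimensional scan (illustrative counts and window scheme, not the authors' implementation): the modified p-value compares the observed statistic only against null replications whose most likely cluster has the same size k, whereas the usual p-value pools all null replications.

```python
import numpy as np

rng = np.random.default_rng(0)
n_areas, baseline = 20, 10.0
expected = np.full(n_areas, baseline)

def scan(cases, expected):
    """Max Poisson log-likelihood ratio over contiguous windows,
    returned together with the size of the most likely cluster."""
    C, E = cases.sum(), expected.sum()
    best_llr, best_k = 0.0, 0
    n = len(cases)
    for start in range(n):
        for size in range(1, n // 2 + 1):
            idx = [(start + j) % n for j in range(size)]
            cz, ez = cases[idx].sum(), expected[idx].sum()
            if cz == 0 or cz / ez <= (C - cz) / (E - ez):
                continue  # keep only windows with excess risk inside
            llr = (cz * np.log(cz / ez)
                   + (C - cz) * np.log((C - cz) / (E - ez))
                   - C * np.log(C / E))
            if llr > best_llr:
                best_llr, best_k = llr, size
    return best_llr, best_k

cases = rng.poisson(expected)
cases[5:8] += 8                       # inject a 3-area cluster

obs_llr, obs_k = scan(cases, expected)

# Null replications conditioned on the observed total case count
null = [scan(rng.multinomial(cases.sum(), expected / expected.sum()), expected)
        for _ in range(200)]

p_usual = np.mean([llr >= obs_llr for llr, _ in null])      # pooled
same_k = [llr for llr, k in null if k == obs_k]
p_cond = np.mean([llr >= obs_llr for llr in same_k]) if same_k else 0.0
print(obs_k, p_usual, p_cond)
```

    The two p-values diverge most when, as the abstract notes, the pooled p-value sits near the alpha level, because the null distribution of the maximum statistic differs across cluster sizes.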

  19. THE FRACTAL MARKET HYPOTHESIS

    Directory of Open Access Journals (Sweden)

    FELICIA RAMONA BIRAU

    2012-05-01

    Full Text Available In this article, the concept of the capital market is analysed using the Fractal Market Hypothesis, a modern, complex and unconventional alternative to classical finance methods. The Fractal Market Hypothesis stands in sharp opposition to the Efficient Market Hypothesis and explores the application of chaos theory and fractal geometry to finance. The Fractal Market Hypothesis is based on certain assumptions: it emphasizes that investors do not react immediately to the information they receive and that the manner in which they interpret that information may differ. The Fractal Market Hypothesis also addresses the way that liquidity and investment horizons influence the behaviour of financial investors.

  1. Evaluation of 7Be fallout spatial variability

    International Nuclear Information System (INIS)

    Pinto, Victor Meriguetti

    2011-01-01

    The cosmogenic radionuclide beryllium-7 (7Be) is produced in the atmosphere by cosmic particle reactions and is used as a tracer in soil erosion and climate research. After production, 7Be bonds to aerosol particles in the atmosphere and is deposited on the soil surface, along with other radionuclide species, by rainfall. Because of its high adsorption on soil particles and its short half-life of 53.2 days, this radionuclide follows the erosion process and can be used as a tracer to evaluate the sediment transport that occurs during a single rain event or a short period of rain events. A key assumption for evaluating erosion with this radiotracer is the uniformity of the spatial distribution of the 7Be fallout. The 7Be method was elaborated recently and, because of its few applications, some assumptions related to the method have not yet been properly investigated; the hypothesis of 7Be fallout uniformity needs to be evaluated. The aim of this study was to evaluate the 7Be fallout spatial distribution through analysis of the 7Be activity in the rain water of the first five millimeters of single rain events. The rain water was sampled using twelve collectors distributed over an experimental area of about 300 m², located on the campus of Sao Paulo University, Piracicaba. The 7Be activities were measured using a 53% efficiency gamma-ray spectrometer at the Radioisotope laboratory of CENA. The 7Be activities in rain water varied from 0.26 to 1.81 Bq L⁻¹, with the highest values in summer and the lowest in spring. In each of the 5 single events, the spatial variability of 7Be activity in rain water was high, showing the high randomness of the fallout spatial distribution. A simulation using the 7Be spatial variability values obtained here and 7Be average reference inventories taken from the literature was performed to determine the lowest erosion rate detectable by the 7Be model.
The importance of taking a representative number of samples to

  2. Learning-Related Changes in Adolescents' Neural Networks during Hypothesis-Generating and Hypothesis-Understanding Training

    Science.gov (United States)

    Lee, Jun-Ki; Kwon, Yongju

    2012-01-01

    Fourteen science high school students participated in this study, which investigated neural-network plasticity associated with hypothesis-generating and hypothesis-understanding in learning. The students were divided into two groups and participated in either hypothesis-generating or hypothesis-understanding type learning programs, which were…

  3. Precision bounds for gradient magnetometry with atomic ensembles

    Science.gov (United States)

    Apellaniz, Iagoba; Urizar-Lanz, Iñigo; Zimborás, Zoltán; Hyllus, Philipp; Tóth, Géza

    2018-05-01

    We study gradient magnetometry with an ensemble of atoms with arbitrary spin. We calculate precision bounds for estimating the gradient of the magnetic field based on the quantum Fisher information. For quantum states that are invariant under homogeneous magnetic fields, we need to measure a single observable to estimate the gradient. On the other hand, for states that are sensitive to homogeneous fields, a simultaneous measurement is needed, as the homogeneous field must also be estimated. We prove that for the cases studied in this paper, such a measurement is feasible. We present a method to calculate precision bounds for gradient estimation with a chain of atoms or with two spatially separated atomic ensembles. We also consider a single atomic ensemble with an arbitrary density profile, where the atoms cannot be addressed individually, and which is a very relevant case for experiments. Our model can take into account even correlations between particle positions. While in most of the discussion we consider an ensemble of localized particles that are classical with respect to their spatial degree of freedom, we also discuss the case of gradient metrology with a single Bose-Einstein condensate.

  4. A novel hypothesis splitting method implementation for multi-hypothesis filters

    DEFF Research Database (Denmark)

    Bayramoglu, Enis; Ravn, Ole; Andersen, Nils Axel

    2013-01-01

    The paper presents a multi-hypothesis filter library featuring a novel method for splitting Gaussians into ones with smaller variances. The library is written in C++ for high performance and the source code is open and free. The multi-hypothesis filters commonly approximate the distribution tran...

  5. Utilization of special computerized tomography and nuclear medicine techniques for quality control and for the optimization of combined precision chemotherapy and precision radiation therapy

    International Nuclear Information System (INIS)

    Wiley, A.L. Jr.; Wirtanen, G.W.; Chien, I.-C.

    1984-01-01

    A combination of precision (selective, intra-arterial) chemotherapy and precision radiotherapy can be used for advanced pancreatic, biliary tract, and sarcomatous malignancies. There were some remarkable responses, but also a few poor responses and even some morbidity. Accordingly, methods were developed for pre-selecting those patients whose tumors are likely to respond to such therapy, as well as methods for improving the therapeutic ratio by the rational optimization of combined therapy. Specifically, clinical tumor blood flow characteristics (monitored with nuclear medicine techniques) may provide useful criteria for such selection. The authors also qualitatively evaluate the drug distribution or exposure space with specialized color-coded computerized tomography images, which demonstrate spatially dependent enhancement of intra-arterial contrast in tumor and in adjacent normal tissues. Such clinical data can improve the quality control aspects of intra-arterial chemotherapy administration, as well as the possibility of achieving a significant therapeutic ratio by the integration of precision chemotherapy and precision radiation therapy. (Auth.)

  6. Novel encoding and updating of positional, or directional, spatial cues are processed by distinct hippocampal subfields: Evidence for parallel information processing and the "what" stream.

    Science.gov (United States)

    Hoang, Thu-Huong; Aliane, Verena; Manahan-Vaughan, Denise

    2018-05-01

    The specific roles of hippocampal subfields in spatial information processing and encoding are, as yet, unclear. The parallel map theory postulates that whereas the CA1 processes discrete environmental features (positional cues used to generate a "sketch map"), the dentate gyrus (DG) processes large navigation-relevant landmarks (directional cues used to generate a "bearing map"). Additionally, the two-streams hypothesis suggests that hippocampal subfields engage in differentiated processing of information from the "where" and the "what" streams. We investigated these hypotheses by analyzing the effect of exploration of discrete "positional" features and large "directional" spatial landmarks on hippocampal neuronal activity in rats. As an indicator of neuronal activity we measured the mRNA induction of the immediate early genes (IEGs), Arc and Homer1a. We observed an increase of this IEG mRNA in CA1 neurons of the distal neuronal compartment and in proximal CA3, after novel spatial exploration of discrete positional cues, whereas novel exploration of directional cues led to increases in IEG mRNA in the lower blade of the DG and in proximal CA3. Strikingly, the CA1 did not respond to directional cues and the DG did not respond to positional cues. Our data provide evidence for both the parallel map theory and the two-streams hypothesis and suggest that a precise compartmentalization of the encoding and processing of "what" and "where" information occurs within the hippocampal subfields. © 2018 The Authors. Hippocampus Published by Wiley Periodicals, Inc.

  7. Precision Attitude Control for the BETTII Balloon-Borne Interferometer

    Science.gov (United States)

    Benford, Dominic J.; Fixsen, Dale J.; Rinehart, Stephen

    2012-01-01

    The Balloon Experimental Twin Telescope for Infrared Interferometry (BETTII) is an 8-meter baseline far-infrared interferometer to fly on a high-altitude balloon. Operating at wavelengths of 30-90 microns, BETTII will obtain spatial and spectral information on science targets at angular resolutions down to less than half an arcsecond, a capability unmatched by other far-infrared facilities. This requires attitude control at a level of less than a tenth of an arcsecond, a great challenge for a lightweight balloon-borne system. We have designed a precision attitude determination system to provide gondola attitude knowledge at a level of 2 milliarcseconds at rates up to 100 Hz, with accurate absolute attitude determination at the half-arcsecond level at rates of up to 10 Hz. A multi-stage control system involving rigid body motion and tip-tilt-piston correction provides the pointing stability required for the far-infrared instrument to perform its spatial/spectral interferometry in open-loop control. We present key aspects of the design of the attitude determination and control system and its development status.

  8. Variability: A Pernicious Hypothesis.

    Science.gov (United States)

    Noddings, Nel

    1992-01-01

    The hypothesis of greater male variability in test results is discussed in its historical context, and reasons feminists have objected to the hypothesis are considered. The hypothesis acquires political importance if it is considered that variability results from biological, rather than cultural, differences. (SLD)

  9. Role of spatial averaging in multicellular gradient sensing.

    Science.gov (United States)

    Smith, Tyler; Fancher, Sean; Levchenko, Andre; Nemenman, Ilya; Mugler, Andrew

    2016-05-20

    Gradient sensing underlies important biological processes including morphogenesis, polarization, and cell migration. The precision of gradient sensing increases with the length of a detector (a cell or group of cells) in the gradient direction, since a longer detector spans a larger range of concentration values. Intuition from studies of concentration sensing suggests that precision should also increase with detector length in the direction transverse to the gradient, since then spatial averaging should reduce the noise. However, here we show that, unlike for concentration sensing, the precision of gradient sensing decreases with transverse length for the simplest gradient sensing model, local excitation-global inhibition. The reason is that gradient sensing ultimately relies on a subtraction of measured concentration values. While spatial averaging indeed reduces the noise in these measurements, which increases precision, it also reduces the covariance between the measurements, which results in the net decrease in precision. We demonstrate how a recently introduced gradient sensing mechanism, regional excitation-global inhibition (REGI), overcomes this effect and recovers the benefit of transverse averaging. Using a REGI-based model, we compute the optimal two- and three-dimensional detector shapes, and argue that they are consistent with the shapes of naturally occurring gradient-sensing cell populations.
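
    The subtraction argument in this abstract can be checked numerically. In the toy model below (illustrative numbers, not the authors' model), front and back readings share a common noise term; the identity Var(f - b) = Var(f) + Var(b) - 2*Cov(f, b) shows that the precision of the difference depends on the covariance between the readings, not just on their individual variances.

```python
import numpy as np

rng = np.random.default_rng(1)

# A gradient estimate is a difference of two noisy concentration readings.
# Model: front/back readings share a common fluctuation ("shared") plus
# smaller independent noise. All parameter values are illustrative.
n = 200_000
g = 1.0                                   # true front-back difference
shared = rng.normal(0.0, 1.0, n)          # noise common to both readings
f = g / 2 + shared + rng.normal(0.0, 0.3, n)
b = -g / 2 + shared + rng.normal(0.0, 0.3, n)

var_pred = f.var() + b.var() - 2 * np.cov(f, b)[0, 1]
var_obs = (f - b).var()

# The shared noise cancels in the subtraction, so the difference is far
# more precise than either reading alone; any averaging that destroys
# this covariance degrades the gradient estimate even as it lowers the
# single-reading variances.
print(var_obs, var_pred, f.var())
```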

  10. Physiopathological Hypothesis of Cellulite

    Science.gov (United States)

    de Godoy, José Maria Pereira; de Godoy, Maria de Fátima Guerreiro

    2009-01-01

    A series of questions is asked concerning this condition, including questions about its name, the consensus on the histopathological findings, the physiological hypothesis, and the treatment of the disease. We established a hypothesis for cellulite and confirmed that the clinical response is compatible with this hypothesis. Hence this novel approach brings a modern physiological concept with a physiopathologic basis and clinical proof of the hypothesis. We emphasize that the choice of patient, correct diagnosis of cellulite and the technique employed are fundamental to success. PMID:19756187

  11. Defending the Decimals: Why Foolishly False Precision Might Strengthen Social Science

    Directory of Open Access Journals (Sweden)

    Jeremy Freese

    2014-12-01

    Full Text Available Social scientists often report regression coefficients using more significant figures than are meaningful given measurement precision and sample size. Common sense says we should not do this. Yet, as normative practice, eliminating these extra digits introduces a more serious scientific problem when accompanied by other ascendant reporting practices intended to reduce social science’s long-standing emphasis on null hypothesis significance testing. Coefficient p-values can no longer be recovered to the degree of precision that p-values have been abundantly demonstrated to influence actual research practice. Developing methods for detecting and addressing systematically exaggerated effect sizes across collections of studies cannot be done effectively if p-values are hidden. Regarding what is preferable for scientific literature versus an individual study, the costs of false precision are therefore innocuous compared to alternatives that either encourage the continuation of practices known to exaggerate causal effects or thwart assessment of how much such exaggeration occurs.
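
    The recoverability point can be made concrete with a hypothetical example (the numbers are invented for illustration): under a normal approximation, a coefficient and its standard error determine the two-sided p-value, and rounding the coefficient to fewer digits can move the recovered p-value across the 0.05 line.

```python
import math

def two_sided_p(beta, se):
    """Two-sided p-value for a z-test of beta/se (normal approximation)."""
    z = abs(beta / se)
    return math.erfc(z / math.sqrt(2))

# Hypothetical regression output: with full digits the p-value is
# recoverable from the coefficient and its standard error...
beta, se = 0.1372, 0.0700
p_full = two_sided_p(beta, se)

# ...but after rounding the coefficient, the recovered p-value lands on
# the other side of the 0.05 threshold.
p_rounded = two_sided_p(round(beta, 1), se)  # 0.1 instead of 0.1372

print(f"full: p = {p_full:.4f}, rounded: p = {p_rounded:.4f}")
```

    This is the asymmetry the article highlights: extra digits look like false precision in isolation, but they are exactly what allows p-values, and hence patterns of exaggerated effect sizes, to be reconstructed across a literature.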

  12. The roles of categorical and coordinate spatial relations in recognizing buildings.

    Science.gov (United States)

    Palermo, Liana; Piccardi, Laura; Nori, Raffaella; Giusberti, Fiorella; Guariglia, Cecilia

    2012-11-01

    Categorical spatial information is considered more useful for recognizing objects, and coordinate spatial information for guiding actions, for example during navigation or grasping. In contrast with this assumption, we hypothesized that buildings, unlike other categories of objects, require both categorical and coordinate spatial information in order to be recognized. This hypothesis arose from evidence that right-brain-damaged patients have deficits in both coordinate judgments and the recognition of buildings, and from the fact that buildings are very useful for guiding navigation in urban environments. To test this hypothesis, we assessed 210 healthy college students while they performed four different tasks that required categorical and coordinate judgments and the recognition of common objects and buildings. Our results showed that both categorical and coordinate spatial representations are necessary to recognize a building, whereas only categorical representations are necessary to recognize an object. We discuss our data in view of a recent neural framework for visuospatial processing, suggesting that recognizing buildings may specifically activate the parieto-medial-temporal pathway.

  13. [Value of the space perception test for evaluation of the aptitude for precision work in geodesy].

    Science.gov (United States)

    Remlein-Mozolewska, G

    1982-01-01

    The visual spatial localization ability of geodesy and cartography employees, and of pupils training for that profession, was examined. The examination was based on the duration of work and the time of its performance. A correlation between localization ability and the precision of the hand movements required in everyday work was demonstrated: the better the movement precision, the more efficient the visual spatial localization. Length of employment was not significant. The test proved highly useful in geodesy for qualifying workers for posts requiring good manual efficiency.

  14. Spatial variability of macrobenthic zonation on exposed sandy beaches

    Science.gov (United States)

    Veiga, Puri; Rubal, Marcos; Cacabelos, Eva; Maldonado, Cristina; Sousa-Pinto, Isabel

    2014-07-01

    We analysed the consistency of vertical distribution patterns (i.e. zonation) of macrofauna at different spatial scales on four intermediate exposed beaches in the North of Portugal. We tested the hypothesis that biological zonation on exposed sandy beaches would vary at the studied spatial scales. To this end, the abundance, diversity and structure of macrobenthic assemblages were examined at the scales of transect and beach. Moreover, the main environmental factors that could potentially drive zonation patterns were investigated. Univariate and multivariate analyses revealed that the number of biological zones ranged from two to three depending on the beach, and from indistinct zonation to three zones at the scale of transect. Therefore, the results support our working hypothesis, because zonation patterns were not consistent across the studied spatial scales. Median particle size, sorting coefficient and water content were significantly correlated with the zonation patterns of macrobenthic assemblages. However, a high degree of correlation was not reached when the total structure of the assemblage was considered.

  15. Effects of Hand Proximity and Movement Direction in Spatial and Temporal Gap Discrimination.

    Science.gov (United States)

    Wiemers, Michael; Fischer, Martin H

    2016-01-01

    Previous research on the interplay between static manual postures and visual attention revealed enhanced visual selection near the hands (near-hand effect). During active movements there is also superior visual performance when moving toward, compared to away from, the stimulus (direction effect). The "modulated visual pathways" hypothesis argues that differential involvement of the magno- and parvocellular visual processing streams causes the near-hand effect. The key finding supporting this hypothesis is an increase in temporal and a reduction in spatial processing in near-hand space (Gozli et al., 2012). Since this hypothesis has, so far, only been tested with static hand postures, we provide a conceptual replication of Gozli et al.'s (2012) result with moving hands, thus also probing the generality of the direction effect. Participants performed temporal or spatial gap discriminations while their right hand was moving below the display. In contrast to Gozli et al. (2012), temporal gap discrimination was superior at intermediate rather than near hand proximity. In spatial gap discrimination, a direction effect without a hand-proximity effect suggests that pragmatic attentional maps overshadowed temporal/spatial processing biases for far/near-hand space.

  16. Precision manufacturing

    CERN Document Server

    Dornfeld, David

    2008-01-01

    Today there is a high demand for high-precision products. The manufacturing processes are now highly sophisticated and derive from a specialized genre called precision engineering. Precision Manufacturing provides an introduction to precision engineering and manufacturing with an emphasis on the design and performance of precision machines and machine tools, metrology, tooling elements, machine structures, sources of error, precision machining processes and precision process planning. It also discusses the critical role that precision machine design for manufacturing has played in technological developments over the last few hundred years. In addition, the influence of sustainable manufacturing requirements on precision processes is introduced. Drawing upon years of practical experience and using numerous examples and illustrative applications, David Dornfeld and Dae-Eun Lee cover precision manufacturing as it applies to: The importance of measurement and metrology in the context of Precision Manufacturing. Th...

  17. Micro-precise spatiotemporal delivery system embedded in 3D printing for complex tissue regeneration.

    Science.gov (United States)

    Tarafder, Solaiman; Koch, Alia; Jun, Yena; Chou, Conrad; Awadallah, Mary R; Lee, Chang H

    2016-04-25

Three dimensional (3D) printing has emerged as an efficient tool for tissue engineering and regenerative medicine, given its advantages for constructing custom-designed scaffolds with tunable microstructure/physical properties. Here we developed a micro-precise spatiotemporal delivery system embedded in 3D printed scaffolds. PLGA microspheres (μS) were encapsulated with growth factors (GFs) and then embedded inside PCL microfibers that constitute custom-designed 3D scaffolds. Given the substantial difference in the melting points between PLGA and PCL and their low heat conductivity, μS were able to maintain their original structure while protecting the GFs' bioactivities. Micro-precise spatial control of multiple GFs was achieved by interchanging dispensing cartridges during a single printing process. Spatially controlled delivery of GFs, with a prolonged release, guided the formation of multi-tissue interfaces from bone marrow derived mesenchymal stem/progenitor cells (MSCs). To investigate the efficacy of the micro-precise delivery system embedded in 3D printed scaffolds, temporomandibular joint (TMJ) disc scaffolds were fabricated with micro-precise spatiotemporal delivery of CTGF and TGFβ3, mimicking native-like multiphase fibrocartilage. In vitro, TMJ disc scaffolds spatially embedded with CTGF/TGFβ3-μS resulted in formation of multiphase fibrocartilaginous tissues from MSCs. In vivo, TMJ disc perforation was performed in rabbits, followed by implantation of CTGF/TGFβ3-μS-embedded scaffolds. After 4 wks, CTGF/TGFβ3-μS embedded scaffolds significantly improved healing of the perforated TMJ disc as compared to the degenerated TMJ disc in the control group with scaffolds embedded with empty μS. In addition, CTGF/TGFβ3-μS embedded scaffolds significantly prevented arthritic changes on TMJ condyles. In conclusion, our micro-precise spatiotemporal delivery system embedded in 3D printing may serve as an efficient tool to regenerate complex and inhomogeneous tissues.

  18. Spatial Outlier Detection of CO2 Monitoring Data Based on Spatial Local Outlier Factor

    Directory of Open Access Journals (Sweden)

    Liu Xin

    2015-12-01

Full Text Available The spatial local outlier factor (SLOF) algorithm was adopted in this study for spatial outlier detection because of the limitations of traditional static threshold detection. Based on the spatial characteristics of CO2 monitoring data obtained in the carbon capture and storage (CCS) project, a K-Nearest Neighbour (KNN) graph was constructed from the latitude and longitude of the monitoring points to identify each point's spatial neighbourhood. SLOF was then adopted to calculate the outlier degrees of the monitoring points, and the 3σ rule was employed to identify the spatial outliers. Finally, the selection of the K value was analysed and the optimal one was selected. The results show that, compared with the static threshold method, the proposed algorithm has a higher detection precision. It can overcome the shortcomings of the static threshold method and improve the accuracy and diversity of local outlier detection, which provides a reliable reference for the safety assessment and warning of CCS monitoring.
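The pipeline this abstract describes (a KNN spatial neighbourhood, a per-point outlier degree, and a 3σ cut) can be sketched in a few lines. This is an illustrative simplification rather than the authors' code: the outlier degree below is each point's deviation from its neighbourhood mean, a stand-in for the full SLOF score, and the toy CO2 readings are invented.

```python
import math
from statistics import mean, stdev

def knn_indices(coords, i, k):
    """Indices of the k nearest neighbours of point i (Euclidean on coordinates)."""
    order = sorted(range(len(coords)), key=lambda j: math.dist(coords[i], coords[j]))
    return [j for j in order if j != i][:k]

def spatial_outlier_degrees(coords, values, k):
    """Outlier degree per monitoring point: absolute deviation of its reading
    from the mean of its k spatial neighbours (simplified stand-in for SLOF)."""
    return [abs(values[i] - mean(values[j] for j in knn_indices(coords, i, k)))
            for i in range(len(values))]

def three_sigma_outliers(degrees):
    """3-sigma rule: flag points whose degree exceeds mean + 3 * stdev."""
    mu, sigma = mean(degrees), stdev(degrees)
    return [i for i, d in enumerate(degrees) if d > mu + 3 * sigma]

# Invented toy data: CO2 readings on a 5 x 5 monitoring grid with a gentle
# spatial trend; the point at grid position (2, 2) carries an injected spike.
coords = [(x, y) for x in range(5) for y in range(5)]
values = [400.0 + 0.1 * (x + y) for x, y in coords]
values[12] += 50.0
print(three_sigma_outliers(spatial_outlier_degrees(coords, values, k=4)))  # → [12]
```

Note that the neighbours of the spiked point also acquire elevated degrees (their neighbourhood mean includes the spike), but only the spike itself clears the 3σ threshold.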

  19. Data-driven inference for the spatial scan statistic

    Directory of Open Access Journals (Sweden)

    Duczmal Luiz H

    2011-08-01

Full Text Available Abstract Background Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as shown in this work, this adjustment is not applied evenly across all possible cluster sizes. Results A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is given, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed case map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, as it bears on the correctness of the decision based on this inference. Conclusions A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
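The modified inference can be illustrated with a toy one-dimensional scan (not Kulldorff's software): simulate null case maps, record each replicate's most likely cluster and its size, and compute the p-value against only those null replicates whose most likely cluster has the same size k as the observed one. The contiguous-window shapes, multinomial null, and equal area populations are simplifying assumptions.

```python
import math
import random

def scan_llr(cases, start, width):
    """Kulldorff-style Poisson log-likelihood ratio for one candidate window."""
    C, n = sum(cases), len(cases)
    c = sum(cases[start:start + width])
    e = C * width / n                    # expected cases under the null
    if c <= e or c == C:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def most_likely_cluster(cases):
    """(LLR, size) of the best contiguous window on a 1-D area map."""
    n = len(cases)
    return max((scan_llr(cases, s, w), w)
               for w in range(1, n) for s in range(n - w + 1))

def conditional_p_value(cases, reps=2000, seed=1):
    """Modified inference: compare the observed LLR only against null
    replicates whose most likely cluster has the same size k."""
    rng = random.Random(seed)
    obs_llr, k = most_likely_cluster(cases)
    C, n = sum(cases), len(cases)
    same_k_llrs = []
    for _ in range(reps):
        null = [0] * n
        for _ in range(C):               # spread the C cases uniformly
            null[rng.randrange(n)] += 1
        llr, size = most_likely_cluster(null)
        if size == k:
            same_k_llrs.append(llr)
    exceed = sum(llr >= obs_llr for llr in same_k_llrs)
    return (exceed + 1) / (len(same_k_llrs) + 1)

# Toy map: 20 areas with a hot spot in areas 5-8.
cases = [2] * 20
for i in range(5, 9):
    cases[i] = 8
print(conditional_p_value(cases) < 0.05)  # → True
```

The unconditional Monte Carlo p-value would pool replicates of every cluster size; conditioning on k restricts the reference distribution exactly as the modified inference question demands.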

  20. Transitions between central and peripheral vision create spatial/temporal distortions: a hypothesis concerning the perceived break of the curveball.

    Directory of Open Access Journals (Sweden)

    Arthur Shapiro

    2010-10-01

Full Text Available The human visual system does not treat all parts of an image equally: the central segments of an image, which fall on the fovea, are processed with a higher resolution than the segments that fall in the visual periphery. Even though the differences between foveal and peripheral resolution are large, these differences do not usually disrupt our perception of seamless visual space. Here we examine a motion stimulus in which the shift from foveal to peripheral viewing creates a dramatic spatial/temporal discontinuity. The stimulus consists of a descending disk (global motion) with an internal moving grating (local motion). When observers view the disk centrally, they perceive both global and local motion (i.e., observers see the disk's vertical descent and the internal spinning). When observers view the disk peripherally, the internal portion appears stationary, and the disk appears to descend at an angle. The angle of perceived descent increases as the observer views the stimulus from further in the periphery. We examine the first- and second-order information content in the display with the use of a three-dimensional Fourier analysis and show how our results can be used to describe perceived spatial/temporal discontinuities in real-world situations. The perceived shift of the disk's direction in the periphery is consistent with a model in which foveal processing separates first- and second-order motion information while peripheral processing integrates first- and second-order motion information. We argue that the perceived distortion may influence real-world visual observations. To this end, we present a hypothesis and analysis of the perception of the curveball and rising fastball in the sport of baseball. The curveball is a physically measurable phenomenon: the imbalance of forces created by the ball's spin causes the ball to deviate from a straight line and to follow a smooth parabolic path. However, the curveball is also a perceptual puzzle

  1. Transitions between central and peripheral vision create spatial/temporal distortions: a hypothesis concerning the perceived break of the curveball.

    Science.gov (United States)

    Shapiro, Arthur; Lu, Zhong-Lin; Huang, Chang-Bing; Knight, Emily; Ennis, Robert

    2010-10-13

    The human visual system does not treat all parts of an image equally: the central segments of an image, which fall on the fovea, are processed with a higher resolution than the segments that fall in the visual periphery. Even though the differences between foveal and peripheral resolution are large, these differences do not usually disrupt our perception of seamless visual space. Here we examine a motion stimulus in which the shift from foveal to peripheral viewing creates a dramatic spatial/temporal discontinuity. The stimulus consists of a descending disk (global motion) with an internal moving grating (local motion). When observers view the disk centrally, they perceive both global and local motion (i.e., observers see the disk's vertical descent and the internal spinning). When observers view the disk peripherally, the internal portion appears stationary, and the disk appears to descend at an angle. The angle of perceived descent increases as the observer views the stimulus from further in the periphery. We examine the first- and second-order information content in the display with the use of a three-dimensional Fourier analysis and show how our results can be used to describe perceived spatial/temporal discontinuities in real-world situations. The perceived shift of the disk's direction in the periphery is consistent with a model in which foveal processing separates first- and second-order motion information while peripheral processing integrates first- and second-order motion information. We argue that the perceived distortion may influence real-world visual observations. To this end, we present a hypothesis and analysis of the perception of the curveball and rising fastball in the sport of baseball. The curveball is a physically measurable phenomenon: the imbalance of forces created by the ball's spin causes the ball to deviate from a straight line and to follow a smooth parabolic path. 
However, the curveball is also a perceptual puzzle because batters often

  2. Measurement of turbulent spatial structure and kinetic energy spectrum by exact temporal-to-spatial mapping

    Science.gov (United States)

    Buchhave, Preben; Velte, Clara M.

    2017-08-01

    We present a method for converting a time record of turbulent velocity measured at a point in a flow to a spatial velocity record consisting of consecutive convection elements. The spatial record allows computation of dynamic statistical moments such as turbulent kinetic wavenumber spectra and spatial structure functions in a way that completely bypasses the need for Taylor's hypothesis. The spatial statistics agree with the classical counterparts, such as the total kinetic energy spectrum, at least for spatial extents up to the Taylor microscale. The requirements for applying the method are access to the instantaneous velocity magnitude, in addition to the desired flow quantity, and a high temporal resolution in comparison to the relevant time scales of the flow. We map, without distortion and bias, notoriously difficult developing turbulent high intensity flows using three main aspects that distinguish these measurements from previous work in the field: (1) The measurements are conducted using laser Doppler anemometry and are therefore not contaminated by directional ambiguity (in contrast to, e.g., frequently employed hot-wire anemometers); (2) the measurement data are extracted using a correctly and transparently functioning processor and are analysed using methods derived from first principles to provide unbiased estimates of the velocity statistics; (3) the exact mapping proposed herein has been applied to the high turbulence intensity flows investigated to avoid the significant distortions caused by Taylor's hypothesis. The method is first confirmed to produce the correct statistics using computer simulations and later applied to measurements in some of the most difficult regions of a round turbulent jet—the non-equilibrium developing region and the outermost parts of the developed jet. The proposed mapping is successfully validated using corresponding directly measured spatial statistics in the fully developed jet, even in the difficult outer regions of
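The core of the mapping, accumulating convection elements of length |u|·Δt to turn a time record into a spatial record, can be sketched as follows. This illustrates the idea only, not the authors' processor; in particular, resampling the unevenly spaced record onto a uniform spatial grid before computing wavenumber spectra is omitted.

```python
def temporal_to_spatial(u, dt):
    """Map a time record u[i] of velocities to (position, velocity) samples,
    advancing the spatial coordinate by the instantaneous convection
    distance |u|*dt rather than by a constant mean velocity."""
    s, record = 0.0, []
    for ui in u:
        s += abs(ui) * dt                # length of this convection element
        record.append((s, ui))
    return record

# For a constant-velocity record the mapping reduces to Taylor's x = U*t ...
positions = [pos for pos, _ in temporal_to_spatial([2.0, 2.0, 2.0, 2.0], dt=0.5)]
print(positions)  # → [1.0, 2.0, 3.0, 4.0]

# ... but a fluctuating record is locally stretched and compressed, which is
# what the exact mapping captures and Taylor's hypothesis discards.
print([pos for pos, _ in temporal_to_spatial([1.0, 3.0], dt=0.5)])  # → [0.5, 2.0]
```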

  3. Commissioning and proof of functionality of the OPERA precision tracker, especially of the time measuring system; Inbetriebnahme und Funktionsnachweis des OPERA Precision Trackers insbesondere des Zeitmesssystems

    Energy Technology Data Exchange (ETDEWEB)

    Janutta, Benjamin

    2008-10-15

The commissioning and the proof of functionality of the Precision Tracker of the OPERA experiment are the subject of this thesis, with the timing system of the precision tracker of major concern. First, the time resolution of the timing electronics was characterized and general running parameters were studied; afterwards the installation and commissioning were carried out. The precision tracker is supposed to determine the momentum of through-going muons with an accuracy of Δp/p<0.25 as well as the sign of their charge. The commissioning is now finished, and it was shown that the data acquisition system runs very reliably, with only 1.5% showing a slightly higher number of hits. The nominal spatial track resolution of σ<600 μm was also reached. (orig.)

  4. Critiques of the seismic hypothesis and the vegetation stabilization hypothesis for the formation of Mima mounds along the western coast of the U.S.

    Science.gov (United States)

    Gabet, Emmanuel J.; Burnham, Jennifer L. Horwath; Perron, J. Taylor

    2016-09-01

    A recent paper published in Geomorphology by Gabet et al. (2014) presents the results of a numerical model supporting the hypothesis that burrowing mammals build Mima mounds - small, densely packed hillocks found primarily in the western United States. The model is based on field observations and produces realistic-looking mounds with spatial distributions similar to real moundfields. Alternative explanations have been proposed for these Mima mounds, including formation by seismic shaking and vegetation-controlled erosion and deposition. In this short communication, we present observations from moundfields in the coastal states of the western U.S. that are incompatible with these alternative theories.

  5. [Precision nutrition in the era of precision medicine].

    Science.gov (United States)

    Chen, P Z; Wang, H

    2016-12-06

Precision medicine has been increasingly incorporated into clinical practice and is enabling a new era for disease prevention and treatment. As an important constituent of precision medicine, precision nutrition has also been drawing more attention during physical examinations. The main aim of precision nutrition is to provide safe and efficient intervention methods for disease treatment and management, through fully considering the genetics, lifestyle (dietary, exercise and lifestyle choices), metabolic status, gut microbiota and physiological status (nutrient level and disease status) of individuals. Three major components should be considered in precision nutrition, including individual criteria for sufficient nutritional status, biomarker monitoring or techniques for nutrient detection and the applicable therapeutic or intervention methods. It was suggested that, in clinical practice, many inherited and chronic metabolic diseases might be prevented or managed through precision nutritional intervention. For generally healthy populations, because lifestyles, dietary factors, genetic factors and environmental exposures vary among individuals, precision nutrition is warranted to improve their physical activity and reduce disease risks. In summary, research and practice are leading toward precision nutrition becoming an integral constituent of clinical nutrition and disease prevention in the era of precision medicine.

  6. Pulsed beams as field probes for precision measurement

    International Nuclear Information System (INIS)

    Hudson, J. J.; Ashworth, H. T.; Kara, D. M.; Tarbutt, M. R.; Sauer, B. E.; Hinds, E. A.

    2007-01-01

    We describe a technique for mapping the spatial variation of static electric, static magnetic, and rf magnetic fields using a pulsed atomic or molecular beam. The method is demonstrated using a beam designed to measure the electric dipole moment of the electron. We present maps of the interaction region, showing sensitivity to (i) electric field variation of 1.5 V/cm at 3.3 kV/cm with a spatial resolution of 15 mm; (ii) magnetic field variation of 5 nT with 25 mm resolution; (iii) radio-frequency magnetic field amplitude with 15 mm resolution. This diagnostic technique is very powerful in the context of high-precision atomic and molecular physics experiments, where pulsed beams have not hitherto found widespread application

  7. A Trace Data-Based Approach for an Accurate Estimation of Precise Utilization Maps in LTE

    Directory of Open Access Journals (Sweden)

    Almudena Sánchez

    2017-01-01

Full Text Available For network planning and optimization purposes, mobile operators make use of Key Performance Indicators (KPIs), computed from Performance Measurements (PMs), to determine whether network performance needs to be improved. In current networks, PMs, and therefore KPIs, suffer from a lack of precision due to insufficient temporal and/or spatial granularity. In this work, an automatic method, based on data traces, is proposed to improve the accuracy of radio network utilization measurements collected in a Long-Term Evolution (LTE) network. The method’s output is an accurate estimate of the spatial and temporal distribution for the cell utilization ratio that can be extended to other indicators. The method can be used to improve automatic network planning and optimization algorithms in a centralized Self-Organizing Network (SON) entity, since potential issues can be more precisely detected and located inside a cell thanks to temporal and spatial precision. The proposed method is tested with real connection traces gathered in a large geographical area of a live LTE network and considers overload problems due to trace file size limitations, which is a key consideration when analysing a large network. Results show how these distributions provide very detailed information on network utilization, compared to cell based statistics.
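The kind of trace aggregation such a method relies on can be sketched in miniature: fold per-connection trace rows into a (time slot, position bin) utilization map. The row format, bin labels, and PRB budget below are invented purely for illustration and do not reflect the paper's trace schema.

```python
from collections import defaultdict

PRB_TOTAL = 100        # PRBs available per slot (assumed budget)

def utilization_map(traces, slot_seconds=60):
    """Fold (timestamp, position bin, PRBs used) trace rows into a
    (time slot, position bin) -> utilization-ratio map."""
    used = defaultdict(float)
    for t, pos_bin, prbs in traces:
        used[(int(t // slot_seconds), pos_bin)] += prbs
    return {key: total / PRB_TOTAL for key, total in used.items()}

# Invented example rows: two connections in the first minute, one in the second.
traces = [(5, "near", 30), (20, "far", 50), (70, "near", 10)]
print(utilization_map(traces))
# → {(0, 'near'): 0.3, (0, 'far'): 0.5, (1, 'near'): 0.1}
```

Per-bin maps like this are what let a SON entity localize an overload to part of a cell instead of averaging it away in a cell-level KPI.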

  8. High-precision half-life determination for the superallowed β+ emitter Ga62

    Science.gov (United States)

    Grinyer, G. F.; Finlay, P.; Svensson, C. E.; Ball, G. C.; Leslie, J. R.; Austin, R. A. E.; Bandyopadhyay, D.; Chaffey, A.; Chakrawarthy, R. S.; Garrett, P. E.; Hackman, G.; Hyland, B.; Kanungo, R.; Leach, K. G.; Mattoon, C. M.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Ressler, J. J.; Sarazin, F.; Savajols, H.; Schumaker, M. A.; Wong, J.

    2008-01-01

    The half-life of the superallowed β+ emitter Ga62 has been measured at TRIUMF's Isotope Separator and Accelerator facility using a fast-tape-transport system and 4π continuous-flow gas proportional counter to detect the positrons from the decay of Ga62 to the daughter Zn62. The result, T1/2=116.100±0.025 ms, represents the most precise measurement to date (0.022%) for any superallowed β-decay half-life. When combined with six previous measurements of the Ga62 half-life, a new world average of T1/2=116.121±0.021 ms is obtained. This new half-life measurement results in a 20% improvement in the precision of the Ga62 superallowed ft value while reducing its mean by 0.9σ to ft=3074.3(12) s. The impact of this half-life measurement on precision tests of the CVC hypothesis and isospin symmetry breaking corrections for A⩾62 superallowed decays is discussed.
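The world average quoted above is the standard inverse-variance weighted combination of independent measurements. A minimal sketch follows; the second measurement below is invented to make the arithmetic concrete, whereas the real average pools six previous results.

```python
def weighted_average(measurements):
    """Inverse-variance weighted mean and uncertainty for (value, sigma) pairs."""
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    mean = sum(w * x for (x, _), w in zip(measurements, weights)) / sum(weights)
    return mean, sum(weights) ** -0.5

# The 0.050 ms measurement is invented purely for illustration.
m, s = weighted_average([(116.100, 0.025), (116.180, 0.050)])
print(round(m, 3), round(s, 3))  # → 116.116 0.022
```

The more precise measurement dominates: its weight is four times that of the invented one, so the combined value sits much closer to 116.100 ms.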

  9. Abiotic and biotic controls of spatial pattern at alpine treeline

    Science.gov (United States)

    Malanson, George P.; Xiao, Ningchuan; Alftine, K.J.; Bekker, Mathew; Butler, David R.; Brown, Daniel G.; Cairns, David M.; Fagre, Daniel; Walsh, Stephen J.

    2000-01-01

    At alpine treeline, trees and krummholz forms affect the environment in ways that increase their growth and reproduction. We assess the way in which these positive feedbacks combine in spatial patterns to alter the environment in the neighborhood of existing plants. The research is significant because areas of alpine tundra are susceptible to encroachment by woody species as climate changes. Moreover, understanding the general processes of plant invasion is important. The importance of spatial pattern has been recognized, but the spatial pattern of positive feedbacks per se has not been explored in depth. We present a linked set of models of vegetation change at an alpine forest-tundra ecotone. Our aim is to create models that are as simple as possible in order to test specific hypotheses. We present results from a model of the resource averaging hypothesis and the positive feedback switch hypothesis of treelines. We compare the patterns generated by the models to patterns observed in fine scale remotely sensed data.

  10. The Literal Translation Hypothesis in ESP Teaching/Learning Environments

    Directory of Open Access Journals (Sweden)

    Pedro A. Fuertes-Olivera

    2015-11-01

Full Text Available Research on the characteristics of specialized vocabulary usually replicates studies that deal with general words, e.g. they typically describe frequent terms and focus on their linguistic characteristics to aid in the learning and acquisition of the terms. We dispute this practice, as we believe that the basic characteristic of terms is that they are coined to restrict meaning, i.e. to be as precise and as specific as possible in a particular context. For instance, around 70% of English and Spanish accounting terms are multi-word terms, most of which contain more than three orthographic words that syntactically behave in a way that is very different from the syntactic behaviour of the node on which they are formed (Fuertes-Olivera and Tarp, forthcoming). This has prompted us to propose a research framework that investigates whether or not the literal translation hypothesis, which has been addressed in several areas of translation studies, can also be applied in ESP teaching/learning environments. If plausible, the assumptions on which this hypothesis is based can shed light on how learners disambiguate terms they encounter. Within this framework, this paper presents evidence that the literal translation hypothesis is possible in ESP; it offers the results of a pilot study that sheds light on how this hypothesis may work, and also discusses its usability in the context of ESP learning. In particular, this paper presents strategies for teaching multi-word terms that are different from those currently based on corpus data. We believe that exercises such as “cloze”, “fill in” and similar “guessing” exercises must be abandoned in ESP teaching/learning environments. Instead, we propose exercises that reproduce L1 teaching and learning activities, i.e., exercises that are typically used when acquiring specialised knowledge and skills in any domain, e.g. taking part in meetings and giving presentations in a business context.

  11. A precise extragalactic test of General Relativity.

    Science.gov (United States)

    Collett, Thomas E; Oldham, Lindsay J; Smith, Russell J; Auger, Matthew W; Westfall, Kyle B; Bacon, David; Nichol, Robert C; Masters, Karen L; Koyama, Kazuya; van den Bosch, Remco

    2018-06-22

    Einstein's theory of gravity, General Relativity, has been precisely tested on Solar System scales, but the long-range nature of gravity is still poorly constrained. The nearby strong gravitational lens ESO 325-G004 provides a laboratory to probe the weak-field regime of gravity and measure the spatial curvature generated per unit mass, γ. By reconstructing the observed light profile of the lensed arcs and the observed spatially resolved stellar kinematics with a single self-consistent model, we conclude that γ = 0.97 ± 0.09 at 68% confidence. Our result is consistent with the prediction of 1 from General Relativity and provides a strong extragalactic constraint on the weak-field metric of gravity. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  12. The effects of selective and divided attention on sensory precision and integration.

    Science.gov (United States)

    Odegaard, Brian; Wozny, David R; Shams, Ladan

    2016-02-12

    In our daily lives, our capacity to selectively attend to stimuli within or across sensory modalities enables enhanced perception of the surrounding world. While previous research on selective attention has studied this phenomenon extensively, two important questions still remain unanswered: (1) how selective attention to a single modality impacts sensory integration processes, and (2) the mechanism by which selective attention improves perception. We explored how selective attention impacts performance in both a spatial task and a temporal numerosity judgment task, and employed a Bayesian Causal Inference model to investigate the computational mechanism(s) impacted by selective attention. We report three findings: (1) in the spatial domain, selective attention improves precision of the visual sensory representations (which were relatively precise), but not the auditory sensory representations (which were fairly noisy); (2) in the temporal domain, selective attention improves the sensory precision in both modalities (both of which were fairly reliable to begin with); (3) in both tasks, selective attention did not exert a significant influence over the tendency to integrate sensory stimuli. Therefore, it may be postulated that a sensory modality must possess a certain inherent degree of encoding precision in order to benefit from selective attention. It also appears that in certain basic perceptual tasks, the tendency to integrate crossmodal signals does not depend significantly on selective attention. We conclude with a discussion of how these results relate to recent theoretical considerations of selective attention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Precision study of the $\\beta$-decay of $^{74}$Rb

    CERN Multimedia

    Van Duppen, P L E; Lunney, D

    2002-01-01

    We are proposing a high-resolution study of the $\\beta$-decay of $^{74}$Rb in order to extrapolate our precision knowledge of the superallowed $\\beta$-decays from the sd and fp shells towards the medium-heavy Z=N nuclei. The primary goal is to provide new data for testing the CVC hypothesis and the unitarity condition of the CKM matrix of the Standard Model. The presented programme would involve the careful measurements of the decay properties of $^{74}$Rb including the branching ratios to the excited states as well as the precise determination of the decay energy of $^{74}$Rb. The experimental methods readily available at ISOLDE include high-transmission conversion electron spectroscopy, $\\gamma$-ray spectroscopy as well as the measurements of the masses of $^{74}$Rb and $^{74}$Kr using two complementary techniques, ISOLTRAP and MISTRAL. The experiment would rely on a high-quality $^{74}$Rb beam available at ISOLDE with adequate intensity.

  14. The Income Inequality Hypothesis Revisited : Assessing the Hypothesis Using Four Methodological Approaches

    NARCIS (Netherlands)

    Kragten, N.; Rözer, J.

The income inequality hypothesis states that income inequality has a negative effect on individuals' health, partially because it reduces social trust. This article aims to critically assess the income inequality hypothesis by comparing several analytical strategies, namely OLS regression,

  15. A Fixed-Precision Sequential Sampling Plan for the Potato Tuberworm Moth, Phthorimaea operculella Zeller (Lepidoptera: Gelechidae), on Potato Cultivars.

    Science.gov (United States)

    Shahbi, M; Rajabpour, A

    2017-08-01

Phthorimaea operculella Zeller is an important pest of potato in Iran. Spatial distribution and fixed-precision sequential sampling for population estimation of the pest on two potato cultivars, Arinda® and Sante®, were studied in two separate potato fields during two growing seasons (2013-2014 and 2014-2015). Spatial distribution was investigated by Taylor's power law and Iwao's patchiness. Results showed that the spatial distribution of eggs and larvae was random. In contrast to Iwao's patchiness, Taylor's power law provided a highly significant relationship between variance and mean density. Therefore, a fixed-precision sequential sampling plan was developed using Green's model at two precision levels, 0.25 and 0.1. The optimum sample size on the Arinda® and Sante® cultivars at the 0.25 precision level ranged from 151 to 813 and 149 to 802 leaves, respectively. At the 0.1 precision level, the sample sizes varied from 5083 to 1054 and 5100 to 1050 leaves for the Arinda® and Sante® cultivars, respectively. Therefore, the optimum sample sizes for the cultivars, despite their different resistance levels, were not significantly different. According to the calculated stop lines, sampling must continue until the cumulative number of eggs + larvae reaches 15-16 or 96-101 individuals at the precision levels of 0.25 or 0.1, respectively. The performance of the sampling plan was validated by resampling analysis using Resampling for Validation of Sampling Plans software. The sampling plan provided in this study can be used to obtain a rapid estimate of the pest density with minimal effort.
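Green's fixed-precision plan builds directly on the fitted Taylor's power law, s² = a·mᵇ. The sketch below shows the two standard formulas (required sample size at a given mean density, and the stop line for cumulative counts) with placeholder coefficients, not the values fitted in this study.

```python
import math

# Hypothetical Taylor's power law coefficients (s^2 = A * m^B); the study's
# fitted values are not reproduced in the abstract.
A, B = 1.5, 1.3

def required_sample_size(mean_density, precision):
    """Sample units needed for fixed precision D at a given mean density."""
    return math.ceil(A * mean_density ** (B - 2) / precision ** 2)

def green_stop_line(n, precision):
    """Green's stop line: cumulative count at which sampling can stop after n units."""
    return (precision ** 2 / A) ** (1 / (B - 2)) * n ** ((B - 1) / (B - 2))

# Tightening the precision level from 0.25 to 0.1 inflates the required
# sample size by (0.25/0.1)^2 = 6.25, matching the pattern in the abstract.
print(required_sample_size(0.5, 0.25), required_sample_size(0.5, 0.1))  # → 39 244
```

With b < 2, the stop line decreases with n: the longer sampling runs without the cumulative count reaching the line, the denser the population must be, so sampling can stop sooner at the target precision.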

  16. [Spatial mobility on reaching adult age].

    Science.gov (United States)

    De Coninck, F

    1990-12-01

    "Starting with longitudinal data on two cohorts of women living in the Alpes-Maritimes [France] in 1982 (a sample of 1,500 women in total) we try to establish the role of the spatial distribution of opportunities at a number of key stages in the life cycle: marriage, birth of first child, making professional use of qualifications, confrontation of a situation of professional risk and professional mobility during the years immediately following the completion of studies. The underlying hypothesis is that control of social location often depends on the control of spatial location." (SUMMARY IN ENG) excerpt

  17. Analysing the distribution of synaptic vesicles using a spatial point process model

    DEFF Research Database (Denmark)

    Khanmohammadi, Mahdieh; Waagepetersen, Rasmus; Nava, Nicoletta

    2014-01-01

functionality by statistically modelling the distribution of the synaptic vesicles in two groups of rats: a control group subjected to sham stress and a stressed group subjected to a single acute foot-shock (FS)-stress episode. We hypothesize that the synaptic vesicles have different spatial distributions in the two groups. The spatial distributions are modelled using spatial point process models with an inhomogeneous conditional intensity and repulsive pairwise interactions. Our results verify the hypothesis that the two groups have different spatial distributions.

  18. High precision straw tube chamber with cathode readout

    International Nuclear Information System (INIS)

    Bychkov, V.N.; Golutvin, I.A.; Ershov, Yu.V.

    1992-01-01

The high precision straw chamber with cathode readout was constructed and investigated. The 10 mm straws were made of aluminized mylar strip with a transparent longitudinal window. The X coordinate information was taken from the cathode strips as induced charges and analysed via the centroid method. A spatial resolution of σ=120 μm was obtained with a signal/noise ratio of about 60. Possible ways of improving the signal/noise ratio are described. 7 refs.; 8 figs

  19. Precision of systematic and random sampling in clustered populations: habitat patches and aggregating organisms.

    Science.gov (United States)

    McGarvey, Richard; Burch, Paul; Matthews, Janet M

    2016-01-01

    Natural populations of plants and animals spatially cluster because (1) suitable habitat is patchy, and (2) within suitable habitat, individuals aggregate further into clusters of higher density. We compare the precision of random and systematic field sampling survey designs under these two processes of species clustering. Second, we evaluate the performance of 13 estimators for the variance of the sample mean from a systematic survey. Replicated simulated surveys, as counts from 100 transects, allocated either randomly or systematically within the study region, were used to estimate population density in six spatial point populations including habitat patches and Matérn circular clustered aggregations of organisms, together and in combination. The standard one-start aligned systematic survey design, a uniform 10 x 10 grid of transects, was much more precise. Variances of the 10 000 replicated systematic survey mean densities were one-third to one-fifth of those from randomly allocated transects, implying transect sample sizes giving equivalent precision by random survey would need to be three to five times larger. Organisms being restricted to patches of habitat was alone sufficient to yield this precision advantage for the systematic design. But this improved precision for systematic sampling in clustered populations is underestimated by standard variance estimators used to compute confidence intervals. True variance for the survey sample mean was computed from the variance of 10 000 simulated survey mean estimates. Testing 10 published and three newly proposed variance estimators, the two variance estimators (v) that corrected for inter-transect correlation (ν₈ and ν(W)) were the most accurate and also the most precise in clustered populations. These greatly outperformed the two "post-stratification" variance estimators (ν₂ and ν₃) that are now more commonly applied in systematic surveys. Similar variance estimator performance rankings were found with
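The precision advantage of systematic allocation in a patchy population can be reproduced in a toy simulation (an invented one-dimensional population, not the paper's six point populations). The patch here spans a whole multiple of the grid step, a deliberately extreme case: every one-start aligned systematic sample then hits the patch equally often, so the between-survey variance collapses, while random allocation leaves it large.

```python
import random
from statistics import mean, pvariance

# Invented 1-D population: 100 transect positions, density 20 inside one
# habitat patch (positions 30-49) and 1 elsewhere.
population = [20 if 30 <= i < 50 else 1 for i in range(100)]

def systematic_survey_means(pop, step=10):
    """All one-start aligned systematic samples (every step-th position)."""
    return [mean(pop[start::step]) for start in range(step)]

def random_survey_means(pop, n=10, reps=5000, seed=42):
    """Monte Carlo distribution of the mean under simple random sampling."""
    rng = random.Random(seed)
    return [mean(rng.sample(pop, n)) for _ in range(reps)]

var_sys = pvariance(systematic_survey_means(population))
var_rand = pvariance(random_survey_means(population))
print(var_sys < var_rand)  # → True
```

Every systematic start captures exactly two patch positions, so all ten possible survey means coincide; random draws of ten transects capture anywhere from zero to many patch positions, inflating the variance of the estimate.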

  20. No functional role of attention-based rehearsal in maintenance of spatial working memory representations

    NARCIS (Netherlands)

    Belopolsky, A.V.; Theeuwes, J.

    2009-01-01

    The present study systematically examined the role of attention in maintenance of spatial representations in working memory as proposed by the attention-based rehearsal hypothesis [Awh, E., Jonides, J., & Reuter-Lorenz, P. A. (1998). Rehearsal in spatial working memory. Journal of Experimental

  1. Mutual repression enhances the steepness and precision of gene expression boundaries.

    Directory of Open Access Journals (Sweden)

    Thomas R Sokolowski

    Full Text Available Embryonic development is driven by spatial patterns of gene expression that determine the fate of each cell in the embryo. While gene expression is often highly erratic, embryonic development is usually exceedingly precise. In particular, gene expression boundaries are robust not only against intra-embryonic fluctuations such as noise in gene expression and protein diffusion, but also against embryo-to-embryo variations in the morphogen gradients, which provide positional information to the differentiating cells. How development is robust against intra- and inter-embryonic variations is not understood. A common motif in the gene regulation networks that control embryonic development is mutual repression between pairs of genes. To assess the role of mutual repression in the robust formation of gene expression patterns, we have performed large-scale stochastic simulations of a minimal model of two mutually repressing gap genes in Drosophila, hunchback (hb) and knirps (kni). Our model includes not only mutual repression between hb and kni, but also the stochastic and cooperative activation of hb by the anterior morphogen Bicoid (Bcd) and of kni by the posterior morphogen Caudal (Cad), as well as the diffusion of Hb and Kni between neighboring nuclei. Our analysis reveals that mutual repression can markedly increase the steepness and precision of the gap gene expression boundaries. In contrast to other mechanisms such as spatial averaging and cooperative gene activation, mutual repression thus allows for gene-expression boundaries that are both steep and precise. Moreover, mutual repression dramatically enhances their robustness against embryo-to-embryo variations in the morphogen levels. 
Finally, our simulations reveal that diffusion of the gap proteins plays a critical role not only in reducing the width of the gap gene expression boundaries via the mechanism of spatial averaging, but also in repairing patterning errors that could arise because of the

  2. HD 101065, the Most Peculiar Star: First Results from Precise Radial ...

    Indian Academy of Sciences (India)

    Abstract. In this paper we discuss the prospects for asteroseismology with spatial resolution and motivate studies of the most chemically peculiar roAp star, HD 101065. We present the first results from a high-precision radial velocity (RV) study of HD 101065 based on data spanning four nights that were acquired using the ...

  3. On the Keyhole Hypothesis

    DEFF Research Database (Denmark)

    Mikkelsen, Kaare B.; Kidmose, Preben; Hansen, Lars Kai

    2017-01-01

    We propose and test the keyhole hypothesis that measurements from low dimensional EEG, such as ear-EEG, reflect a broadly distributed set of neural processes. We formulate the keyhole hypothesis in information theoretical terms. The experimental investigation is based on legacy data consisting of 10... simultaneously recorded scalp EEG. A cross-validation procedure was employed to ensure unbiased estimates. We present several pieces of evidence in support of the keyhole hypothesis: there is a high mutual information between data acquired at scalp electrodes and through the ear-EEG "keyhole," furthermore we...

  4. Precision of DVC approaches for strain analysis in bone imaged with μCT at different dimensional levels.

    Science.gov (United States)

    Dall'Ara, Enrico; Peña-Fernández, Marta; Palanca, Marco; Giorgi, Mario; Cristofolini, Luca; Tozzi, Gianluca

    2017-11-01

    Accurate measurement of local strain in heterogeneous and anisotropic bone tissue is fundamental to understand the pathophysiology of musculoskeletal diseases, to evaluate the effect of interventions from preclinical studies, and to optimize the design and delivery of biomaterials. Digital volume correlation (DVC) can be used to measure the three-dimensional displacement and strain fields from micro-Computed Tomography (µCT) images of loaded specimens. However, this approach is affected by the quality of the input images, by the morphology and density of the tissue under investigation, by the correlation scheme, and by the operational parameters used in the computation. Therefore, for each application the precision of the method should be evaluated. In this paper we present the results collected from datasets analyzed in previous studies as well as new data from a recent experimental campaign for characterizing the relationship between the precision of two different DVC approaches and the spatial resolution of the outputs. Different bone structures scanned with laboratory source µCT or Synchrotron light µCT (SRµCT) were processed in zero-strain tests to evaluate the precision of the DVC methods as a function of the subvolume size, which ranged from 8 to 2500 micrometers. The results confirmed that for every microstructure the precision of DVC improves for larger subvolume size, following power laws. However, for the first time large differences in the precision of both local and global DVC approaches have been highlighted when SRµCT or in vivo µCT images were used instead of conventional ex vivo µCT. These findings suggest that in situ mechanical testing protocols applied in SRµCT facilities should be optimized in order to allow DVC analyses of localized strain measurements. Moreover, for in vivo µCT applications DVC analyses should be performed only with relatively coarse spatial resolution for achieving a reasonable precision of the method. In conclusion
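    The reported power-law dependence of precision on subvolume size is usually characterized by a fit in log-log space. A minimal sketch, using invented zero-strain precision values (not data from the datasets above):

```python
import numpy as np

# Hypothetical zero-strain precision (microstrain) at several subvolume
# sizes (voxels), mimicking the power-law behaviour reported for DVC.
subvolume = np.array([16, 32, 48, 64, 96, 128], dtype=float)
precision = np.array([850.0, 310.0, 170.0, 115.0, 62.0, 41.0])

# Fit precision = a * subvolume**b as a straight line in log-log space.
b, log_a = np.polyfit(np.log(subvolume), np.log(precision), 1)
a = np.exp(log_a)
print(f"precision ≈ {a:.1f} * size^{b:.2f}")  # b < 0: larger subvolumes improve precision
```

    The negative exponent b quantifies the trade-off discussed above: precision improves with subvolume size at the cost of coarser spatial resolution of the output field.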

  5. The biophysical model for accuracy of cellular sensing spatial gradients of multiple chemoattractants

    International Nuclear Information System (INIS)

    Chang, Qiang; Zuo, Li

    2013-01-01

    Spatial gradients of surrounding chemoattractants are the key factors in determining the directionality of eukaryotic cell movement. Thus, it is important for cells to accurately measure the spatial gradients of surrounding chemoattractants. Here, we study the precision of sensing the spatial gradients of multiple chemoattractants using cooperative receptor clusters. Cooperative receptors on cells are modeled as an Ising chain of Monod–Wyman–Changeux clusters subject to multiple chemical-gradient fields, in order to study the physical limits of sensing the spatial gradients of multiple chemoattractants. We found that eukaryotic cells cannot sense each chemoattractant gradient individually. Instead, cells can only sense a weighted sum of the surrounding chemical gradients. Moreover, the precision of sensing one chemical gradient is significantly affected by coexisting chemoattractant concentrations. These findings can provide further insight into the role of chemoattractants in the immune response and help develop novel treatments for inflammatory diseases. (paper)

  6. A digital x-ray imaging MWPC detector system for precision absorptiometry

    International Nuclear Information System (INIS)

    Batemen, J.E.; Connolly, J.F.; Glasgow, W.

    1977-11-01

    An X-ray absorptiometric imaging system (based on a xenon-filled multiwire proportional counter) has been developed with high counting rate capability, good spatial resolution and linear mass response, aimed at permitting bone mass measurements to be made in the peripheral skeleton with precision approaching 1%. The system is described and preliminary results on test phantoms are presented. (author)

  7. Approaches to informed consent for hypothesis-testing and hypothesis-generating clinical genomics research.

    Science.gov (United States)

    Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G

    2012-10-10

    Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.

  8. G-DOC Plus - an integrative bioinformatics platform for precision medicine.

    Science.gov (United States)

    Bhuvaneshwar, Krithika; Belouali, Anas; Singh, Varun; Johnson, Robert M; Song, Lei; Alaoui, Adil; Harris, Michael A; Clarke, Robert; Weiner, Louis M; Gusev, Yuriy; Madhavan, Subha

    2016-04-30

    G-DOC Plus is a data integration and bioinformatics platform that uses cloud computing and other advanced computational tools to handle a variety of biomedical big data, including gene expression arrays, NGS and medical images, so that they can be analyzed in the full context of other omics and clinical information. G-DOC Plus currently holds data from over 10,000 patients selected from private and public resources including Gene Expression Omnibus (GEO), The Cancer Genome Atlas (TCGA) and the recently added datasets from the REpository for Molecular BRAin Neoplasia DaTa (REMBRANDT), caArray studies of lung and colon cancer, ImmPort and the 1000 Genomes data sets. The system allows researchers to explore clinical-omic data one sample at a time, as a cohort of samples, or at the level of a population, providing the user with a comprehensive view of the data. G-DOC Plus tools have been leveraged in cancer and non-cancer studies for hypothesis generation and validation, biomarker discovery and multi-omics analysis; to explore somatic mutations and cancer MRI images; and for training and graduate education in bioinformatics, data and computational sciences. Several of these use cases are described in this paper to demonstrate its multifaceted usability. G-DOC Plus can be used to support a variety of user groups in multiple domains to enable hypothesis generation for precision medicine research. The long-term vision of G-DOC Plus is to extend this translational bioinformatics platform to stay current with emerging omics technologies and analysis methods, to continue supporting novel hypothesis generation, analysis and validation for integrative biomedical research. By integrating several aspects of the disease and exposing various data elements, such as outpatient lab workup, pathology, radiology, current treatments, molecular signatures and expected outcomes over a web interface, G-DOC Plus will continue to strengthen precision medicine research. 
G-DOC Plus is available

  9. Spatial attention enhances the selective integration of activity from area MT.

    Science.gov (United States)

    Masse, Nicolas Y; Herrington, Todd M; Cook, Erik P

    2012-09-01

    Distinguishing which of the many proposed neural mechanisms of spatial attention actually underlies behavioral improvements in visually guided tasks has been difficult. One attractive hypothesis is that attention allows downstream neural circuits to selectively integrate responses from the most informative sensory neurons. This would allow behavioral performance to be based on the highest-quality signals available in visual cortex. We examined this hypothesis by asking how spatial attention affects both the stimulus sensitivity of middle temporal (MT) neurons and their corresponding correlation with behavior. Analyzing a data set pooled from two experiments involving four monkeys, we found that spatial attention did not appreciably affect either the stimulus sensitivity of the neurons or the correlation between their activity and behavior. However, for those sessions in which there was a robust behavioral effect of attention, focusing attention inside the neuron's receptive field significantly increased the correlation between these two metrics, an indication of selective integration. These results suggest that, similar to mechanisms proposed for the neural basis of perceptual learning, the behavioral benefits of focusing spatial attention are attributable to selective integration of neural activity from visual cortical areas by their downstream targets.

  10. Instrument-induced spatial crosstalk deconvolution algorithm

    Science.gov (United States)

    Wright, Valerie G.; Evans, Nathan L., Jr.

    1986-01-01

    An algorithm has been developed which reduces the effects of (deconvolves) instrument-induced spatial crosstalk in satellite image data by several orders of magnitude where highly precise radiometry is required. The algorithm is based upon radiance transfer ratios, which are defined as the fractional bilateral exchange of energy between pixels A and B.

  11. Negative affect improves the quality of memories: trading capacity for precision in sensory and working memory.

    Science.gov (United States)

    Spachtholz, Philipp; Kuhbandner, Christof; Pekrun, Reinhard

    2014-08-01

    Research has shown that negative affect reduces working memory capacity. Commonly, this effect has been attributed to an allocation of resources to task-irrelevant thoughts, suggesting that negative affect has detrimental consequences for working memory performance. However, rather than simply being a detrimental effect, the affect-induced capacity reduction may reflect a trading of capacity for precision of stored representations. To test this hypothesis, we induced neutral or negative affect and concurrently measured the number and precision of representations stored in sensory and working memory. Compared with neutral affect, negative affect reduced the capacity of both sensory and working memory. However, in both memory systems, this decrease in capacity was accompanied by an increase in precision. These findings demonstrate that observers unintentionally trade capacity for precision as a function of affective state and indicate that negative affect can be beneficial for the quality of memories. PsycINFO Database Record (c) 2014 APA, all rights reserved.
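    The number and precision of stored representations are commonly separated by mixture modelling of recall errors. A minimal sketch with a Gaussian-plus-uniform mixture fit by EM; the data and parameters are simulated assumptions for illustration, not the authors' exact model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy recall errors (degrees): a mixture of in-memory responses
# (narrow normal around 0) and random guesses (uniform over the circle).
errors = np.concatenate([
    rng.normal(0, 12, 700),        # stored items: high precision
    rng.uniform(-180, 180, 300),   # guesses: item not stored
])

# EM for the storage probability p and the precision parameter sd.
# Capacity-like quantity: p * set size; precision: 1 / sd.
p, sd = 0.5, 30.0
for _ in range(200):
    dens_mem = np.exp(-0.5 * (errors / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    dens_guess = 1.0 / 360.0
    resp = p * dens_mem / (p * dens_mem + (1 - p) * dens_guess)  # E-step
    p = resp.mean()                                              # M-step
    sd = np.sqrt((resp * errors ** 2).sum() / resp.sum())
print(f"P(stored) = {p:.2f}, precision SD = {sd:.1f} deg")
```

    In this framing, the affect manipulation described above corresponds to a lower p (fewer items stored) accompanied by a smaller sd (each stored item held more precisely).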

  12. Stochastic precision analysis of 2D cardiac strain estimation in vivo

    International Nuclear Information System (INIS)

    Bunting, E A; Provost, J; Konofagou, E E

    2014-01-01

    Ultrasonic strain imaging has been applied to echocardiography and carries great potential to be used as a tool in the clinical setting. Two-dimensional (2D) strain estimation may be useful when studying the heart due to the complex, 3D deformation of the cardiac tissue. Increasing the frame rate used for motion estimation, i.e., the motion estimation rate (MER), has been shown to improve the precision of the strain estimation, although maintaining the spatial resolution necessary to view the entire heart structure in a single heartbeat remains challenging at high MERs. Two previously developed methods, the temporally unequispaced acquisition sequence (TUAS) and the diverging beam sequence (DBS), have been used in the past to successfully estimate in vivo axial strain at high MERs without compromising spatial resolution. In this study, a stochastic assessment of 2D strain estimation precision is performed in vivo for both sequences at varying MERs (65, 272, 544, 815 Hz for TUAS; 250, 500, 1000, 2000 Hz for DBS). 2D incremental strains were estimated during left ventricular contraction in five healthy volunteers using a normalized cross-correlation function and a least-squares strain estimator. Both sequences were shown capable of estimating 2D incremental strains in vivo. The conditional expected value of the elastographic signal-to-noise ratio (E(SNRe|ε)) was used to compare strain estimation precision of both sequences at multiple MERs over a wide range of clinical strain values. The results here indicate that axial strain estimation precision is much more dependent on MER than lateral strain estimation, while lateral estimation is more affected by strain magnitude. The MER should be increased to at least 544 Hz to avoid suboptimal axial strain estimation. Radial and circumferential strain estimations were influenced by the axial and lateral strain in different ways. Furthermore, the TUAS and DBS were found to be of comparable precision at similar MERs. (paper)
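    A one-dimensional toy version of the estimation pipeline named above (normalized cross-correlation for displacement, least-squares slope for strain) might look like the sketch below. The signal and parameters are synthetic, and real RF processing adds subsample interpolation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "speckle" signal and a uniformly strained copy of it.
n, s = 4000, 0.002  # samples; applied strain (invented)
x = np.arange(n, dtype=float)
pre = np.convolve(rng.normal(size=n), np.ones(8) / 8, mode="same")
post = np.interp((1 - s) * x, x, pre)  # post(x) = pre((1-s)x): u(x) ≈ s*x

win, hop, search = 80, 40, 12
depths, disps = [], []
for start in range(search, n - win - search, hop):
    ref = pre[start:start + win]
    ref0 = ref - ref.mean()
    best_d, best_cc = 0, -np.inf
    for d in range(-search, search + 1):  # integer-lag NCC search
        seg = post[start + d:start + d + win]
        seg0 = seg - seg.mean()
        cc = np.dot(ref0, seg0) / (np.linalg.norm(ref0) * np.linalg.norm(seg0))
        if cc > best_cc:
            best_cc, best_d = cc, d
    depths.append(start)
    disps.append(best_d)

# Least-squares strain estimator: slope of displacement vs. depth.
strain = np.polyfit(depths, disps, 1)[0]
print(f"estimated strain {strain:.2e} (applied {s:.2e})")
```

    Fitting the slope over many overlapping windows is what makes the strain estimate far less noisy than differentiating adjacent displacement estimates directly.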

  13. Plasmonic micropillars for precision cell force measurement across a large field-of-view

    Science.gov (United States)

    Xiao, Fan; Wen, Ximiao; Tan, Xing Haw Marvin; Chiou, Pei-Yu

    2018-01-01

    A plasmonic micropillar platform with self-organized gold nanospheres is reported for precision cell traction force measurement across a large field-of-view (FOV). Gold nanospheres were implanted into the tips of polymer micropillars by annealing gold microdisks with nanosecond laser pulses. Each gold nanosphere is physically anchored in the center of a pillar tip and serves as a strong, point-source-like light scattering center for that micropillar. This allows a micropillar to be clearly observed and precisely tracked even under a low-magnification objective lens, enabling concurrent, precise measurement across a large FOV. A spatial resolution of 30 nm for pillar deflection measurement has been accomplished on this platform with a 20× objective lens.
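    Pillar deflection converts to traction force through the cantilever spring constant. A back-of-the-envelope sketch; all material and geometry numbers below are invented assumptions, not values from this paper:

```python
import math

# Model the micropillar as a cantilever of circular cross-section:
# spring constant k = 3*E*I / L**3, force F = k * deflection.
E = 2.5e6   # Young's modulus of the polymer, Pa (assumed)
d = 2e-6    # pillar diameter, m (assumed)
L = 7e-6    # pillar height, m (assumed)

I = math.pi * d**4 / 64   # second moment of area of a circular cross-section
k = 3 * E * I / L**3      # spring constant, N/m

deflection = 30e-9  # 30 nm, the reported deflection resolution
force_nN = k * deflection * 1e9
print(f"spring constant {k:.3e} N/m, minimal resolvable force {force_nN:.2f} nN")
```

    With these assumed parameters, the 30 nm tracking resolution corresponds to sub-nanonewton force sensitivity, which is the regime relevant for cell traction measurements.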

  14. High-precision branching ratio measurement for the superallowed β+ emitter Ga62

    Science.gov (United States)

    Finlay, P.; Ball, G. C.; Leslie, J. R.; Svensson, C. E.; Towner, I. S.; Austin, R. A. E.; Bandyopadhyay, D.; Chaffey, A.; Chakrawarthy, R. S.; Garrett, P. E.; Grinyer, G. F.; Hackman, G.; Hyland, B.; Kanungo, R.; Leach, K. G.; Mattoon, C. M.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Ressler, J. J.; Sarazin, F.; Savajols, H.; Schumaker, M. A.; Wong, J.

    2008-08-01

    A high-precision branching ratio measurement for the superallowed β+ decay of Ga62 was performed at the Isotope Separator and Accelerator (ISAC) radioactive ion beam facility. The 8π spectrometer, an array of 20 high-purity germanium detectors, was employed to detect the γ rays emitted following Gamow-Teller and nonanalog Fermi β+ decays of Ga62, and the SCEPTAR plastic scintillator array was used to detect the emitted β particles. Thirty γ rays were identified following Ga62 decay, establishing the superallowed branching ratio to be 99.858(8)%. Combined with the world-average half-life and a recent high-precision Q-value measurement for Ga62, this branching ratio yields an ft value of 3074.3±1.1 s, making this one of the most precisely determined superallowed ft values. Comparison between the superallowed ft value determined in this work and the world-average corrected 𝓕t value allows the large nuclear-structure-dependent correction for Ga62 decay to be experimentally determined from the CVC hypothesis to better than 7% of its own value, the most precise experimental determination for any superallowed emitter. These results provide a benchmark for the refinement of the theoretical description of isospin-symmetry breaking in A ≥ 62 superallowed decays.

  15. Cosmological signatures of anisotropic spatial curvature

    International Nuclear Information System (INIS)

    Pereira, Thiago S.; Marugán, Guillermo A. Mena; Carneiro, Saulo

    2015-01-01

    If one is willing to give up the cherished hypothesis of spatial isotropy, many interesting cosmological models can be developed beyond the simple anisotropically expanding scenarios. One interesting possibility is presented by shear-free models in which the anisotropy emerges at the level of the curvature of the homogeneous spatial sections, whereas the expansion is dictated by a single scale factor. We show that such models represent viable alternatives to describe the large-scale structure of the inflationary universe, leading to a kinematically equivalent Sachs-Wolfe effect. Through the definition of a complete set of spatial eigenfunctions we compute the two-point correlation function of scalar perturbations in these models. In addition, we show how such scenarios would modify the spectrum of the CMB assuming that the observations take place in a small patch of a universe with anisotropic curvature

  16. Cosmological signatures of anisotropic spatial curvature

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Thiago S. [Departamento de Física, Universidade Estadual de Londrina, 86057-970, Londrina – PR (Brazil); Marugán, Guillermo A. Mena [Instituto de Estructura de la Materia, IEM-CSIC, Serrano 121, 28006, Madrid (Spain); Carneiro, Saulo, E-mail: tspereira@uel.br, E-mail: mena@iem.cfmac.csic.es, E-mail: saulo.carneiro@pq.cnpq.br [Instituto de Física, Universidade Federal da Bahia, 40210-340, Salvador – BA (Brazil)

    2015-07-01

    If one is willing to give up the cherished hypothesis of spatial isotropy, many interesting cosmological models can be developed beyond the simple anisotropically expanding scenarios. One interesting possibility is presented by shear-free models in which the anisotropy emerges at the level of the curvature of the homogeneous spatial sections, whereas the expansion is dictated by a single scale factor. We show that such models represent viable alternatives to describe the large-scale structure of the inflationary universe, leading to a kinematically equivalent Sachs-Wolfe effect. Through the definition of a complete set of spatial eigenfunctions we compute the two-point correlation function of scalar perturbations in these models. In addition, we show how such scenarios would modify the spectrum of the CMB assuming that the observations take place in a small patch of a universe with anisotropic curvature.

  17. Imaging Optical Frequencies with 100 μHz Precision and 1.1 μm Resolution

    Science.gov (United States)

    Marti, G. Edward; Hutson, Ross B.; Goban, Akihisa; Campbell, Sara L.; Poli, Nicola; Ye, Jun

    2018-03-01

    We implement imaging spectroscopy of the optical clock transition of lattice-trapped degenerate fermionic Sr in the Mott-insulating regime, combining micron spatial resolution with submillihertz spectral precision. We use these tools to demonstrate atomic coherence for up to 15 s on the clock transition and reach a record frequency precision of 2.5 × 10⁻¹⁹. We perform the most rapid evaluation of trapping light shifts and record a 150 mHz linewidth, the narrowest Rabi line shape observed on a coherent optical transition. The important emerging capability of combining high-resolution imaging and spectroscopy will improve the clock precision, and provide a path towards measuring many-body interactions and testing fundamental physics.

  18. Hypothesis in research

    Directory of Open Access Journals (Sweden)

    Eudaldo Enrique Espinoza Freire

    2018-01-01

    Full Text Available This work aims to provide material covering the fundamental content that enables university professors to formulate hypotheses for the development of an investigation, taking into account the problem to be solved. To prepare it, information was sought in primary documents, such as degree theses and reports of research results, selected for their relevance to the subject analyzed, their currency and their reliability, and in secondary documents, such as scientific articles published in journals of recognized prestige, selected using the same criteria as the primary documents. The paper presents an updated conceptualization of the hypothesis, its characterization, and an analysis of the structure of the hypothesis, with particular attention to the determination of variables. The involvement of university professors in the teaching-research process currently faces some difficulties, manifested, among other aspects, in an unstable balance between teaching and research, which leads to a separation between them.

  19. Precise Truss Assembly Using Commodity Parts and Low Precision Welding

    Science.gov (United States)

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, W. R.; Correll, Nikolaus

    2014-01-01

    Hardware and software design and system integration for an intelligent precision jigging robot (IPJR), which allows high precision assembly using commodity parts and low-precision bonding, is described. Preliminary 2D experiments that are motivated by the problem of assembling space telescope optical benches and very large manipulators on orbit using inexpensive, stock hardware and low-precision welding are also described. An IPJR is a robot that acts as the precise "jigging", holding parts of a local structure assembly site in place, while an external low precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (for this prototype, a human using only low precision tools), to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. The analysis of the assembly error and the results of building a square structure and a ring structure are discussed. Options for future work, to extend the IPJR paradigm to building in 3D structures at micron precision are also summarized.

  20. A neural measure of precision in visual working memory.

    Science.gov (United States)

    Ester, Edward F; Anderson, David E; Serences, John T; Awh, Edward

    2013-05-01

    Recent studies suggest that the temporary storage of visual detail in working memory is mediated by sensory recruitment or sustained patterns of stimulus-specific activation within feature-selective regions of visual cortex. According to a strong version of this hypothesis, the relative "quality" of these patterns should determine the clarity of an individual's memory. Here, we provide a direct test of this claim. We used fMRI and a forward encoding model to characterize population-level orientation-selective responses in visual cortex while human participants held an oriented grating in memory. This analysis, which enables a precise quantitative description of multivoxel, population-level activity measured during working memory storage, revealed graded response profiles whose amplitudes were greatest for the remembered orientation and fell monotonically as the angular distance from this orientation increased. Moreover, interparticipant differences in the dispersion-but not the amplitude-of these response profiles were strongly correlated with performance on a concurrent memory recall task. These findings provide important new evidence linking the precision of sustained population-level responses in visual cortex and memory acuity.

  1. Precision Measurement of the Beryllium-7 Solar Neutrino Interaction Rate in Borexino

    Science.gov (United States)

    Saldanha, Richard Nigel

    Solar neutrinos, since their first detection nearly forty years ago, have revealed valuable information regarding the source of energy production in the Sun, and have demonstrated that neutrino oscillations are well described by the Large Mixing Angle (LMA) oscillation parameters with matter interactions due to the Mikheyev-Smirnov-Wolfenstein (MSW) effect. This thesis presents a precision measurement of the 7Be solar neutrino interaction rate within Borexino, an underground liquid scintillator detector that is designed to measure solar neutrino interactions through neutrino-electron elastic scattering. The thesis includes a detailed description of the analysis techniques developed and used for this measurement as well as an evaluation of the relevant systematic uncertainties that affect the precision of the result. The rate of neutrino-electron elastic scattering from 0.862 MeV 7Be neutrinos is determined to be 45.4 ± 1.6 (stat) ± 1.5 (sys) counts/day/100 ton. Due to extensive detector calibrations and improved analysis methods, the systematic uncertainty in the interaction rate has been reduced by more than a factor of two from the previous evaluation. In the no-oscillation hypothesis, the interaction rate corresponds to a 0.862 MeV 7Be electron neutrino flux of (2.75 ± 0.13) × 10⁹ cm⁻² s⁻¹. Including the predicted neutrino flux from the Standard Solar Model yields an electron neutrino survival probability of Pee = 0.51 ± 0.07 and rules out the no-oscillation hypothesis at 5.1σ. The LMA-MSW neutrino oscillation model predicts a transition in the solar Pee value between low (sub-MeV) and high (>10 MeV) energies, which has not yet been experimentally confirmed. This result, in conjunction with the Standard Solar Model, represents the most precise measurement of the electron neutrino survival probability for solar neutrinos at sub-MeV energies.

  2. A high precision straw tube chamber with cathode readout

    International Nuclear Information System (INIS)

    Bychkov, V.N.; Golutvin, I.A.; Ershov, Yu.V.; Zubarev, E.V.; Ivanov, A.B.; Lysiakov, V.N.; Makhankov, A.V.; Movchan, S.A.; Peshekhonov, V.D.; Preda, T.

    1993-01-01

    The high precision straw chamber with cathode readout was constructed and investigated. The 10 mm diameter straws were made of aluminized Mylar with a transparent longitudinal window. The X-coordinate information has been taken from cathode strips as induced charges and investigated with the centroid method. A spatial resolution of σ_x = 103 μm was obtained at a signal-to-noise ratio of about 70. The possible ways to improve the signal-to-noise ratio are discussed. (orig.)
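    The centroid method referred to here is a charge-weighted mean over the strip charges. A minimal sketch with a made-up induced-charge distribution (strip pitch and ADC values are invented for illustration):

```python
import numpy as np

pitch_mm = 2.0  # assumed cathode strip pitch
# Made-up induced charges (ADC counts) on seven adjacent strips: a
# roughly Gaussian charge distribution centered between strips 2 and 4.
charges = np.array([0.0, 0.1, 0.9, 2.0, 0.8, 0.1, 0.0])
strips = np.arange(len(charges))

# Charge-weighted mean strip index, scaled to a coordinate in mm.
x = pitch_mm * np.sum(strips * charges) / np.sum(charges)
print(f"centroid at {x:.3f} mm")  # interpolates well below the strip pitch
```

    Because the induced charge spreads over several strips, the weighted mean interpolates between strip centers, which is how a ~100 μm resolution is obtained from millimeter-pitch strips.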

  3. Is the Aluminum Hypothesis Dead?

    Science.gov (United States)

    2014-01-01

    The Aluminum Hypothesis, the idea that aluminum exposure is involved in the etiology of Alzheimer disease, dates back to a 1965 demonstration that aluminum causes neurofibrillary tangles in the brains of rabbits. Initially the focus of intensive research, the Aluminum Hypothesis has gradually been abandoned by most researchers. Yet, despite this current indifference, the Aluminum Hypothesis continues to attract the attention of a small group of scientists and aluminum continues to be viewed with concern by some of the public. This review article discusses reasons that mainstream science has largely abandoned the Aluminum Hypothesis and explores a possible reason for some in the general public continuing to view aluminum with mistrust. PMID:24806729

  4. Men’s Perception of Raped Women: Test of the Sexually Transmitted Disease Hypothesis and the Cuckoldry Hypothesis

    Directory of Open Access Journals (Sweden)

    Prokop Pavol

    2016-06-01

    Full Text Available Rape is a recurrent adaptive problem for female humans and the females of a number of non-human animals. Rape carries various physiological and reproductive costs to the victim. These costs are furthermore exaggerated by social rejection and blaming of the victim, particularly by men. The negative perception of raped women by men has received little attention from an evolutionary perspective. Across two independent studies, we investigated whether the risk of sexually transmitted diseases (the STD hypothesis, Hypothesis 1) or paternity uncertainty (the cuckoldry hypothesis, Hypothesis 2) influences the negative perception of raped women by men. Raped women received lower attractiveness scores than non-raped women, especially for long-term mate attractiveness. The perceived attractiveness of raped women was not influenced by the presence of experimentally manipulated STD cues on the faces of putative rapists. Women raped by three men received lower attractiveness scores than women raped by one man. These results provide stronger support for the cuckoldry hypothesis (Hypothesis 2) than for the STD hypothesis (Hypothesis 1). Single men perceived raped women as more attractive than did men in a committed relationship (Hypothesis 3), suggesting that mating opportunities mediate men's perception of victims of rape. Overall, our results suggest that the negative perception of victims of rape by men is driven by the risk of cuckoldry rather than by the fear of disease transmission.

  5. From ear to body: the auditory-motor loop in spatial cognition.

    Science.gov (United States)

    Viaud-Delmon, Isabelle; Warusfel, Olivier

    2014-01-01

    Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space. A wireless tracking system (video infrared with passive markers) was used to send the coordinates of the subject's head to the sound rendering system. The rendering system used advanced HRTF-based synthesis of directional cues and room acoustic simulation for the auralization of a realistic acoustic environment. Participants were guided blindfolded in an experimental room. Their task was to explore a delimited area in order to find a hidden auditory target, i.e., a sound that was triggered only when walking on a precise location of the area. The position of this target could be coded in relation to auditory landmarks constantly rendered during the exploration of the area. The task was composed of a practice trial, 6 acquisition trials during which participants had to memorize the localization of the target, and 4 test trials in which some aspects of the auditory scene were modified. The task ended with a probe trial in which the auditory target was removed. The configuration of search paths allowed us to observe how auditory information was coded to memorize the position of the target, and suggested that space can be efficiently coded without visual information in normally sighted subjects. In conclusion, space representation can be based on sensorimotor and auditory cues only, providing another argument in favor of the hypothesis that the brain has access to a modality-invariant representation of external space.

  6. From ear to body: the auditory-motor loop in spatial cognition

    Directory of Open Access Journals (Sweden)

    Isabelle eViaud-Delmon

    2014-09-01

    Full Text Available Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space. A wireless tracking system (video infrared with passive markers) was used to send the coordinates of the subject's head to the sound rendering system. The rendering system used advanced HRTF-based synthesis of directional cues and room acoustic simulation for the auralization of a realistic acoustic environment. Participants were guided blindfolded in an experimental room. Their task was to explore a delimited area in order to find a hidden auditory target, i.e. a sound that was triggered only when walking on a precise location of the area. The position of this target could be coded in relation to auditory landmarks constantly rendered during the exploration of the area. The task was composed of a practice trial, 6 acquisition trials during which participants had to memorise the localisation of the target, and 4 test trials in which some aspects of the auditory scene were modified. The task ended with a probe trial in which the auditory target was removed. The configuration of search paths allowed us to observe how auditory information was coded to memorise the position of the target, and suggested that space can be efficiently coded without visual information in normally sighted subjects. In conclusion, space representation can be based on sensorimotor and auditory cues only, providing another argument in favour of the hypothesis that the brain has access to a modality-invariant representation of external space.

  7. rpe v5: an emulator for reduced floating-point precision in large numerical simulations

    Science.gov (United States)

    Dawson, Andrew; Düben, Peter D.

    2017-06-01

    This paper describes the rpe (reduced-precision emulator) library which has the capability to emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the result of their simulations without having to make extensive code changes or port the model onto specialized hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes to the program to allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for an application under consideration while still achieving results of acceptable quality, computational cost can be reduced, since a reduction in numerical precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to a reduction in precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members to improve predictions. rpe was developed with a particular focus on the community of weather and climate modelling, but the software could be used with numerical simulations from other domains.
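
    The core mechanism — rounding the result of every operation back to a reduced number of significand bits — can be sketched in a few lines of Python. This is an illustrative stand-in, not the rpe Fortran API; the function name `reduce_precision` and the 10-bit setting are arbitrary choices for the example.

```python
import numpy as np

def reduce_precision(x, sbits):
    """Round float64 values to `sbits` significand bits, emulating
    reduced-precision arithmetic (illustrative, not the rpe API)."""
    mantissa, exponent = np.frexp(np.asarray(x, dtype=np.float64))
    scale = 2.0 ** sbits
    return np.ldexp(np.round(mantissa * scale) / scale, exponent)

# Accumulate 0.1 one thousand times at full and at 10-bit precision.
full = 0.0
reduced = 0.0
for _ in range(1000):
    full += 0.1
    reduced = float(reduce_precision(reduced + reduce_precision(0.1, 10), 10))
```

    Comparing `full` and `reduced` against the exact sum 100 shows how quickly rounding error grows at low precision; this is the kind of sensitivity experiment the emulator is designed to make cheap.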

  8. Spatial price transmission and market integration of Cistanthera ...

    African Journals Online (AJOL)

    Spatial price transmission and market integration of Cistanthera papaverifera species in ... the null hypothesis of non-stationarity at their levels at 1% and 5% significance level. ... This study concludes that sawn-wood marketing in Delta State has a high degree of ...

  9. High-precision relative position and attitude measurement for on-orbit maintenance of spacecraft

    Science.gov (United States)

    Zhu, Bing; Chen, Feng; Li, Dongdong; Wang, Ying

    2018-02-01

    In order to realize long-term on-orbit operation of spacecraft such as satellites and space stations, the life of a spacecraft can be extended not only through long-life design of its devices but also through on-orbit servicing and maintenance. Precise and detailed maintenance of key components is therefore necessary. In this paper, a high-precision relative position and attitude measurement method for use in the maintenance of key components is given. The method mainly considers the design of the passive cooperative marker, the light-emitting device, and the high-resolution camera in the presence of spatial stray light and noise. By using a series of algorithms, such as background elimination, feature extraction, and position and attitude calculation, the high-precision relative pose parameters between the key operating parts and the maintenance equipment are obtained and serve as input to the control system. The simulation results show that the algorithm is accurate and effective, satisfying the requirements of the precision operation technique.
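
    The front end of such a pipeline — background elimination by thresholding, then feature extraction of the bright marker blobs — can be illustrated with a minimal blob-centroid detector. Everything here (the threshold, the 4-connected flood fill, the synthetic frame) is a simplified stand-in for the actual marker-detection algorithms described above, not the paper's method.

```python
import numpy as np

def marker_centroids(img, thresh):
    """Background elimination (threshold) + feature extraction:
    return the centroid of each connected bright blob (4-connectivity)."""
    mask = img > thresh
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            if mask[i, j] and not seen[i, j]:
                stack, pixels = [(i, j)], []
                seen[i, j] = True
                while stack:  # flood-fill one blob
                    a, b = stack.pop()
                    pixels.append((a, b))
                    for x, y in ((a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)):
                        if 0 <= x < rows and 0 <= y < cols and mask[x, y] and not seen[x, y]:
                            seen[x, y] = True
                            stack.append((x, y))
                centroids.append(tuple(np.mean(pixels, axis=0)))
    return centroids

# Synthetic frame: dark background with two bright marker blobs.
frame = np.zeros((20, 20))
frame[2:5, 2:5] = 1.0          # blob centred at (3.0, 3.0)
frame[10:14, 12:16] = 1.0      # blob centred at (11.5, 13.5)
cents = marker_centroids(frame, 0.5)
```

    In a real system these sub-pixel centroids would feed the subsequent pose-calculation step (e.g. a PnP solve against the known marker geometry).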

  10. Spatial variability in branchial basket meristics and morphology of ...

    African Journals Online (AJOL)

    We examined spatial variability in meristic and morphological characteristics of the branchial basket of sardine Sardinops sagax collected from four geographical regions around the southern African coast, namely Namibia and the South African west, south and east coasts. Our analysis tested the hypothesis of three putative ...

  11. Speech cues contribute to audiovisual spatial integration.

    Directory of Open Access Journals (Sweden)

    Christopher W Bishop

    Full Text Available Speech is the most important form of human communication but ambient sounds and competing talkers often degrade its acoustics. Fortunately the brain can use visual information, especially its highly precise spatial information, to improve speech comprehension in noisy environments. Previous studies have demonstrated that audiovisual integration depends strongly on spatiotemporal factors. However, some integrative phenomena such as McGurk interference persist even with gross spatial disparities, suggesting that spatial alignment is not necessary for robust integration of audiovisual place-of-articulation cues. It is therefore unclear how speech-cues interact with audiovisual spatial integration mechanisms. Here, we combine two well established psychophysical phenomena, the McGurk effect and the ventriloquist's illusion, to explore this dependency. Our results demonstrate that conflicting spatial cues may not interfere with audiovisual integration of speech, but conflicting speech-cues can impede integration in space. This suggests a direct but asymmetrical influence between ventral 'what' and dorsal 'where' pathways.

  12. Spatial forecast of landslides in three gorges based on spatial data mining.

    Science.gov (United States)

    Wang, Xianmin; Niu, Ruiqing

    2009-01-01

    The Three Gorges is a region with a very high landslide distribution density and a concentrated population. Landslide disasters occur frequently in Three Gorges, and the potential risk of landslides is tremendous. In this paper, focusing on Three Gorges, which has a complicated landform, spatial forecasting of landslides is studied by establishing 20 forecast factors (spectra, texture, vegetation coverage, water level of reservoir, slope structure, engineering rock group, elevation, slope, aspect, etc.). China-Brazil Earth Resources Satellite (CBERS) images were adopted, and a C4.5 decision tree was used to mine spatial landslide forecast criteria for Guojiaba Town (Zhigui County) in Three Gorges; based on this knowledge, intelligent spatial landslide forecasts were performed for Guojiaba Town. All landslides lie in the dangerous and unstable regions, so the forecast result is good. The method proposed in the paper is compared with seven other methods: IsoData, K-Means, Mahalanobis Distance, Maximum Likelihood, Minimum Distance, Parallelepiped and Information Content Model. The experimental results show that the method proposed in this paper has a forecast precision noticeably higher than that of the other seven methods.
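
    C4.5 grows its tree by splitting, at each node, on the attribute with the highest information gain ratio. A minimal sketch of that criterion on toy, entirely hypothetical forecast-factor values (the paper's real factors are spectra, texture, slope structure, and so on):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(values, labels):
    """C4.5 splitting criterion: information gain divided by split information."""
    n = len(labels)
    cond_entropy, split_info = 0.0, 0.0
    for v in set(values):
        sub = [labels[i] for i, x in enumerate(values) if x == v]
        p = len(sub) / n
        cond_entropy += p * entropy(sub)
        split_info -= p * math.log2(p)
    gain = entropy(labels) - cond_entropy
    return gain / split_info if split_info > 0 else 0.0

# Hypothetical factor values for four terrain cells (1 = landslide occurred).
slope = ["steep", "steep", "gentle", "gentle"]
rock = ["soft", "hard", "soft", "hard"]
landslide = [1, 1, 0, 0]
```

    Here `slope` separates the classes perfectly (gain ratio 1) while `rock` carries no information (gain ratio 0), so C4.5 would split on slope first; the mined tree's branches then read off as forecast criteria.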

  13. Precision Oncology: Between Vaguely Right and Precisely Wrong.

    Science.gov (United States)

    Brock, Amy; Huang, Sui

    2017-12-01

    Precision Oncology seeks to identify and target the mutation that drives a tumor. Despite its straightforward rationale, concerns about its effectiveness are mounting. What is the biological explanation for the "imprecision"? First, Precision Oncology relies on indiscriminate sequencing of genomes in biopsies that barely represent the heterogeneous mix of tumor cells. Second, findings that defy the orthodoxy of oncogenic "driver mutations" are now accumulating: the ubiquitous presence of oncogenic mutations in silent premalignancies or the dynamic switching without mutations between various cell phenotypes that promote progression. Most troublesome is the observation that cancer cells that survive treatment still will have suffered cytotoxic stress and thereby enter a stem cell-like state, the seeds for recurrence. The benefit of "precision targeting" of mutations is inherently limited by this counterproductive effect. These findings confirm that there is no precise linear causal relationship between tumor genotype and phenotype, a reminder of logician Carveth Read's caution that being vaguely right may be preferable to being precisely wrong. An open-minded embrace of the latest inconvenient findings indicating nongenetic and "imprecise" phenotype dynamics of tumors as summarized in this review will be paramount if Precision Oncology is ultimately to lead to clinical benefits. Cancer Res; 77(23); 6473-9. ©2017 American Association for Cancer Research (AACR).

  14. Neuroticism, intelligence, and intra-individual variability in elementary cognitive tasks: testing the mental noise hypothesis.

    Science.gov (United States)

    Colom, Roberto; Quiroga, Ma Angeles

    2009-08-01

    Some studies show positive correlations between intraindividual variability in elementary speed measures (reflecting processing efficiency) and individual differences in neuroticism (reflecting instability in behaviour). The so-called neural noise hypothesis assumes that higher levels of noise are related both to smaller indices of processing efficiency and to greater levels of neuroticism. Here, we test this hypothesis by measuring mental speed with three elementary cognitive tasks that tap similar basic processes but vary systematically in content (verbal, numerical, and spatial). Neuroticism and intelligence are also measured. The sample comprised 196 undergraduate psychology students. The results show that (1) processing efficiency is generally unrelated to individual differences in neuroticism, (2) processing speed and efficiency correlate with intelligence, and (3) only the efficiency index is genuinely related to intelligence when the colinearity between speed and efficiency is controlled.

  15. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA).

    Science.gov (United States)

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-01-01

    We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SD(SE)): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SD(SE): 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice.
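
    The heart of the comparison step — composing the two pairwise registrations into one relative stem-versus-bone transform and reading off translation and rotation parameters — can be sketched as follows. The 4x4 homogeneous-matrix convention and the z-only rotation are simplifications for illustration; the actual CTSA method expresses the result in an anatomically sound coordinate system and also locates the point of maximal migration.

```python
import numpy as np

def rigid(tx, ty, tz, rz_deg):
    """4x4 homogeneous transform: rotation about z followed by translation."""
    r = np.radians(rz_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(r), -np.sin(r)], [np.sin(r), np.cos(r)]]
    T[:3, 3] = [tx, ty, tz]
    return T

def relative_migration(T_stem, T_bone):
    """Compose the stem and bone registrations into the relative
    stem-vs-bone displacement, expressed in the stem frame."""
    M = np.linalg.inv(T_stem) @ T_bone
    translation = M[:3, 3]
    rz = np.degrees(np.arctan2(M[1, 0], M[0, 0]))  # rotation about z
    return translation, rz

# Example: between scans the stem registration moved 0.2 mm along x and
# rotated 0.1 degrees, while the bone registration is unchanged.
t, ang = relative_migration(rigid(0.2, 0, 0, 0.1), rigid(0, 0, 0, 0))
```

    If stem and bone move together (no loosening), the two registrations agree and `relative_migration` returns the identity: zero translation and zero rotation.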

  16. Derivation and precision of mean field electrodynamics with mesoscale fluctuations

    Science.gov (United States)

    Zhou, Hongzhe; Blackman, Eric G.

    2018-06-01

    Mean field electrodynamics (MFE) facilitates practical modelling of secular, large scale properties of astrophysical or laboratory systems with fluctuations. Practitioners commonly assume wide scale separation between mean and fluctuating quantities, to justify equality of ensemble and spatial or temporal averages. Often however, real systems do not exhibit such scale separation. This raises two questions: (I) What are the appropriate generalized equations of MFE in the presence of mesoscale fluctuations? (II) How precise are theoretical predictions from MFE? We address both by first deriving the equations of MFE for different types of averaging, along with mesoscale correction terms that depend on the ratio of averaging scale to variation scale of the mean. We then show that even if these terms are small, predictions of MFE can still have a significant precision error. This error has an intrinsic contribution from the dynamo input parameters and a filtering contribution from differences in the way observations and theory are projected through the measurement kernel. Minimizing the sum of these contributions can produce an optimal scale of averaging that makes the theory maximally precise. The precision error is important to quantify when comparing to observations because it quantifies the resolution of predictive power. We exemplify these principles for galactic dynamos, comment on broader implications, and identify possibilities for further work.

  17. Semantic Features, Perceptual Expectations, and Frequency as Factors in the Learning of Polar Spatial Adjective Concepts.

    Science.gov (United States)

    Dunckley, Candida J. Lutes; Radtke, Robert C.

    Two semantic theories of word learning, a perceptual complexity hypothesis (H. Clark, 1970) and a quantitative complexity hypothesis (E. Clark, 1972) were tested by teaching 24 preschoolers and 16 college students CVC labels for five polar spatial adjective concepts having single word representations in English, and for three having no direct…

  18. Semantic elaboration in auditory and visual spatial memory.

    Science.gov (United States)

    Taevs, Meghan; Dahmani, Louisa; Zatorre, Robert J; Bohbot, Véronique D

    2010-01-01

    The aim of this study was to investigate the hypothesis that semantic information facilitates auditory and visual spatial learning and memory. An auditory spatial task was administered, whereby healthy participants were placed in the center of a semi-circle that contained an array of speakers where the locations of nameable and non-nameable sounds were learned. In the visual spatial task, locations of pictures of abstract art intermixed with nameable objects were learned by presenting these items in specific locations on a computer screen. Participants took part in both the auditory and visual spatial tasks, which were counterbalanced for order and were learned at the same rate. Results showed that learning and memory for the spatial locations of nameable sounds and pictures was significantly better than for non-nameable stimuli. Interestingly, there was a cross-modal learning effect such that the auditory task facilitated learning of the visual task and vice versa. In conclusion, our results support the hypotheses that the semantic representation of items, as well as the presentation of items in different modalities, facilitate spatial learning and memory.

  19. Evaluation of the Precision of Satellite-Derived Sea Surface Temperature Fields

    Science.gov (United States)

    Wu, F.; Cornillon, P. C.; Guan, L.

    2016-02-01

    A great deal of attention has been focused on the temporal accuracy of satellite-derived sea surface temperature (SST) fields with little attention being given to their spatial precision. Specifically, the primary measure of the quality of SST fields has been the bias and variance of selected values minus co-located (in space and time) in situ values. Contributing values, determined by the location of the in situ values and the necessity that the satellite-derived values be cloud free, are generally widely separated in space and time hence provide little information related to the pixel-to-pixel uncertainty in the retrievals. But the main contribution to the uncertainty in satellite-derived SST retrievals relates to atmospheric contamination and because the spatial scales of atmospheric features are, in general, large compared with the pixel separation of modern infra-red sensors, the pixel-to-pixel uncertainty is often smaller than the accuracy determined from in situ match-ups. This makes selection of satellite-derived datasets for the study of submesoscale processes, for which the spatial structure of the upper ocean is significant, problematic. In this presentation we present a methodology to characterize the spatial precision of satellite-derived SST fields. The method is based on an examination of the high wavenumber tail of the 2-D spectrum of SST fields in the Sargasso Sea, a low energy region of the ocean close to the track of the MV Oleander, a container ship making weekly roundtrips between New York and Bermuda, with engine intake temperatures sampled every 75 m along track. Important spectral characteristics are the point at which the satellite-derived spectra separate from the Oleander spectra and the spectral slope following separation. In this presentation a number of high resolution 375 m to 10 km SST datasets are evaluated based on this approach.
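
    The diagnostic described here — the wavenumber at which a retrieval's spectrum peels away from the in situ spectrum and flattens into a noise floor — can be sketched with a synthetic along-track transect. The k⁻² signal spectrum, the noise level, and the wavenumber bands below are arbitrary illustrative choices, not properties of any real SST product.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dx = 4096, 0.075                      # samples, spacing in km (75 m along track)

# Synthetic "ocean" transect with a red (k^-2) power spectrum...
k = np.fft.rfftfreq(n, d=dx)[1:]
spec = (k ** -1.0) * np.exp(1j * rng.uniform(0, 2 * np.pi, k.size))
truth = np.fft.irfft(np.concatenate(([0.0 + 0.0j], spec)), n)
# ...plus white "instrument" noise, which imposes a flat noise floor.
noisy = truth + 0.01 * rng.standard_normal(n)

def spectral_slope(x, dx, kmin, kmax):
    """Least-squares log-log slope of the power spectrum over [kmin, kmax]."""
    power = np.abs(np.fft.rfft(x)) ** 2
    kk = np.fft.rfftfreq(len(x), d=dx)
    sel = (kk >= kmin) & (kk <= kmax)
    return np.polyfit(np.log(kk[sel]), np.log(power[sel]), 1)[0]

low = spectral_slope(noisy, dx, 0.01, 0.1)   # signal-dominated band: slope near -2
high = spectral_slope(noisy, dx, 3.0, 6.0)   # noise-dominated band: much flatter
```

    The wavenumber where the spectrum transitions from the steep low-wavenumber slope to the flat tail marks the pixel-to-pixel precision limit, analogous to the separation point between a satellite-derived spectrum and the Oleander in situ spectrum.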

  20. Rumlig kultur / Spatial Culture

    DEFF Research Database (Denmark)

    RUMLIG KULTUR / SPATIAL CULTURE presents a humanities engagement with the experiential world of the big city. The topics of its 21 chapters range from the sculptor Bjørn Nørgaard and the Bispebjerg Bakke housing development to the sense of place in modern guidebooks. Along the way, thinkers on urban culture such as Steen...... the articles outline a research field for spatial culture in which all manner of sensory and reflective forms come together. Based in humanistic urban studies as practiced in the Department of Arts and Cultural Studies, University of Copenhagen, SPATIAL CULTURE outlines a novel framework for understanding the social...... and cultural environments of the modern and contemporary metropolis. The contributions focus on urban and suburban cultures of Copenhagen, New York, Hong Kong, Berlin and elsewhere, demonstrating how the precise analysis of cultural and artistic phenomena informs a multilayered understanding...

  1. Processing of spatial and non-spatial information in rats with lesions of the medial and lateral entorhinal cortex: Environmental complexity matters.

    Science.gov (United States)

    Rodo, Christophe; Sargolini, Francesca; Save, Etienne

    2017-03-01

    The entorhinal-hippocampal circuitry has been suggested to play an important role in episodic memory, but the contribution of the entorhinal cortex remains elusive. Predominant theories propose that the medial entorhinal cortex (MEC) processes spatial information whereas the lateral entorhinal cortex (LEC) processes non-spatial information. A recent study using an object exploration task has suggested that the involvement of the MEC and LEC in spatial and non-spatial information processing could be modulated by the amount of information to be processed, i.e. environmental complexity. To address this hypothesis we used an object exploration task in which rats with excitotoxic lesions of the MEC and LEC had to detect spatial and non-spatial novelty among a set of objects, and we varied environmental complexity by decreasing the number of objects or the amount of object diversity. Reducing diversity resulted in restored ability to process spatial and non-spatial information in the MEC and LEC groups, respectively. Reducing the number of objects yielded restored ability to process non-spatial information in the LEC group but not the ability to process spatial information in the MEC group. The findings indicate that the MEC and LEC are not strictly necessary for spatial and non-spatial processing but that their involvement depends on the complexity of the information to be processed. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. The relationship between visual working memory and attention: retention of precise colour information in the absence of effects on perceptual selection.

    Science.gov (United States)

    Hollingworth, Andrew; Hwang, Seongmin

    2013-10-19

    We examined the conditions under which a feature value in visual working memory (VWM) recruits visual attention to matching stimuli. Previous work has suggested that VWM supports two qualitatively different states of representation: an active state that interacts with perceptual selection and a passive (or accessory) state that does not. An alternative hypothesis is that VWM supports a single form of representation, with the precision of feature memory controlling whether or not the representation interacts with perceptual selection. The results of three experiments supported the dual-state hypothesis. We established conditions under which participants retained a relatively precise representation of a particular colour. If the colour was immediately task relevant, it reliably recruited attention to matching stimuli. However, if the colour was not immediately task relevant, it failed to interact with perceptual selection. Feature maintenance in VWM is not necessarily equivalent to feature-based attentional selection.

  3. Precision pharmacology for Alzheimer's disease.

    Science.gov (United States)

    Hampel, Harald; Vergallo, Andrea; Aguilar, Lisi Flores; Benda, Norbert; Broich, Karl; Cuello, A Claudio; Cummings, Jeffrey; Dubois, Bruno; Federoff, Howard J; Fiandaca, Massimo; Genthon, Remy; Haberkamp, Marion; Karran, Eric; Mapstone, Mark; Perry, George; Schneider, Lon S; Welikovitch, Lindsay A; Woodcock, Janet; Baldacci, Filippo; Lista, Simone

    2018-04-01

    The complex multifactorial nature of polygenic Alzheimer's disease (AD) presents significant challenges for drug development. AD pathophysiology is progressing in a non-linear dynamic fashion across multiple systems levels - from molecules to organ systems - and through adaptation, to compensation, and decompensation to systems failure. Adaptation and compensation maintain homeostasis: a dynamic equilibrium resulting from the dynamic non-linear interaction between genome, epigenome, and environment. An individual vulnerability to stressors exists on the basis of individual triggers, drivers, and thresholds accounting for the initiation and failure of adaptive and compensatory responses. Consequently, the distinct pattern of AD pathophysiology in space and time must be investigated on the basis of the individual biological makeup. This requires the implementation of systems biology and neurophysiology to facilitate Precision Medicine (PM) and Precision Pharmacology (PP). The regulation of several processes at multiple levels of complexity from gene expression to cellular cycle to tissue repair and system-wide network activation has different time delays (temporal scale) according to the affected systems (spatial scale). The initial failure might originate and occur at every level potentially affecting the whole dynamic interrelated systems within an organism. Unraveling the spatial and temporal dynamics of non-linear pathophysiological mechanisms across the continuum of hierarchical self-organized systems levels and from systems homeostasis to systems failure is key to understand AD. Measuring and, possibly, controlling space- and time-scaled adaptive and compensatory responses occurring during AD will represent a crucial step to achieve the capacity to substantially modify the disease course and progression at the best suitable timepoints, thus counteracting disrupting critical pathophysiological inputs. 
This approach will provide the conceptual basis for effective

  4. Spatial modelling and ecology of Echinococcus multilocularis transmission in China.

    Science.gov (United States)

    Danson, F Mark; Giraudoux, Patrick; Craig, Philip S

    2006-01-01

    Recent research in central China has suggested that the most likely transmission mechanism for Echinococcus multilocularis to humans is via domestic dogs which are allowed to roam freely and hunt (infected) small mammals within areas close to villages or in areas of tented pasture. This assertion has led to the hypothesis that there is a landscape control on transmission risk since the proximity of suitable habitat for susceptible small mammals appears to be the key. We have tested this hypothesis in a number of endemic areas in China, notably south Gansu Province and the Tibetan region of western Sichuan Province. The fundamental landscape control is its effect at a regional scale on small mammal species assemblages (susceptible species are not ubiquitous) and, at a local scale, the spatial distributions of small mammal populations. To date the research has examined relationships between landscape composition and patterns of human infection, landscape and small mammal distributions and recently the relationships between landscape and dog infection rates. The key tool to characterize landscape is satellite remote sensing and these data are used as inputs to drive spatial models of transmission risk. This paper reviews the progress that has been made so far in spatial modeling of the ecology of E. multilocularis with particular reference to China, outlines current research issues, and describes a framework for building a spatial-temporal model of transmission ecology.

  5. Comparison of Three Plot Selection Methods for Estimating Change in Temporally Variable, Spatially Clustered Populations.

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, William L. [Bonneville Power Administration, Portland, OR (US). Environment, Fish and Wildlife

    2001-07-01

    Monitoring population numbers is important for assessing trends and meeting various legislative mandates. However, sampling across time introduces a temporal aspect to survey design in addition to the spatial one. For instance, a sample that is initially representative may lose this attribute if there is a shift in numbers and/or spatial distribution in the underlying population that is not reflected in later sampled plots. Plot selection methods that account for this temporal variability will produce the best trend estimates. Consequently, I used simulation to compare bias and relative precision of estimates of population change among stratified and unstratified sampling designs based on permanent, temporary, and partial replacement plots under varying levels of spatial clustering, density, and temporal shifting of populations. Permanent plots produced more precise estimates of change than temporary plots across all factors. Further, permanent plots performed better than partial replacement plots except for high density (5 and 10 individuals per plot) and 25% - 50% shifts in the population. Stratified designs always produced less precise estimates of population change for all three plot selection methods, and often produced biased change estimates and greatly inflated variance estimates under sampling with partial replacement. Hence, stratification that remains fixed across time should be avoided when monitoring populations that are likely to exhibit large changes in numbers and/or spatial distribution during the study period. Key words: bias; change estimation; monitoring; permanent plots; relative precision; sampling with partial replacement; temporary plots.
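
    The advantage of permanent plots comes from the plot-level correlation between sampling occasions, which cancels out of the difference estimator. A toy Monte Carlo (with made-up population parameters, not those of the paper's simulations) illustrates the precision gap:

```python
import numpy as np

rng = np.random.default_rng(42)

def change_sd(permanent, n_plots=400, n_sample=50, trials=2000):
    """SD of the estimated mean change in plot counts between two occasions,
    sampling either the same (permanent) or freshly drawn (temporary) plots."""
    estimates = []
    for _ in range(trials):
        t1 = rng.poisson(5.0, n_plots).astype(float)   # counts at occasion 1
        t2 = 0.8 * t1 + rng.poisson(1.0, n_plots)      # correlated decline
        s1 = rng.choice(n_plots, n_sample, replace=False)
        s2 = s1 if permanent else rng.choice(n_plots, n_sample, replace=False)
        estimates.append(t2[s2].mean() - t1[s1].mean())
    return float(np.std(estimates))

sd_permanent = change_sd(permanent=True)
sd_temporary = change_sd(permanent=False)
```

    Because the plot counts at the two occasions are strongly correlated, revisiting the same plots removes most of the between-plot variance from the change estimate, so `sd_permanent` comes out well below `sd_temporary`.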

  6. The role of gestures in spatial working memory and speech.

    Science.gov (United States)

    Morsella, Ezequiel; Krauss, Robert M

    2004-01-01

    Co-speech gestures traditionally have been considered communicative, but they may also serve other functions. For example, hand-arm movements seem to facilitate both spatial working memory and speech production. It has been proposed that gestures facilitate speech indirectly by sustaining spatial representations in working memory. Alternatively, gestures may affect speech production directly by activating embodied semantic representations involved in lexical search. Consistent with the first hypothesis, we found participants gestured more when describing visual objects from memory and when describing objects that were difficult to remember and encode verbally. However, they also gestured when describing a visually accessible object, and gesture restriction produced dysfluent speech even when spatial memory was untaxed, suggesting that gestures can directly affect both spatial memory and lexical retrieval.

  7. Spatial scale and β-diversity of terrestrial vertebrates in Mexico

    OpenAIRE

    Ochoa-Ochoa, Leticia M.; Munguía, Mariana; Lira-Noriega, Andrés; Sánchez-Cordero, Víctor; Flores-Villela, Oscar; Navarro-Sigüenza, Adolfo; Rodríguez, Pilar

    2014-01-01

Patterns of diversity are scale dependent, and beta-diversity is no exception. Mexico is megadiverse due to its high beta diversity, but little is known about whether this diversity is scale- and/or taxon-dependent. We explored these questions based on the self-similarity hypothesis of beta-diversity across spatial scales. Using geographic distribution ranges of 2 513 species, we compared the beta-diversity patterns of 4 groups of terrestrial vertebrates, across 7 spatial scales (from ~10 km² to 16...

  8. Remote sensing and GIS integration: Towards intelligent imagery within a spatial data infrastructure

    Science.gov (United States)

    Abdelrahim, Mohamed Mahmoud Hosny

    2001-11-01

    In this research, an "Intelligent Imagery System Prototype" (IISP) was developed. IISP is an integration tool that facilitates the environment for active, direct, and on-the-fly usage of high resolution imagery, internally linked to hidden GIS vector layers, to query the real world phenomena and, consequently, to perform exploratory types of spatial analysis based on a clear/undisturbed image scene. The IISP was designed and implemented using the software components approach to verify the hypothesis that a fully rectified, partially rectified, or even unrectified digital image can be internally linked to a variety of different hidden vector databases/layers covering the end user area of interest, and consequently may be reliably used directly as a base for "on-the-fly" querying of real-world phenomena and for performing exploratory types of spatial analysis. Within IISP, differentially rectified, partially rectified (namely, IKONOS GEOCARTERRA(TM)), and unrectified imagery (namely, scanned aerial photographs and captured video frames) were investigated. The system was designed to handle four types of spatial functions, namely, pointing query, polygon/line-based image query, database query, and buffering. The system was developed using ESRI MapObjects 2.0a as the core spatial component within Visual Basic 6.0. When used to perform the pre-defined spatial queries using different combinations of image and vector data, the IISP provided the same results as those obtained by querying pre-processed vector layers even when the image used was not orthorectified and the vector layers had different parameters. In addition, the real-time pixel location orthorectification technique developed and presented within the IKONOS GEOCARTERRA(TM) case provided a horizontal accuracy (RMSE) of +/- 2.75 metres. This accuracy is very close to the accuracy level obtained when purchasing the orthorectified IKONOS PRECISION products (RMSE of +/- 1.9 metre). 
The latter cost approximately four

  9. Putting people on the map: protecting confidentiality with linked social-spatial data

    National Research Council Canada - National Science Library

    Panel on Confidentiality Issues Arising from the Integration of Remotely Sensed and Self-Identifying Data, National Research Council

    2007-01-01

    Precise, accurate spatial information linked to social and behavioral data is revolutionizing social science by opening new questions for investigation and improving understanding of human behavior...

  10. a High-Precision Branching-Ratio Measurement for the Superallowed β+ Emitter 74Rb

    Science.gov (United States)

    Dunlop, R.; Chagnon-Lessard, S.; Finlay, P.; Garrett, P. E.; Hadinia, B.; Leach, K. G.; Svensson, C. E.; Wong, J.; Ball, G.; Garnsworthy, A. B.; Glister, J.; Hackman, G.; Tardiff, E. R.; Triambak, S.; Williams, S. J.; Leslie, J. R.; Andreoiu, C.; Chester, A.; Cross, D.; Starosta, K.; Yates, S. W.; Zganjar, E. F.

    2013-03-01

Precision measurements of superallowed Fermi beta decay allow for tests of Cabibbo-Kobayashi-Maskawa (CKM) matrix unitarity, the conserved vector current hypothesis, and the magnitude of isospin-symmetry-breaking effects in nuclei. A high-precision measurement of the branching ratio for the β+ decay of 74Rb has been performed at the Isotope Separator and ACcelerator (ISAC) facility at TRIUMF. The 8π spectrometer, an array of 20 close-packed HPGe detectors, was used to detect gamma rays emitted following the decay of 74Rb. PACES, an array of 5 Si(Li) detectors, was used to detect emitted conversion electrons, while SCEPTAR, an array of plastic scintillators, was used to detect emitted beta particles. A total of 51 γ rays have been identified following the decay of 21 excited states in the daughter nucleus 74Kr.

  11. THE EFFECT OF BASIC MOTOR ABILITIES ON DRIBBLING SPEED AND PRECISION IN SOCCER GAME

    OpenAIRE

    Ismail Selimović; Mehmeti Ejup

    2011-01-01

The effects of basic motor skills on situational-motor abilities (speed dribbling and ball-control precision) in soccer were analyzed with regression analysis in boys aged 12-14 years. For this purpose, 17 variables for basic motor parameters were selected, as well as three situational tests. In each regression analysis, the results confirmed the hypothesis of significant effects of the morphological characteristics on the results in a...

  12. SPATIAL UNCERTAINTY IN LINE-SURFACE INTERSECTIONS WITH APPLICATIONS TO PHOTOGRAMMETRY

    Directory of Open Access Journals (Sweden)

    J. Marshall

    2012-07-01

The fields of photogrammetry and computer vision routinely use line-surface intersections to determine the point where a line intersects with a surface. The object coordinates of the intersection point can be found using standard geometric and numeric algorithms; however, expressing the spatial uncertainty at the intersection point may be challenging, especially when the surface morphology is complex. This paper describes an empirical method to characterize the unknown spatial uncertainty at the intersection point by propagating random errors in the stochastic model using repeated random sampling methods. These methods accommodate complex surface morphology and nonlinearities in the functional model; however, the penalty is that the resulting probability density function associated with the intersection point may be non-Gaussian in nature. A formal hypothesis test is presented to show that straightforward statistical inference tools are available whether or not the data are Gaussian. The hypothesis test determines whether the computed intersection point is consistent with an externally derived known truth point. A numerical example demonstrates the approach in a photogrammetric setting with a single frame image and a gridded terrain elevation model. The results show that uncertainties produced by the proposed empirical method are intuitive and can be assessed with conventional methods found in textbook hypothesis testing.
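The repeated-random-sampling idea in this record can be sketched in two dimensions: perturb a ray's pointing direction, intersect each perturbed ray with a terrain surface, and summarise the empirical spread of the intersection points. The camera geometry, terrain plane and noise level below are hypothetical, not the paper's example:

```python
import math
import random
import statistics

random.seed(7)

# Hypothetical setup: a sensor at height H looks down at off-nadir angle
# THETA; the terrain is a plane z = SLOPE * x. Gaussian pointing noise is
# propagated to the ground intersection by repeated random sampling.
H = 100.0
THETA = math.radians(30.0)     # nominal off-nadir angle
SIGMA = math.radians(0.1)      # 1-sigma pointing noise
SLOPE = 0.2                    # terrain gradient dz/dx

def intersect_x(theta):
    # Ray: (x, z) = (0, H) + t (sin t, -cos t); plane: z = SLOPE * x.
    # Solving H - t*cos(theta) = SLOPE * t * sin(theta) for t:
    t = H / (math.cos(theta) + SLOPE * math.sin(theta))
    return t * math.sin(theta)

samples = [intersect_x(random.gauss(THETA, SIGMA)) for _ in range(5000)]
nominal = intersect_x(THETA)
spread = statistics.stdev(samples)
print(f"nominal ground x = {nominal:.2f} m, empirical 1-sigma = {spread:.3f} m")
```

With a rougher, non-planar surface the same loop still works, which is the method's appeal: only the intersection routine changes, and the sample histogram directly exposes any non-Gaussian shape.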

  13. A spatial scan statistic for nonisotropic two-level risk cluster.

    Science.gov (United States)

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2012-01-30

    Spatial scan statistic methods are commonly used for geographical disease surveillance and cluster detection. The standard spatial scan statistic does not model any variability in the underlying risks of subregions belonging to a detected cluster. For a multilevel risk cluster, the isotonic spatial scan statistic could model a centralized high-risk kernel in the cluster. Because variations in disease risks are anisotropic owing to different social, economical, or transport factors, the real high-risk kernel will not necessarily take the central place in a whole cluster area. We propose a spatial scan statistic for a nonisotropic two-level risk cluster, which could be used to detect a whole cluster and a noncentralized high-risk kernel within the cluster simultaneously. The performance of the three methods was evaluated through an intensive simulation study. Our proposed nonisotropic two-level method showed better power and geographical precision with two-level risk cluster scenarios, especially for a noncentralized high-risk kernel. Our proposed method is illustrated using the hand-foot-mouth disease data in Pingdu City, Shandong, China in May 2009, compared with two other methods. In this practical study, the nonisotropic two-level method is the only way to precisely detect a high-risk area in a detected whole cluster. Copyright © 2011 John Wiley & Sons, Ltd.
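The standard (single-level) Poisson scan statistic that this record generalises can be sketched in one dimension: slide a window over the regions and keep the window maximising a Kulldorff-style log-likelihood ratio. The case counts and populations below are hypothetical toy data, with an elevated-risk run in regions 4-6:

```python
import math

# Toy case counts and populations per region (hypothetical).
cases = [2, 3, 2, 1, 9, 11, 8, 2, 3, 2]
pop   = [100] * 10
C, P = sum(cases), sum(pop)

def llr(c, e):
    # Poisson log-likelihood ratio for observed count c vs. expectation e,
    # scored only for high-risk (c > e) windows.
    if c <= e:
        return 0.0
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

best, best_window = 0.0, None
for i in range(len(cases)):
    for j in range(i, len(cases)):
        c = sum(cases[i:j + 1])
        e = C * sum(pop[i:j + 1]) / P      # expected cases under the null
        score = llr(c, e)
        if score > best:
            best, best_window = score, (i, j)

print(f"most likely cluster: regions {best_window}, LLR = {best:.2f}")
```

The nonisotropic two-level method of the paper goes further, scanning for a high-risk kernel inside the detected window rather than assuming a single uniform risk; significance would normally be assessed by Monte Carlo replication of the null.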

  14. Reassessing the Trade-off Hypothesis

    DEFF Research Database (Denmark)

    Rosas, Guillermo; Manzetti, Luigi

    2015-01-01

    Do economic conditions drive voters to punish politicians that tolerate corruption? Previous scholarly work contends that citizens in young democracies support corrupt governments that are capable of promoting good economic outcomes, the so-called trade-off hypothesis. We test this hypothesis based...

  15. Spatially coupled LDPC coding in cooperative wireless networks

    NARCIS (Netherlands)

    Jayakody, D.N.K.; Skachek, V.; Chen, B.

    2016-01-01

This paper proposes a novel technique of spatially coupled low-density parity-check (SC-LDPC) code-based soft forwarding relaying scheme for a two-way relay system. We introduce array-based optimized SC-LDPC codes in relay channels. A more precise model is proposed to characterize the residual

  16. Regionalisation of a distributed method for flood quantiles estimation: Revaluation of local calibration hypothesis to enhance the spatial structure of the optimised parameter

    Science.gov (United States)

    Odry, Jean; Arnaud, Patrick

    2016-04-01

The SHYREG method (Aubert et al., 2014) associates a stochastic rainfall generator and a rainfall-runoff model to produce rainfall and flood quantiles on a 1 km2 mesh covering the whole French territory. The rainfall generator is based on the description of rainy events by descriptive variables following probability distributions and is characterised by a high stability. This stochastic generator is fully regionalised, and the rainfall-runoff transformation is calibrated with a single parameter. Thanks to the stability of the approach, calibration can be performed against only flood quantiles associated with observed frequencies, which can be extracted from relatively short time series. The aggregation of SHYREG flood quantiles to the catchment scale is performed using an areal reduction factor technique unique over the whole territory. Past studies demonstrated the accuracy of SHYREG flood quantile estimation for catchments where flow data are available (Arnaud et al., 2015). Nevertheless, the parameter of the rainfall-runoff model is independently calibrated for each target catchment. As a consequence, this parameter plays a corrective role and compensates for approximations and modelling errors, which makes it difficult to identify its proper spatial pattern. It is an inherent objective of the SHYREG approach to be completely regionalised in order to provide a complete and accurate flood quantile database throughout France. Consequently, it appears necessary to identify the model configuration in which the calibrated parameter could be regionalised with acceptable performance. Re-evaluating some of the method's hypotheses is a necessary step before regionalisation. In particular, including or modifying the spatial variability of imposed parameters (such as production and transfer reservoir size, base flow addition and the quantile aggregation function) should lead to more realistic values of the only calibrated parameter.
The objective of the work presented

  17. Precision of jaw-closing movements for different jaw gaps.

    Science.gov (United States)

    Hellmann, Daniel; Becker, Georg; Giannakopoulos, Nikolaos N; Eberhard, Lydia; Fingerhut, Christopher; Rammelsberg, Peter; Schindler, Hans J

    2014-02-01

    Jaw-closing movements are basic components of physiological motor actions precisely achieving intercuspation without significant interference. The main purpose of this study was to test the hypothesis that, despite an imperfect intercuspal position, the precision of jaw-closing movements fluctuates within the range of physiological closing movements indispensable for meeting intercuspation without significant interference. For 35 healthy subjects, condylar and incisal point positions for fast and slow jaw-closing, interrupted at different jaw gaps by the use of frontal occlusal plateaus, were compared with uninterrupted physiological jaw closing, with identical jaw gaps, using a telemetric system for measuring jaw position. Examiner-guided centric relation served as a clinically relevant reference position. For jaw gaps ≤4 mm, no significant horizontal or vertical displacement differences were observed for the incisal or condylar points among physiological, fast, and slow jaw-closing. However, the jaw positions under these three closing conditions differed significantly from guided centric relation for nearly all experimental jaw gaps. The findings provide evidence of stringent neuromuscular control of jaw-closing movements in the vicinity of intercuspation. These results might be of clinical relevance to occlusal intervention with different objectives. © 2013 Eur J Oral Sci.

  18. Spatial attention improves the quality of population codes in human visual cortex.

    Science.gov (United States)

    Saproo, Sameer; Serences, John T

    2010-08-01

    Selective attention enables sensory input from behaviorally relevant stimuli to be processed in greater detail, so that these stimuli can more accurately influence thoughts, actions, and future goals. Attention has been shown to modulate the spiking activity of single feature-selective neurons that encode basic stimulus properties (color, orientation, etc.). However, the combined output from many such neurons is required to form stable representations of relevant objects and little empirical work has formally investigated the relationship between attentional modulations on population responses and improvements in encoding precision. Here, we used functional MRI and voxel-based feature tuning functions to show that spatial attention induces a multiplicative scaling in orientation-selective population response profiles in early visual cortex. In turn, this multiplicative scaling correlates with an improvement in encoding precision, as evidenced by a concurrent increase in the mutual information between population responses and the orientation of attended stimuli. These data therefore demonstrate how multiplicative scaling of neural responses provides at least one mechanism by which spatial attention may improve the encoding precision of population codes. Increased encoding precision in early visual areas may then enhance the speed and accuracy of perceptual decisions computed by higher-order neural mechanisms.

  19. Spatial Working Memory Is Necessary for Actions to Guide Thought

    Science.gov (United States)

    Thomas, Laura E.

    2013-01-01

    Directed actions can play a causal role in cognition, shaping thought processes. What drives this cross-talk between action and thought? I investigated the hypothesis that representations in spatial working memory mediate interactions between directed actions and problem solving. Participants attempted to solve an insight problem while…

  20. The Interrelationship of Sex, Visual Spatial Abilities, and Mathematical Problem Solving Ability in Grade Seven. Parts 1, 2, and 3.

    Science.gov (United States)

    Schonberger, Ann Koch

    This three-volume report deals with the hypothesis that males are more successful at solving mathematical and spatial problems than females. The general relationship between visual spatial abilities and mathematical problem-solving ability is also investigated. The research sample consisted of seventh graders. Each pupil took five spatial tests…

  1. Tests of the lunar hypothesis

    Science.gov (United States)

    Taylor, S. R.

    1984-01-01

    The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.

  2. Effective monopole potential for SU(2) lattice gluodynamics in spatial maximal Abelian gauge

    International Nuclear Information System (INIS)

    Chernodub, M.N.; Polikarpov, M.I.; Veselov, A.I.

    1999-01-01

We investigate the dual superconductor hypothesis in finite-temperature SU(2) lattice gluodynamics in the Spatial Maximal Abelian gauge. This gauge is more physical than the ordinary Maximal Abelian gauge due to the absence of non-localities in the temporal direction. We show numerically that in the Spatial Maximal Abelian gauge the probability distribution of the abelian monopole field is consistent with the dual superconductor mechanism of confinement.

  3. The impact of path crossing on visuo-spatial serial memory: encoding or rehearsal effect?

    Science.gov (United States)

    Parmentier, Fabrice B R; Andrés, Pilar

    2006-11-01

The determinants of visuo-spatial serial memory have been the object of little research, despite early evidence that not all sequences are equally remembered. Recently, empirical evidence was reported indicating that the complexity of the path formed by the to-be-remembered locations impacts recall performance, defined for example by the presence of crossings in the path formed by successive locations (Parmentier, Elford, & Maybery, 2005). In this study, we examined whether this effect reflects rehearsal or encoding processes. We examined the effect of a retention interval and spatial interference on the ordered recall of spatial sequences with and without path crossings. Path crossings decreased recall performance, as did a retention interval. In line with the encoding hypothesis, but in contrast with the rehearsal hypothesis, the effect of crossing was affected neither by the retention interval nor by tapping. The possible nature of the impact of path crossing on encoding mechanisms is discussed.

  4. Motor Asymmetry and Substantia Nigra Volume Are Related to Spatial Delayed Response Performance in Parkinson Disease

    Science.gov (United States)

    Foster, Erin R.; Black, Kevin J.; Antenor-Dorsey, Jo Ann V.; Perlmutter, Joel S.; Hershey, Tamara

    2008-01-01

    Studies suggest motor deficit asymmetry may help predict the pattern of cognitive impairment in individuals with Parkinson disease (PD). We tested this hypothesis using a highly validated and sensitive spatial memory task, spatial delayed response (SDR), and clinical and neuroimaging measures of PD asymmetry. We predicted SDR performance would be…

  5. A Self-Organizing Spatial Clustering Approach to Support Large-Scale Network RTK Systems.

    Science.gov (United States)

    Shen, Lili; Guo, Jiming; Wang, Lei

    2018-06-06

The network real-time kinematic (RTK) technique can provide centimeter-level real time positioning solutions and play a key role in geo-spatial infrastructure. With ever-increasing popularity, network RTK systems will face issues in the support of large numbers of concurrent users. In the past, high-precision positioning services were oriented towards professionals and only supported a few concurrent users. Currently, precise positioning provides a spatial foundation for artificial intelligence (AI), and countless smart devices (autonomous cars, unmanned aerial vehicles (UAVs), robotic equipment, etc.) require precise positioning services. Therefore, the development of approaches to support large-scale network RTK systems is urgent. In this study, we proposed a self-organizing spatial clustering (SOSC) approach which automatically clusters online users to reduce the computational load on the network RTK system server side. The experimental results indicate that both the SOSC algorithm and the grid algorithm can reduce the computational load efficiently, while the SOSC algorithm gives a more elastic and adaptive clustering solution with different datasets. The SOSC algorithm determines the cluster number and the mean distance to cluster center (MDTCC) according to the data set, while the grid approaches are all predefined. The side-effects of clustering algorithms on the user side are analyzed with real global navigation satellite system (GNSS) data sets. The experimental results indicate that 10 km can be safely used as the cluster radius threshold for the SOSC algorithm without significantly reducing the positioning precision and reliability on the user side.
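The general idea of grouping online users under a cluster radius threshold can be sketched with a minimal leader-style (radius-threshold) clustering pass. This is not the authors' SOSC algorithm, only an illustration of the clustering goal it serves; the user coordinates and the 10 km threshold from the record are used as hypothetical inputs:

```python
import math

RADIUS_KM = 10.0   # cluster radius threshold reported safe in the record

def haversine_km(a, b):
    # Great-circle distance between two (lat, lon) points in km.
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def leader_cluster(users, radius_km=RADIUS_KM):
    # Assign each user to the first cluster leader within radius_km;
    # otherwise the user founds a new cluster.
    centres, members = [], []
    for u in users:
        for i, c in enumerate(centres):
            if haversine_km(u, c) <= radius_km:
                members[i].append(u)
                break
        else:
            centres.append(u)
            members.append([u])
    return centres, members

# Hypothetical user positions (lat, lon): two groups roughly 55 km apart.
users = [(30.50, 114.30), (30.52, 114.32), (30.48, 114.28),
         (31.00, 114.90), (31.02, 114.88)]
centres, members = leader_cluster(users)
print(f"{len(centres)} clusters for {len(users)} users")
```

Each cluster can then be served by one set of network RTK corrections, so the server-side load scales with the number of clusters rather than the number of users.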

  6. Validity of the Taylor hypothesis for linear kinetic waves in the weakly collisional solar wind

    International Nuclear Information System (INIS)

    Howes, G. G.; Klein, K. G.; TenBarge, J. M.

    2014-01-01

    The interpretation of single-point spacecraft measurements of solar wind turbulence is complicated by the fact that the measurements are made in a frame of reference in relative motion with respect to the turbulent plasma. The Taylor hypothesis—that temporal fluctuations measured by a stationary probe in a rapidly flowing fluid are dominated by the advection of spatial structures in the fluid rest frame—is often assumed to simplify the analysis. But measurements of turbulence in upcoming missions, such as Solar Probe Plus, threaten to violate the Taylor hypothesis, either due to slow flow of the plasma with respect to the spacecraft or to the dispersive nature of the plasma fluctuations at small scales. Assuming that the frequency of the turbulent fluctuations is characterized by the frequency of the linear waves supported by the plasma, we evaluate the validity of the Taylor hypothesis for the linear kinetic wave modes in the weakly collisional solar wind. The analysis predicts that a dissipation range of solar wind turbulence supported by whistler waves is likely to violate the Taylor hypothesis, while one supported by kinetic Alfvén waves is not.

  7. The Role of Spatial Memory and Frames of Reference in the Precision of Angular Path Integration

    OpenAIRE

    Arthur, Joeanna C.; Philbeck, John W.; Kleene, Nicholas J.; Chichka, David

    2012-01-01

    Angular path integration refers to the ability to maintain an estimate of self-location after a rotational displacement by integrating internally-generated (idiothetic) self-motion signals over time. Previous work has found that non-sensory inputs, namely spatial memory, can play a powerful role in angular path integration (Arthur et al., 2007, 2009). Here we investigated the conditions under which spatial memory facilitates angular path integration. We hypothesized that the benefit of spatia...

  8. Inverse meta-modelling to estimate soil available water capacity at high spatial resolution across a farm

    NARCIS (Netherlands)

    Florin, M.J.; McBratney, A.B.; Whelan, B.M.; Minasny, B.

    2011-01-01

    Geo-referenced information on crop production that is both spatially- and temporally-dense would be useful for management in precision agriculture (PA). Crop yield monitors provide spatially but not temporally dense information. Crop growth simulation modelling can provide temporal density, but

  9. Women match men when learning a spatial skill.

    Science.gov (United States)

    Spence, Ian; Yu, Jingjie Jessica; Feng, Jing; Marshman, Jeff

    2009-07-01

    Meta-analytic studies have concluded that although training improves spatial cognition in both sexes, the male advantage generally persists. However, because some studies run counter to this pattern, a closer examination of the anomaly is warranted. The authors investigated the acquisition of a basic skill (spatial selective attention) using a matched-pair two-wave longitudinal design. Participants were screened with the use of an attentional visual field task, with the objective of selecting and matching 10 male-female pairs, over a wide range (30% to 57% correct). Subsequently, 20 participants 17-23 years of age (selected from 43 screened) were trained for 10 hr (distributed over several sessions) by playing a first-person shooter video game. This genre is known to be highly effective in enhancing spatial skills. All 20 participants improved, with matched members of the male-female pairs achieving very similar gains, independent of starting level. This is consistent with the hypothesis that the learning trajectory of women is not inferior to that of men when acquiring a basic spatial skill. Training methods that develop basic spatial skills may be essential to achieve gender parity in both basic and complex spatial tasks.

  10. Spatial Forecast of Landslides in Three Gorges Based On Spatial Data Mining

    Directory of Open Access Journals (Sweden)

    Xianmin Wang

    2009-03-01

The Three Gorges is a region with a very high landslide distribution density and a concentrated population. Landslide disasters occur frequently in Three Gorges, and the potential risk of landslides is tremendous. In this paper, focusing on Three Gorges, which has a complicated landform, spatial forecasting of landslides is studied by establishing 20 forecast factors (spectra, texture, vegetation coverage, water level of the reservoir, slope structure, engineering rock group, elevation, slope, aspect, etc.). China-Brazil Earth Resources Satellite (Cbers) images were adopted, based on a C4.5 decision tree, to mine spatial landslide forecast criteria in Guojiaba Town (Zhigui County) in Three Gorges, and based on this knowledge, intelligent spatial landslide forecasts were performed for Guojiaba Town. All landslides lie in the dangerous and unstable regions, so the forecast result is good. The method proposed in the paper is compared with seven other methods: IsoData, K-Means, Mahalanobis Distance, Maximum Likelihood, Minimum Distance, Parallelepiped and Information Content Model. The experimental results show that the method proposed in this paper has a high forecast precision, noticeably higher than that of the other seven methods.
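The splitting criterion at the heart of C4.5-style rule mining is information gain: pick the factor threshold that best separates landslide from stable cells. A minimal sketch of that criterion for a single factor follows; the slope values and class labels are hypothetical toy data, not the paper's 20-factor data set:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a label list, in bits.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    # Scan the candidate thresholds between sorted values and return the
    # one with the highest information gain.
    pairs = sorted(zip(values, labels))
    best_gain, best_thr = 0.0, None
    base = entropy(labels)
    for k in range(1, len(pairs)):
        left = [l for _, l in pairs[:k]]
        right = [l for _, l in pairs[k:]]
        split = (len(left) * entropy(left)
                 + len(right) * entropy(right)) / len(pairs)
        if base - split > best_gain:
            best_gain = base - split
            best_thr = (pairs[k - 1][0] + pairs[k][0]) / 2
    return best_thr, best_gain

slope = [5, 8, 12, 15, 30, 35, 40, 45]        # slope angle per cell (toy)
label = ['stable'] * 4 + ['landslide'] * 4    # observed class per cell
thr, gain = best_split(slope, label)
print(f"best threshold: slope > {thr}, information gain = {gain:.2f} bits")
```

A full C4.5 tree applies this test recursively over all factors (and uses the gain ratio to correct for many-valued attributes), yielding the forecast criteria the paper mines from imagery.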

  11. Virtual Reality Learning Activities for Multimedia Students to Enhance Spatial Ability

    Directory of Open Access Journals (Sweden)

    Rafael Molina-Carmona

    2018-04-01

Virtual Reality is an emerging technology that is proving very useful for training different skills. Our hypothesis is that it is possible to design virtual reality learning activities that can help students develop their spatial ability. To test the hypothesis, we conducted an experiment consisting of training the students with a purpose-built learning activity based on a virtual reality application and assessing the possible improvement of the students' spatial ability through a widely accepted spatial visualization test. The learning activity consists of a virtual environment where some simple polyhedral shapes are shown and manipulated by moving, rotating and scaling them. The students participating in the experiment are divided into a control and an experimental group, carrying out the same learning activity with the only difference being the device used for the interaction: a traditional computer with screen, keyboard and mouse for the control group, and virtual reality goggles with a smartphone for the experimental group. To assess the experience, all the students completed a spatial visualization test twice: just before performing the activities and four weeks later, once all the activities had been performed. Specifically, we used the well-known and widely used Purdue Spatial Visualization Test-Rotation (PSVT-R), designed to test rotational visualization ability. The results of the test show that there is an improvement in the test results for both groups, but the improvement is significantly higher in the case of the experimental group. The conclusion is that the virtual reality learning activities improved the spatial ability of the experimental group.

  12. THE EFFECT OF BASIC MOTOR ABILITIES ON DRIBBLING SPEED AND PRECISION IN SOCCER GAME

    Directory of Open Access Journals (Sweden)

    Ismail Selimović

    2011-03-01

The effects of basic motor skills on situational-motor abilities (speed dribbling and ball-control precision) in soccer were analyzed with regression analysis in boys aged 12-14 years. For this purpose, 17 variables for basic motor parameters were selected, as well as three situational tests. In each regression analysis, the results confirmed the hypothesis of significant effects of the morphological characteristics on the results in the analyzed situational-motor tests.

  13. Outlook for the Next Generation’s Precision Forestry in Finland

    Directory of Open Access Journals (Sweden)

    Markus Holopainen

    2014-07-01

During the past decade, the ability to acquire spatially accurate 3D remote-sensing information by means of laser scanning, digital stereo imagery and radar imagery has been a major turning point in forest mapping and monitoring applications. These 3D data sets, based on single- or multi-temporal point clouds, enable a wide range of applications when combined with other geoinformation and logging-machine-measured data. New technologies enable precision forestry, which can be defined as a method to accurately determine characteristics of forests and treatments at stand, sub-stand or individual tree level. In precision forestry, even individual tree-level assessments can be used in simulation and optimization models of the forest management decision support system. At the moment, the forest industry in Finland is looking forward to the next generation's forest inventory techniques to improve current wood procurement practices. Our vision is that in the future, the data solution for detailed forest management and wood procurement will be to use multi-source and multi-sensor information. In this communication, we review our recent findings and describe our future vision for precision forestry research in Finland.

  14. Positive spatial curvature does not falsify the landscape

    Science.gov (United States)

    Horn, B.

    2017-12-01

We present a simple cosmological model where the quantum tunneling of a scalar field rearranges the energetics of the matter sector, sending a stable static ancestor vacuum with positive spatial curvature into an inflating solution with positive curvature. This serves as a proof of principle that an observation of positive spatial curvature does not falsify the hypothesis that our current observer patch originated from false vacuum tunneling in a string or field theoretic landscape. This poster submission is a summary of the work, and was presented at the 3rd annual ICPPA held in Moscow from October 2 to 5, 2017, by Prof. Rostislav Konoplich on behalf of the author.

  15. Experience, but not distance, influences the recruitment precision in the stingless bee Scaptotrigona mexicana

    Science.gov (United States)

    Sánchez, Daniel; Kraus, F. Bernhard; Hernández, Manuel De Jesús; Vandame, Rémy

    2007-07-01

    Recruitment precision, i.e. the proportion of recruits that reach an advertised food source, is a crucial adaptation of social bees to their environment. Studies with honeybees showed that recruitment precision is not a fixed feature, but it may be enhanced by factors like experience and distance. However, little is known regarding the recruitment precision of stingless bees. Hence, in this study, we examined the effects of experience and spatial distance on the precision of the food communication system of the stingless bee Scaptotrigona mexicana. We conducted the experiments by training bees to a three-dimensional artificial patch at several distances from the colony. We recorded the choices of individual recruited foragers, either being newcomers (foragers without experience with the advertised food source) or experienced (foragers that had previously visited the feeder). We found that the average precision of newcomers (95.6 ± 2.61%) was significantly higher than that of experienced bees (80.2 ± 1.12%). While this might seem counter-intuitive on first sight, this “loss” of precision can be explained by the tendency of experienced recruits to explore nearby areas to find new rewarding food sources after they had initially learned the exact location of the food source. Increasing the distance from the colony had no significant effect on the precision of the foraging bees. Thus, our data show that experience, but not the distance of the food source, affected the patch precision of S. mexicana foragers.

  16. Analysis of spatial count data using Kalman smoothing

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    2007-01-01

    We consider spatial count data from an agricultural field experiment. Counts of weed plants in a field have been recorded in a project on precision farming. Interest is in mapping the weed intensity so that the dose of herbicide applied at any location can be adjusted to the amount of weed present...

  17. The Development and Temporal Dynamics of Spatial Orienting in Infants.

    Science.gov (United States)

    Johnson, Mark H.; Tucker, Leslie A.

    1996-01-01

    Discusses changes occurring in two-, four-, and six-month-old infants' visual attention span, through a series of experiments examining their ability to orient to peripheral visual stimuli. The results obtained were consistent with the hypothesis that infants get faster with age in shifting attention to a spatial location. (AA)

  18. Women and Spatial Change: Learning Resources for Social Science Courses.

    Science.gov (United States)

    Rengert, Arlene C., Ed.; Monk, Janice J., Ed.

    Six units focusing on the effects of spatial change on women are designed to supplement college introductory courses in geography and the social sciences. Unit 1, Woman and Agricultural Landscapes, focuses on how women contributed to landscape change in prehistory, women's impact on the environment, and the hypothesis that women developed…

  19. Precision analysis for standard deviation measurements of immobile single fluorescent molecule images.

    Science.gov (United States)

    DeSantis, Michael C; DeCenzo, Shawn H; Li, Je-Luen; Wang, Y M

    2010-03-29

    Standard deviation measurements of intensity profiles of stationary single fluorescent molecules are useful for studying axial localization, molecular orientation, and a fluorescence imaging system's spatial resolution. Here we report on the analysis of the precision of standard deviation measurements of intensity profiles of single fluorescent molecules imaged using an EMCCD camera. We have developed an analytical expression for the standard deviation measurement error of a single image which is a function of the total number of detected photons, the background photon noise, and the camera pixel size. The theoretical results agree well with the experimental, simulation, and numerical integration results. Using this expression, we show that single-molecule standard deviation measurements offer nanometer precision for a large range of experimental parameters.
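
    The closed-form precision expression itself is given in the paper; as a rough illustration of the underlying idea only, the sketch below (a hypothetical 1-D Monte Carlo, not the authors' derivation) shows how the scatter of repeated standard-deviation measurements shrinks as the number of detected photons grows, with pixelation and background noise included.

    ```python
    import numpy as np

    def measure_std(n_photons, true_sigma=1.3, pixel_size=0.5, bg_noise=1.0, rng=None):
        """Simulate one pixelated, noisy 1-D image of a point emitter and return
        the standard deviation of its intensity profile. (Illustrative Monte
        Carlo only; the paper derives a closed-form expression for the error.)"""
        if rng is None:
            rng = np.random.default_rng(0)
        photons = rng.normal(0.0, true_sigma, n_photons)      # PSF photon hits
        edges = np.arange(-6.0, 6.0 + pixel_size, pixel_size)
        counts, _ = np.histogram(photons, edges)              # pixelation
        counts = np.clip(counts + rng.normal(0.0, bg_noise, counts.size), 0, None)
        centers = 0.5 * (edges[:-1] + edges[1:])
        w = counts / counts.sum()
        mu = np.sum(w * centers)
        return float(np.sqrt(np.sum(w * (centers - mu) ** 2)))

    # the spread of repeated std estimates (i.e. the measurement precision)
    # shrinks as the number of detected photons grows
    rng = np.random.default_rng(42)
    spread = lambda n: np.std([measure_std(n, rng=rng) for _ in range(200)])
    assert spread(10_000) < spread(200)
    ```

    The photon count, pixel size, and background level here are arbitrary toy values; only the qualitative dependence mirrors the paper's result.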

  20. Functional equivalence of spatial images from touch and vision: evidence from spatial updating in blind and sighted individuals.

    Science.gov (United States)

    Giudice, Nicholas A; Betty, Maryann R; Loomis, Jack M

    2011-05-01

    This research examined whether visual and haptic map learning yield functionally equivalent spatial images in working memory, as evidenced by similar encoding bias and updating performance. In 3 experiments, participants learned 4-point routes either by seeing or feeling the maps. At test, blindfolded participants made spatial judgments about the maps from imagined perspectives that were either aligned or misaligned with the maps as represented in working memory. Results from Experiments 1 and 2 revealed a highly similar pattern of latencies and errors between visual and haptic conditions. These findings extend the well-known alignment biases for visual map learning to haptic map learning, provide further evidence of haptic updating, and most important, show that learning from the 2 modalities yields very similar performance across all conditions. Experiment 3 found the same encoding biases and updating performance with blind individuals, demonstrating that functional equivalence cannot be due to visual recoding and is consistent with an amodal hypothesis of spatial images.

  1. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  2. Accounting for the measurement error of spectroscopically inferred soil carbon data for improved precision of spatial predictions.

    Science.gov (United States)

    Somarathna, P D S N; Minasny, Budiman; Malone, Brendan P; Stockmann, Uta; McBratney, Alex B

    2018-08-01

    Spatial modelling of environmental data commonly only considers spatial variability as the single source of uncertainty. In reality, however, the measurement errors should also be accounted for. In recent years, infrared spectroscopy has been shown to offer low cost, yet invaluable information needed for digital soil mapping at meaningful spatial scales for land management. However, spectrally inferred soil carbon data are known to be less accurate compared to laboratory analysed measurements. This study establishes a methodology to filter out the measurement error variability by incorporating the measurement error variance in the spatial covariance structure of the model. The study was carried out in the Lower Hunter Valley, New South Wales, Australia, where a combination of laboratory-measured and vis-NIR- and MIR-inferred topsoil and subsoil carbon data is available. We investigated the applicability of residual maximum likelihood (REML) and Markov Chain Monte Carlo (MCMC) simulation methods to generate parameters of the Matérn covariance function directly from the data in the presence of measurement error. The results revealed that the measurement error can be effectively filtered out through the proposed technique. When the measurement error was filtered from the data, the prediction variance almost halved, which ultimately yielded a greater certainty in spatial predictions of soil carbon. Further, the MCMC technique was successfully used to define the posterior distribution of measurement error. This is an important outcome, as the MCMC technique can be used to estimate the measurement error if it is not explicitly quantified. Although this study dealt with soil carbon data, this method is amenable for filtering the measurement error of any kind of continuous spatial environmental data. Copyright © 2018 Elsevier B.V. All rights reserved.
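
    The core of the approach, adding the known measurement-error variance to the diagonal of the spatial covariance so the error is filtered rather than mapped, can be sketched in a few lines. The following is a minimal 1-D illustration with an assumed Matérn (nu = 3/2) covariance and fixed, made-up parameters; the paper itself estimates the parameters by REML or MCMC, which is omitted here.

    ```python
    import numpy as np

    def matern32(d, sill, rho):
        """Matérn covariance with smoothness nu = 3/2."""
        a = np.sqrt(3.0) * d / rho
        return sill * (1.0 + a) * np.exp(-a)

    def krige(x_obs, y_obs, err_var, x_new, sill=1.0, rho=2.0):
        """Simple (zero-mean) kriging in 1-D. The known measurement-error
        variance of each observation is added to the diagonal of the data
        covariance, so the error is filtered out of the map instead of
        being reproduced."""
        K = matern32(np.abs(x_obs[:, None] - x_obs[None, :]), sill, rho) + np.diag(err_var)
        k = matern32(np.abs(x_new[:, None] - x_obs[None, :]), sill, rho)
        Kinv_k = np.linalg.solve(K, k.T)               # K^{-1} k for each new point
        pred = y_obs @ Kinv_k                          # kriging predictions
        var = sill - np.einsum('ij,ji->i', k, Kinv_k)  # prediction variances
        return pred, var

    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([0.1, 0.5, 0.2, -0.3])
    p0, v0 = krige(x, y, np.zeros(4), x)      # error-free data: exact interpolation
    pe, ve = krige(x, y, np.full(4, 0.5), x)  # noisy data: smoothed, larger variance
    assert np.allclose(p0, y, atol=1e-8)
    assert np.all(ve > v0)
    ```

    With zero measurement-error variance the predictor honors the data exactly; with a nonzero variance it smooths through the noise, which is the filtering behavior the abstract describes.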

  3. Demonstration of a Fast, Precise Propane Measurement Using Infrared Spectroscopy

    Science.gov (United States)

    Zahniser, M. S.; Roscioli, J. R.; Nelson, D. D.; Herndon, S. C.

    2016-12-01

    Propane is one of the primary components of emissions from natural gas extraction and processing activities. In addition to being an air pollutant, its ratio to other hydrocarbons such as methane and ethane can serve as a "fingerprint" of a particular facility or process, aiding in identifying emission sources. Quantifying propane has typically required laboratory analysis of flask samples, resulting in low temporal resolution and making plume-based measurements infeasible. Here we demonstrate fast (1-second), high-precision propane measurements using infrared spectroscopy at 2967 wavenumbers. In addition, we explore the impact of nearby water and ethane absorption lines on the accuracy and precision of the propane measurement. Finally, we discuss development of a dual-laser instrument capable of simultaneous measurements of methane, ethane, and propane (the C1-C3 compounds), all within a small spatial package that can be easily deployed aboard a mobile platform.

  4. Optimal configuration of spatial points in the reactor cell

    International Nuclear Information System (INIS)

    Bosevski, T.

    1968-01-01

    The optimal configuration of spatial points was chosen with respect to the total number needed for integrating the reactions in the reactor cell. The previously developed code VESTERN, which applies the collision probability method to calculate the neutron flux distribution, was used for numerical verification of the method on a standard reactor cell. It is shown that the total number of spatial points is half the number of spatial zones needed to determine the number of reactions in the cell to the preset precision. This result points the way toward further streamlining of the procedure for calculating the space-energy distribution of the neutron flux in a reactor cell [sr]

  5. The environmental convergence hypothesis: Carbon dioxide emissions according to the source of energy

    International Nuclear Information System (INIS)

    Herrerias, M.J.

    2013-01-01

    The aim of this paper is to investigate the environmental convergence hypothesis in carbon dioxide emissions for a large group of developed and developing countries from 1980 to 2009. The novel aspect of this work is that we distinguish among carbon dioxide emissions according to the source of energy (coal, natural gas and petroleum) instead of considering the aggregate measure of per capita carbon dioxide emissions; particular attention is paid to the regional dimension through the application of new club convergence tests. This allows us to determine the convergence behaviour of emissions more precisely and to detect it according to the source of energy used, thereby helping to address environmental targets. More specifically, the convergence hypothesis is examined with a pair-wise test, and a second test is used to check for the existence of club convergence. Our results from using the pair-wise test indicate that carbon dioxide emissions for each type of energy diverge. However, club convergence is found for a large group of countries, although some still display divergence. These findings point to the need to apply specific environmental policies to each club detected, since specific countries converge to different clubs. - Highlights: • The environmental convergence hypothesis is investigated across countries. • We perform a pair-wise test and a club convergence test. • Results from the first of these two tests suggest that carbon dioxide emissions are diverging. • However, we find that carbon dioxide emissions are converging within groups of countries. • Active environmental policies are required

  6. Acoustic grating fringe projector for high-speed and high-precision three-dimensional shape measurements

    International Nuclear Information System (INIS)

    Yin Xuebing; Zhao Huijie; Zeng Junyu; Qu Yufu

    2007-01-01

    A new acoustic grating fringe projector (AGFP) was developed for high-speed and high-precision 3D measurement. A new acoustic grating fringe projection theory is also proposed to describe the optical system. The AGFP instrument can adjust the spatial phase and period of fringes with unprecedented speed and accuracy. Using rf power proportional-integral-derivative (PID) control and CCD synchronous control, we obtain fringes with fine sinusoidal characteristics and realize high-speed acquisition of image data. Using the device, we obtained a precise phase map for a 3D profile. In addition, the AGFP can work in running fringe mode, which could be applied in other measurement fields

  7. Precision medicine in myasthenia gravis: begin from the data precision

    Science.gov (United States)

    Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng

    2016-01-01

    Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. MG data are currently far from individually precise, partially due to the rarity and heterogeneity of this disease. In this review, we provide basic insights into MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity and response to treatment, based on the literature and our previous studies. Subgroups and quantitative traits of MG are discussed in the sense of data precision. The role of disease registries and the scientific basis of precise analysis are also discussed to ensure better collection and analysis of MG data. PMID:27127759

  8. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  9. LOFAR Lightning Imaging: Mapping Lightning With Nanosecond Precision

    Science.gov (United States)

    Hare, B. M.; Scholten, O.; Bonardi, A.; Buitink, S.; Corstanje, A.; Ebert, U.; Falcke, H.; Hörandel, J. R.; Leijnse, H.; Mitra, P.; Mulrey, K.; Nelles, A.; Rachen, J. P.; Rossetto, L.; Rutjes, C.; Schellart, P.; Thoudam, S.; Trinh, T. N. G.; ter Veen, S.; Winchen, T.

    2018-03-01

    Lightning mapping technology has proven instrumental in understanding lightning. In this work we present a pipeline that can use lightning observed by the LOw-Frequency ARray (LOFAR) radio telescope to construct a 3-D map of the flash. We show that LOFAR has unparalleled precision, on the order of meters, even for lightning flashes that are over 20 km outside the area enclosed by LOFAR antennas (~3,200 km²), and can potentially locate over 10,000 sources per lightning flash. We also show that LOFAR is the first lightning mapping system that is sensitive to the spatial structure of the electrical current during individual lightning leader steps.
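
    The mapping principle, solving for a source position and emission time from precise arrival times at many antennas, can be illustrated with a generic Gauss-Newton time-of-arrival fit. The sketch below is an assumption-laden toy (synthetic antenna layout, noiseless arrival times), not the LOFAR imaging pipeline itself.

    ```python
    import numpy as np

    C = 0.299792458   # speed of light in meters per nanosecond

    def locate(antennas, t_arrival, x0, n_iter=50):
        """Gauss-Newton fit of source position and emission time from pulse
        arrival times at known antenna positions (a generic TOA solver offered
        only as an illustration of the mapping principle)."""
        p = np.array([*x0, 0.0])            # unknowns: x, y, z, t_emit
        for _ in range(n_iter):
            d = np.linalg.norm(antennas - p[:3], axis=1)
            residual = t_arrival - (p[3] + d / C)
            J = np.hstack([(p[:3] - antennas) / (C * d[:, None]),  # d(pred)/d(xyz)
                           np.ones((len(antennas), 1))])           # d(pred)/d(t)
            step, *_ = np.linalg.lstsq(J, residual, rcond=None)
            p += step
        return p[:3], p[3]

    # synthetic check: recover a known source from noiseless arrival times
    rng = np.random.default_rng(1)
    ants = rng.uniform(-1500, 1500, (8, 3))
    ants[:, 2] = rng.uniform(0.0, 50.0, 8)          # near-ground antenna array
    src, t0 = np.array([4000.0, -2500.0, 5000.0]), 12.0
    times = t0 + np.linalg.norm(ants - src, axis=1) / C
    est, est_t = locate(ants, times, x0=(3000.0, -2000.0, 4000.0))
    assert np.allclose(est, src, atol=1.0)
    ```

    With noiseless data the fit converges to the true source; in practice, nanosecond-level timing errors translate into the meter-scale position precision the abstract quotes.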

  10. Positional information generated by spatially distributed signaling cascades.

    Directory of Open Access Journals (Sweden)

    Javier Muñoz-García

    2009-03-01

    Full Text Available The temporal and stationary behavior of protein modification cascades has been extensively studied, yet little is known about the spatial aspects of signal propagation. We have previously shown that the spatial separation of opposing enzymes, such as a kinase and a phosphatase, creates signaling activity gradients. Here we show under what conditions signals stall in space or propagate robustly through spatially distributed signaling cascades. Robust signal propagation results in activity gradients with long plateaus, which abruptly decay at successive spatial locations. We derive an approximate analytical solution that relates the maximal amplitude and propagation length of each activation profile with the cascade level, protein diffusivity, and the ratio of the opposing enzyme activities. The control of the spatial signal propagation appears to be very different from the control of transient temporal responses for spatially homogenous cascades. For spatially distributed cascades where activating and deactivating enzymes operate far from saturation, the ratio of the opposing enzyme activities is shown to be a key parameter controlling signal propagation. The signaling gradients characteristic for robust signal propagation exemplify a pattern formation mechanism that generates precise spatial guidance for multiple cellular processes and conveys information about the cell size to the nucleus.
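
    The simplest case of such an activity gradient, activation localized at the membrane with uniform deactivation in the bulk, has the closed-form steady state a(x) = a0·exp(−x/λ) with λ = √(D/k_p), so the ratio of diffusivity to deactivating-enzyme activity sets the propagation length. A minimal finite-difference check of this single-level case (all parameter values chosen purely for illustration):

    ```python
    import numpy as np

    # Steady-state profile of an active protein: activation at the membrane
    # (x = 0), uniform deactivation (phosphatase) in the bulk, diffusion in
    # between. Analytically a(x) = a0*exp(-x/lam), lam = sqrt(D/k_p).
    D, k_p, a0 = 1.0, 4.0, 1.0
    n, L = 400, 5.0
    x = np.linspace(0.0, L, n)
    h = x[1] - x[0]

    # finite-difference system: a(0) = a0 (Dirichlet), a'(L) = 0 (Neumann)
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0], b[0] = 1.0, a0
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = D / h**2, -2 * D / h**2 - k_p, D / h**2
    A[n - 1, n - 2], A[n - 1, n - 1] = D / h**2, -D / h**2 - k_p
    a = np.linalg.solve(A, b)

    lam = np.sqrt(D / k_p)   # decay length set by diffusivity / phosphatase rate
    assert np.allclose(a, a0 * np.exp(-x / lam), atol=0.02)
    ```

    The full paper treats multi-level cascades, where successive levels produce the plateau-and-decay profiles described above; this sketch only verifies the single-level decay length.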

  11. Validity of Linder Hypothesis in Bric Countries

    Directory of Open Access Journals (Sweden)

    Rana Atabay

    2016-03-01

    Full Text Available In this study, the theory of similarity in preferences (the Linder hypothesis) is introduced, and trade among the BRIC countries is examined to determine whether it is consistent with this hypothesis. Using data for the period 1996–2010, the study applies panel data analysis to provide evidence regarding the empirical validity of the Linder hypothesis for the BRIC countries' international trade. Empirical findings show that trade between the BRIC countries supports the Linder hypothesis.

  12. Does foreign direct investment affect environmental pollution in China's cities? A spatial econometric perspective.

    Science.gov (United States)

    Liu, Qianqian; Wang, Shaojian; Zhang, Wenzhong; Zhan, Dongsheng; Li, Jiaming

    2018-02-01

    Environmental pollution has aroused extensive concern worldwide. Existing literature on the relationship between foreign direct investment (FDI) and environmental pollution has, however, seldom taken into account spatial effects. Addressing this gap, this paper investigated the spatial agglomeration effects and dynamics at work in FDI and environmental pollution (namely, in waste soot and dust, sulfur dioxide, and wastewater) in 285 Chinese cities during the period 2003-2014, using global and local measures of spatial autocorrelation. Our results showed significant spatial autocorrelation in FDI and environmental pollution levels, both of which demonstrated obvious path dependence characteristics in their geographical distribution. A range of agglomeration regions were observed. The high-value and low-value agglomeration areas of FDI were not fully consistent with those of environmental pollution. This result indicates that higher inflows of FDI did not necessarily lead to greater environmental pollution from a geographic perspective, and vice versa. Spatial panel data models were further adopted to explore the impact of FDI on environmental pollution. The results of a spatial lag model (SLM) and a spatial error model (SEM) revealed that the inflow of FDI had distinct effects on different environmental pollutants, thereby confirming the Pollution Haven Hypothesis and Pollution Halo Hypothesis. The inflow of FDI was found to have reduced waste soot and dust pollution to a certain extent, while it increased the degree of wastewater and sulfur dioxide pollution. The findings set out in this paper hold significant implications for Chinese environmental pollution protection. Copyright © 2017 Elsevier B.V. All rights reserved.
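
    The global spatial autocorrelation measure referred to above is typically Moran's I. As a hedged illustration (binary distance-band weights on a toy grid, not the authors' city data), a spatially clustered pattern yields a strongly positive I while spatially random values do not:

    ```python
    import numpy as np

    def morans_i(values, coords, band=1.5):
        """Global Moran's I with a binary distance-band weights matrix.
        Values near +1 indicate that neighboring locations carry similar
        values (positive spatial autocorrelation)."""
        z = values - values.mean()
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        W = ((d > 0) & (d <= band)).astype(float)   # neighbors within the band
        return (len(values) / W.sum()) * (z @ W @ z) / (z @ z)

    # toy 6x6 grid: high values on one half -> strong positive autocorrelation
    xy = np.array([(i, j) for i in range(6) for j in range(6)], float)
    clustered = (xy[:, 0] < 3).astype(float)
    I_clustered = morans_i(clustered, xy)
    rng = np.random.default_rng(0)
    I_random = morans_i(rng.normal(size=36), xy)
    assert I_clustered > 0.5 and I_random < I_clustered
    ```

    The band threshold and grid layout here are arbitrary; real applications define the weights matrix from contiguity or distance between cities.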

  13. Visual attention and the apprehension of spatial relations: the case of depth.

    Science.gov (United States)

    Moore, C M; Elsinger, C L; Lleras, A

    2001-05-01

    Several studies have shown that targets defined on the basis of the spatial relations between objects yield highly inefficient visual search performance (e.g., Logan, 1994; Palmer, 1994), suggesting that the apprehension of spatial relations may require the selective allocation of attention within the scene. In the present study, we tested the hypothesis that depth relations might be different in this regard and might support efficient visual search. This hypothesis was based, in part, on the fact that many perceptual organization processes that are believed to occur early and in parallel, such as figure-ground segregation and perceptual completion, seem to depend on the assignment of depth relations. Despite this, however, using increasingly salient cues to depth (Experiments 2-4) and including a separate test of the sufficiency of the most salient depth cue used (Experiment 5), no evidence was found to indicate that search for a target defined by depth relations is any different than search for a target defined by other types of spatial relations, with regard to efficiency of search. These findings are discussed within the context of the larger literature on early processing of three-dimensional characteristics of visual scenes.

  14. Generation of spectral–temporal response surfaces by combining multispectral satellite and hyperspectral UAV imagery for precision agriculture applications

    NARCIS (Netherlands)

    Gevaert, C.; Suomalainen, J.M.; Tang, J.; Kooistra, L.

    2015-01-01

    Precision agriculture requires detailed crop status information at high spatial and temporal resolutions. Remote sensing can provide such information, but single sensor observations are often incapable of meeting all data requirements. Spectral–temporal response surfaces (STRSs) provide continuous...

  15. A High-precision Motion Compensation Method for SAR Based on Image Intensity Optimization

    Directory of Open Access Journals (Sweden)

    Hu Ke-bin

    2015-02-01

    Full Text Available Owing to the platform instability and precision limitations of motion sensors, motion errors negatively affect the quality of synthetic aperture radar (SAR) images. The autofocus Back Projection (BP) algorithm based on the optimization of image sharpness compensates for motion errors through phase error estimation. This method can attain relatively good performance, while assuming the same phase error for all pixels, i.e., it ignores the spatial variance of motion errors. To overcome this drawback, a high-precision motion error compensation method is presented in this study. In the proposed method, the Antenna Phase Centers (APCs) are estimated via optimization using the criterion of maximum image intensity. Then, the estimated APCs are applied for BP imaging. Because the APC estimation equals the range history estimation for each pixel, high-precision phase compensation for every pixel can be achieved. Point-target simulations and processing of experimental data validate the effectiveness of the proposed method.
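
    The estimation principle, choosing the phase correction that maximizes image intensity, can be shown on a drastically simplified 1-D example: a point target defocused by a quadratic phase error across the aperture, refocused by a one-parameter grid search. This is an illustrative stand-in, not the paper's spatially variant APC estimation.

    ```python
    import numpy as np

    # 1-D sketch of intensity-driven autofocus: a point target's aperture
    # signal is corrupted by an unknown quadratic phase error; we estimate it
    # by maximizing the peak intensity of the focused (FFT) image.
    n = 128
    k = np.arange(n) - n / 2
    true_err = 0.002
    data = np.exp(1j * true_err * k**2)   # ideal point target + phase error

    def peak_intensity(alpha):
        img = np.fft.fft(data * np.exp(-1j * alpha * k**2))
        return np.max(np.abs(img) ** 2)

    # grid search over the phase-error coefficient
    alphas = np.linspace(0.0, 0.004, 401)
    best = alphas[np.argmax([peak_intensity(a) for a in alphas])]
    assert abs(best - true_err) < 1e-4
    ```

    Maximum peak intensity coincides with the true phase-error coefficient, because a perfectly compensated point target concentrates all its energy into a single image bin; the paper applies the same criterion per pixel within BP imaging.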

  16. A Self-Organizing Spatial Clustering Approach to Support Large-Scale Network RTK Systems

    Directory of Open Access Journals (Sweden)

    Lili Shen

    2018-06-01

    Full Text Available The network real-time kinematic (RTK) technique can provide centimeter-level real-time positioning solutions and plays a key role in geo-spatial infrastructure. With ever-increasing popularity, network RTK systems will face issues in supporting large numbers of concurrent users. In the past, high-precision positioning services were oriented towards professionals and only supported a few concurrent users. Currently, precise positioning provides a spatial foundation for artificial intelligence (AI), and countless smart devices (autonomous cars, unmanned aerial vehicles (UAVs), robotic equipment, etc.) require precise positioning services. Therefore, the development of approaches to support large-scale network RTK systems is urgent. In this study, we propose a self-organizing spatial clustering (SOSC) approach which automatically clusters online users to reduce the computational load on the network RTK system server side. The experimental results indicate that both the SOSC algorithm and the grid algorithm can reduce the computational load efficiently, while the SOSC algorithm gives a more elastic and adaptive clustering solution with different datasets. The SOSC algorithm determines the cluster number and the mean distance to cluster center (MDTCC) according to the data set, while the grid approaches are all predefined. The side-effects of the clustering algorithms on the user side are analyzed with real global navigation satellite system (GNSS) data sets. The experimental results indicate that 10 km can be safely used as the cluster radius threshold for the SOSC algorithm without significantly reducing the positioning precision and reliability on the user side.
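
    The paper's SOSC algorithm is more elaborate, but the basic idea of assigning online users to clusters under a radius threshold can be sketched with a greedy, leader-style clustering. Planar coordinates and the specific threshold handling below are illustrative assumptions, not the published algorithm.

    ```python
    import math

    def cluster_users(users, radius_km=10.0):
        """Greedy leader clustering: each user joins the first cluster whose
        center lies within radius_km, otherwise founds a new cluster.
        (Illustrative stand-in for the paper's SOSC algorithm.)"""
        centers, members = [], []
        for x, y in users:
            for k, (cx, cy) in enumerate(centers):
                if math.hypot(x - cx, y - cy) <= radius_km:
                    members[k].append((x, y))
                    pts = members[k]
                    # update the center to the cluster mean
                    centers[k] = (sum(p[0] for p in pts) / len(pts),
                                  sum(p[1] for p in pts) / len(pts))
                    break
            else:
                centers.append((x, y))
                members.append([(x, y)])
        return centers, members

    # two well-separated groups of users -> two clusters
    users = [(0, 0), (1, 2), (2, 1), (50, 50), (51, 49)]
    centers, members = cluster_users(users)
    assert len(centers) == 2
    ```

    A server would then compute one network-RTK correction per cluster center rather than per user, which is how clustering reduces the computational load.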

  17. Transitional geomagnetic impulse hypothesis: Geomagnetic fact or rock-magnetic artifact?

    Science.gov (United States)

    Camps, Pierre; Coe, Robert S.; PréVot, Michel

    1999-08-01

    A striking feature of the Steens Mountain (Oregon) geomagnetic polarity reversal is the two (maybe three) extremely rapid field directional changes (6 degrees per day) proposed to account for unusual behavior in the direction of remanent magnetization within single lava flows. Each of these very fast field changes, or impulses, is associated with a large directional gap (some 90°) in the record. In order to check the spatial reproducibility of the paleomagnetic signal over distances up to several kilometers, we have carried out a paleomagnetic investigation of two new sections (B and F) in the Steens summit region which cover the second and the third directional gap. The main result is the description of two new directions, which lie between the pre- and post-second-impulse directions. These findings weigh against the hypothesis that the geomagnetic field caused the unusual intraflow fluctuations, which now appears more ad hoc as an explanation of the paleomagnetic data. However, the alternative baking hypothesis also remains ad hoc, since we have to assume variable rock-magnetic properties that we have not yet been able to detect within the flows at the original sections Steens A and D, 1.5 km to the north. In addition, new results for 22 transitional and normal lava flows in section B are presented that correlate well with earlier results from section A.

  18. Precision machining commercialization

    International Nuclear Information System (INIS)

    1978-01-01

    To accelerate precision machining development so as to realize more of the potential savings within the next few years of known Department of Defense (DOD) part procurement, the Air Force Materials Laboratory (AFML) is sponsoring the Precision Machining Commercialization Project (PMC). PMC is part of the Tri-Service Precision Machine Tool Program of the DOD Manufacturing Technology Five-Year Plan. The technical resources supporting PMC are provided under sponsorship of the Department of Energy (DOE). The goal of PMC is to minimize precision machining development time and cost risk for interested vendors. PMC will do this by making available the high precision machining technology as developed in two DOE contractor facilities, the Lawrence Livermore Laboratory of the University of California and the Union Carbide Corporation, Nuclear Division, Y-12 Plant, at Oak Ridge, Tennessee

  19. Technology in precision viticulture: a state of the art review

    Directory of Open Access Journals (Sweden)

    Matese A

    2015-05-01

    Full Text Available Alessandro Matese,1 Salvatore Filippo Di Gennaro1,2 1Institute of Biometeorology, National Research Council (IBIMET-CNR), Florence, Italy; 2Department of Agricultural, Food and Environmental Sciences, University of Perugia, Perugia, Italy Abstract: Precision viticulture aims to maximize the oenological potential of vineyards. This is especially true in regions where the high quality standards of wine production justify the adoption of site-specific management practices to simultaneously increase both quality and yield. The introduction of new technologies for supporting vineyard management allows the efficiency and quality of production to be improved and, at the same time, reduces the environmental impact. The rapid evolution of information communication technologies and geographical science offers enormous potential for the development of optimized solutions for distributed information for precision viticulture. Recent technological developments have allowed useful tools to be elaborated that help in the monitoring and control of many aspects of vine growth. Precision viticulture thus seeks to exploit the widest range of available observations to describe the vineyard spatial variability with high resolution, and provide recommendations to improve management efficiency in terms of quality, production, and sustainability. This review presents a brief outline of the state of the art of technologies in precision viticulture. It is divided in two sections, the first focusing on monitoring technologies such as geolocating and remote and proximal sensing; the second focuses on variable-rate technologies and the new agricultural robots. Keywords: remote sensing, proximal sensing, variable-rate technology, robot 

  20. Opportunities for Maturing Precision Metrology with Ultracold Gas Studies Aboard the ISS

    Science.gov (United States)

    Williams, Jason; D'Incao, Jose

    2017-04-01

    Precision atom interferometers (AI) in space are expected to become an enabling technology for future fundamental physics research, with proposals including unprecedented tests of the validity of the weak equivalence principle, measurements of the fine structure and gravitational constants, and detection of gravity waves and dark matter/dark energy. We will discuss our preparation at JPL to use NASA's Cold Atom Lab facility (CAL) to mature the technology of precision, space-based, AIs. The focus of our flight project is three-fold: a) study the controlled dynamics of heteronuclear Feshbach molecules, at temperatures of nano-Kelvins or below, as a means to overcome uncontrolled density-profile-dependent shifts in differential AIs, b) demonstrate unprecedented atom-photon coherence times with spatially constrained AIs, c) use the imaging capabilities of CAL to detect and analyze spatial fringe patterns written onto the clouds after AI and thereby measure the rotational noise of the ISS. The impact from this work, and potential for follow-on studies, will also be reviewed in the context of future space-based fundamental physics missions. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  1. High precision ray tracing in cylindrically symmetric electrostatics

    Energy Technology Data Exchange (ETDEWEB)

    Edwards Jr, David, E-mail: dej122842@gmail.com

    2015-11-15

    Highlights: • High precision ray tracing is formulated using power series techniques. • Ray tracing is possible for fields generated by solution of Laplace's equation. • Spatial and temporal orders of 4–10 are included. • Precisions in test geometries of a hemispherical deflector analyzer of ∼10{sup −20} have been obtained. • This solution offers a considerable extension to the ray tracing accuracy over the current state of the art. - Abstract: With the recent availability of a high order FDM solution to the curved boundary value problem, it is now possible to determine potentials in such geometries with considerably greater accuracy than had been available with the FDM method. In order for the algorithms used in the accurate potential calculations to be useful in ray tracing, an integration of those algorithms needs to be placed into the ray trace process itself. The object of this paper is to incorporate these algorithms into a solution of the equations of motion of the ray and, having done this, to demonstrate its efficacy. The algorithm incorporation has been accomplished by using power series techniques and the solution constructed has been tested by tracing the medial ray through concentric sphere geometries. The testing has indicated that precisions of ray calculations of 10{sup −20} are now possible. This solution offers a considerable extension to the ray tracing accuracy over the current state of the art.

  2. Using an Automatic Resistivity Profiler Soil Sensor On-The-Go in Precision Viticulture

    Directory of Open Access Journals (Sweden)

    Mariana Amato

    2013-01-01

    Full Text Available Spatial information on vineyard soil properties can be useful in precision viticulture. In this paper a combination of high-resolution spatial information on soil electrical resistivity (ER) and ancillary topographic attributes, such as elevation and slope, was integrated to assess the spatial variability patterns of vegetative growth and yield of a commercial vineyard (Vitis vinifera L. cv. Tempranillo) located in the wine-producing region of La Rioja, Spain. High resolution continuous geoelectrical mapping was accomplished by an Automatic Resistivity Profiler (ARP) on-the-go sensor with an on-board GPS system; rolling electrodes enabled ER to be measured for depths of investigation of approximately 0.5, 1 and 2 m. Regression analysis and a cluster analysis algorithm were used to jointly process soil resistivity data, landscape attributes and grapevine variables. ER showed a structured variability that matched well with trunk circumference spatial pattern and yield. Based on resistivity and a simple terrain attribute, uniform management units were delineated. Once a spatial relationship to target variables is found, the integration of point measurement with continuous soil resistivity mapping is a useful technique to identify within-plot areas of vineyard with similar status.

  3. Moving the Weber Fraction: The Perceptual Precision for Moment of Inertia Increases with Exploration Force

    Science.gov (United States)

    Debats, Nienke B.; Kingma, Idsart; Beek, Peter J.; Smeets, Jeroen B. J.

    2012-01-01

    How does the magnitude of the exploration force influence the precision of haptic perceptual estimates? To address this question, we examined the perceptual precision for moment of inertia (i.e., an object's “angular mass”) under different force conditions, using the Weber fraction to quantify perceptual precision. Participants rotated a rod around a fixed axis and judged its moment of inertia in a two-alternative forced-choice task. We instructed different levels of exploration force, thereby manipulating the magnitude of both the exploration force and the angular acceleration. These are the two signals needed by the nervous system to estimate moment of inertia. Importantly, one can assume that the absolute noise on both signals increases with an increase in the signals' magnitudes, while the relative noise (i.e., noise/signal) decreases with an increase in signal magnitude. We examined how the perceptual precision for moment of inertia was affected by this neural noise. In a first experiment we found that a low exploration force caused a higher Weber fraction (22%) than a high exploration force (13%), which suggested that the perceptual precision was constrained by the relative noise. This hypothesis was supported by the results of a second experiment, in which we found that the relationship between exploration force and Weber fraction had a shape similar to that of the theoretical relationship between signal magnitude and relative noise. The present study thus demonstrated that the amount of force used to explore an object can profoundly influence the precision with which its properties are perceived. PMID:23028437
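
    The Weber fraction itself is just the relative difference at a criterion level of performance; a toy read-off from hypothetical 2AFC data (using the 75%-correct convention, not the authors' actual fitting procedure):

```python
import numpy as np

# Hypothetical 2AFC data: relative difference in moment of inertia between
# comparison and reference stimuli, and the proportion of correct responses.
rel_diff = np.array([0.05, 0.10, 0.15, 0.20, 0.30])
p_correct = np.array([0.55, 0.62, 0.71, 0.80, 0.93])

# The Weber fraction is conventionally the relative difference yielding
# 75% correct; here it is read off by linear interpolation between the
# two bracketing data points (a psychometric-function fit would be used
# in practice).
weber = float(np.interp(0.75, p_correct, rel_diff))
```

    With these made-up numbers the interpolated Weber fraction is about 0.17, i.e. a 17% just-noticeable difference.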

  4. Figure/ground segregation from temporal delay is best at high spatial frequencies.

    Science.gov (United States)

    Kojima, H

    1998-12-01

    Two experiments investigated the role of spatial frequency in performance of a figure/ground segregation task based on temporal cues. Figure orientation was much easier to judge when figure and ground portions of the target were defined exclusively by random texture composed entirely of high spatial frequencies. When target components were defined by low spatial frequencies only, the task was nearly impossible except with long temporal delay between figure and ground. These results are inconsistent with the hypothesis that M-cell activity is primarily responsible for figure/ground segregation from temporal delay. Instead, these results point to a distinction between temporal integration and temporal differentiation. Additionally, the present results can be related to recent work on the binding of spatial features over time.

  5. Spatial part-set cuing facilitation.

    Science.gov (United States)

    Kelley, Matthew R; Parasiuk, Yuri; Salgado-Benz, Jennifer; Crocco, Megan

    2016-07-01

    Cole, Reysen, and Kelley [2013. Part-set cuing facilitation for spatial information. Journal of Experimental Psychology: Learning, Memory, & Cognition, 39, 1615-1620] reported robust part-set cuing facilitation for spatial information using snap circuits (a colour-coded electronics kit designed for children to create rudimentary circuit boards). In contrast, Drinkwater, Dagnall, and Parker [2006. Effects of part-set cuing on experienced and novice chess players' reconstruction of a typical chess midgame position. Perceptual and Motor Skills, 102(3), 645-653] and Watkins, Schwartz, and Lane [1984. Does part-set cuing test for memory organization? Evidence from reconstructions of chess positions. Canadian Journal of Psychology/Revue Canadienne de Psychologie, 38(3), 498-503] showed no influence of part-set cuing for spatial information when using chess boards. One key difference between the two procedures was that the snap circuit stimuli were explicitly connected to one another, whereas chess pieces were not. Two experiments examined the effects of connection type (connected vs. unconnected) and cue type (cued vs. uncued) on memory for spatial information. Using chess boards (Experiment 1) and snap circuits (Experiment 2), part-set cuing facilitation only occurred when the stimuli were explicitly connected; there was no influence of cuing with unconnected stimuli. These results are potentially consistent with the retrieval strategy disruption hypothesis, as well as the two- and three-mechanism accounts of part-set cuing.

  6. Einstein's Revolutionary Light-Quantum Hypothesis

    Science.gov (United States)

    Stuewer, Roger H.

    2005-05-01

    The paper in which Albert Einstein proposed his light-quantum hypothesis was the only one of his great papers of 1905 that he himself termed "revolutionary." Contrary to widespread belief, Einstein did not propose his light-quantum hypothesis "to explain the photoelectric effect." Instead, he based his argument for light quanta on the statistical interpretation of the second law of thermodynamics, with the photoelectric effect being only one of three phenomena that he offered as possible experimental support for it. I will discuss Einstein's light-quantum hypothesis of 1905 and his introduction of the wave-particle duality in 1909 and then turn to the reception of his work on light quanta by his contemporaries. We will examine the reasons that prominent physicists advanced to reject Einstein's light-quantum hypothesis in succeeding years. Those physicists included Robert A. Millikan, even though he provided convincing experimental proof of the validity of Einstein's equation of the photoelectric effect in 1915. The turning point came after Arthur Holly Compton discovered the Compton effect in late 1922, but even then Compton's discovery was contested both on experimental and on theoretical grounds. Niels Bohr, in particular, had never accepted the reality of light quanta and now, in 1924, proposed a theory, the Bohr-Kramers-Slater theory, which assumed that energy and momentum were conserved only statistically in microscopic interactions. Only after that theory was disproved experimentally in 1925 was Einstein's revolutionary light-quantum hypothesis generally accepted by physicists, a full two decades after Einstein had proposed it.

  7. Local high precision 3D measurement based on line laser measuring instrument

    Science.gov (United States)

    Zhang, Renwei; Liu, Wei; Lu, Yongkang; Zhang, Yang; Ma, Jianwei; Jia, Zhenyuan

    2018-03-01

    To realize precision machining and assembly of parts, the geometrical dimensions of local assembly surfaces need to be strictly guaranteed. In this paper, a local high-precision three-dimensional measurement method based on a line laser measuring instrument is proposed to achieve highly accurate three-dimensional reconstruction of the surface. To address the problem that a two-dimensional line laser measuring instrument lacks high-precision information in the third dimension, a local three-dimensional profile measuring system based on an accurate single-axis controller is proposed. First, a three-dimensional data compensation method based on a spatial multi-angle line laser measuring instrument is proposed to achieve high-precision measurement along the missing axis. Through pretreatment of the 3D point cloud information, the measurement points can be restored accurately. Finally, a target spherical surface is scanned locally in three dimensions for accuracy verification. The experimental results show that this scheme can obtain local three-dimensional information of the target quickly and accurately, compensates the error of the laser scanner information, and improves the local measurement accuracy.

  8. Spatial and environmental connectivity analysis in a cholera vaccine trial.

    Science.gov (United States)

    Emch, Michael; Ali, Mohammad; Root, Elisabeth D; Yunus, Mohammad

    2009-02-01

    This paper develops theory and methods for vaccine trials that utilize spatial and environmental information. Satellite imagery is used to identify whether households are connected to one another via water bodies in a study area in rural Bangladesh. Then, relationships between neighborhood-level cholera vaccine coverage and placebo incidence and neighborhood-level spatial variables are measured. The study hypothesis is that unvaccinated people who are environmentally connected to people who have been vaccinated will be at lower risk compared to unvaccinated people who are environmentally connected to people who have not been vaccinated. We use four datasets: a cholera vaccine trial database, a longitudinal demographic database of the rural population from which the vaccine trial participants were selected, a household-level geographic information system (GIS) database of the same study area, and high-resolution Quickbird satellite imagery. An environmental connectivity metric was constructed by integrating the satellite imagery with the vaccine and demographic databases linked with GIS. The results show that there is a relationship between neighborhood rates of cholera vaccination and placebo incidence. Thus, people are indirectly protected when more people in their environmentally connected neighborhood are vaccinated. This result is similar to our previous work that used a simpler Euclidean distance neighborhood to measure neighborhood vaccine coverage [Ali, M., Emch, M., von Seidlein, L., Yunus, M., Sack, D. A., Holmgren, J., et al. (2005). Herd immunity conferred by killed oral cholera vaccines in Bangladesh. Lancet, 366(9479), 44-49]. Our new method of measuring environmental connectivity is more precise since it takes into account the transmission mode of cholera, and therefore this study validates our assertion that the oral cholera vaccine provides indirect protection in addition to direct protection.

  9. Why precision?

    Energy Technology Data Exchange (ETDEWEB)

    Bluemlein, Johannes

    2012-05-15

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  10. Why precision?

    International Nuclear Information System (INIS)

    Bluemlein, Johannes

    2012-05-01

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  11. Using experimental design and spatial analyses to improve the precision of NDVI estimates in upland cotton field trials

    Science.gov (United States)

    Controlling for spatial variability is important in high-throughput phenotyping studies that enable large numbers of genotypes to be evaluated across time and space. In the current study, we compared the efficacy of different experimental designs and spatial models in the analysis of canopy spectral...

  12. Behavioral Variability and Somatic Mosaicism: A Cytogenomic Hypothesis.

    Science.gov (United States)

    Vorsanova, Svetlana G; Zelenova, Maria A; Yurov, Yuri B; Iourov, Ivan Y

    2018-04-01

    Behavioral sciences are inseparably related to genetics. A variety of neurobehavioral phenotypes are suggested to result from genomic variations. However, the contribution of genetic factors to common behavioral disorders (i.e. autism, schizophrenia, intellectual disability) remains to be understood when an attempt is made to link behavioral variability to a specific genomic change. Probably the least appreciated genetic mechanism of debilitating neurobehavioral disorders is somatic mosaicism, or the occurrence of genetically diverse (neuronal) cells in an individual's brain. Somatic mosaicism is assumed to affect the brain directly, being associated with specific behavioral patterns. As shown in studies of chromosome abnormalities (syndromes), genetic mosaicism can dynamically change the phenotype owing to varying proportions of abnormal cells. Here, we hypothesize that brain-specific postzygotic changes in mosaicism levels are able to modulate the variability of behavioral phenotypes. More precisely, behavioral phenotype variability in individuals exhibiting somatic mosaicism might correlate with changes in the amount of genetically abnormal cells throughout the lifespan. If proven, the hypothesis can be used as a basis for therapeutic interventions that regulate levels of somatic mosaicism to increase functioning and improve the overall condition of individuals with behavioral problems.

  13. Spatial variability in intertidal macroalgal assemblages on the North Portuguese coast: consistence between species and functional group approaches

    Science.gov (United States)

    Veiga, P.; Rubal, M.; Vieira, R.; Arenas, F.; Sousa-Pinto, I.

    2013-03-01

    Natural assemblages are variable in space and time; therefore, quantification of their variability is imperative to identify relevant scales for investigating the natural or anthropogenic processes shaping these assemblages. We studied the variability of intertidal macroalgal assemblages on the North Portuguese coast, considering three spatial scales (from metres to tens of kilometres) following a hierarchical design. We tested the hypotheses that (1) the spatial pattern will be invariant at all the studied scales and (2) the spatial variability of macroalgal assemblages obtained using species will be consistent with that obtained using functional groups. This was done considering, as univariate variables, total biomass and number of taxa as well as the biomass of the most important species and functional groups, and, as multivariate variables, the structure of macroalgal assemblages, considering both species and functional groups. Most of the univariate results confirmed the first hypothesis, except for the total number of taxa and foliose macroalgae, which showed significant variability at the scale of site and area, respectively. In contrast, when multivariate patterns were examined, the first hypothesis was rejected except at the scale of tens of kilometres. Both uni- and multivariate results indicated that variation was largest at the smallest scale, and thus small-scale processes seem to have more effect on spatial variability patterns. Macroalgal assemblages showed consistent spatial patterns whether species or functional groups were considered, and therefore the second hypothesis was confirmed. Consequently, functional groups may be considered a reliable biological surrogate for studying changes in macroalgal assemblages, at least along the investigated Portuguese coastline.

  14. TLD array for precise dose measurements in stereotactic radiation techniques

    International Nuclear Information System (INIS)

    Ertl, A.; Kitz, K.; Griffitt, W.; Hartl, R.F.E.; Zehetmayer, M.

    1996-01-01

    We developed a new TLD array for precise dose measurement and verification of the spatial dose distribution in small radiation targets. It consists of a hemicylindrical, tissue-equivalent rod made of polystyrene with 17 parallel moulds for exact positioning of each TLD. The spatial resolution of the TLD array was evaluated using the Leksell spherical phantom. Dose planning was performed with KULA 4.4 under stereotactic conditions on axial CT images. In the Leksell gamma unit the TLD array was irradiated with a maximal dose of 10 Gy with an unplugged 14 mm collimator. The doses delivered to the TLDs were rechecked by diode detector and film dosimetry and compared to the computer-generated dose profile. We found excellent agreement with our measured values, even at the critical penumbra decline. For the 14 mm and 18 mm collimators and for the 11 mm collimator combination, we compared the measured and calculated data at full width at half maximum. This TLD array may be useful for phantom or tissue model studies on the spatial dose distribution in confined radiation targets as used in stereotactic radiotherapy. (author)

  15. Precision mechatronics based on high-precision measuring and positioning systems and machines

    Science.gov (United States)

    Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert

    2007-06-01

    Precision mechatronics is defined in the paper as the science and engineering of a new generation of high-precision systems and machines. Nanomeasuring and nanopositioning engineering represent important fields of precision mechatronics, and nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with the smallest possible uncertainties is discussed. The integration of several optical and tactile nanoprobes makes the 3D-nanopositioning machine suitable for various tasks, such as long-range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free-form surface measurement as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.

  16. High-precision spatial localization of mouse vocalizations during social interaction.

    Science.gov (United States)

    Heckman, Jesse J; Proville, Rémi; Heckman, Gert J; Azarfar, Alireza; Celikel, Tansu; Englitz, Bernhard

    2017-06-07

    Mice display a wide repertoire of vocalizations that varies with age, sex, and context. Especially during courtship, mice emit ultrasonic vocalizations (USVs) of high complexity, whose detailed structure is poorly understood. As animals of both sexes vocalize, the study of social vocalizations requires attributing single USVs to individuals. The state of the art in sound localization for USVs allows spatial localization at centimeter resolution; however, animals interact at closer ranges, involving tactile, snout-to-snout exploration. Hence, improved algorithms are required to reliably assign USVs. We develop multiple solutions to USV localization and derive an analytical solution for arbitrary vertical microphone positions. The algorithms are compared on wideband acoustic noise and single-mouse vocalizations, and applied to social interactions with optically tracked mouse positions. A novel (frequency) envelope-weighted generalized cross-correlation outperforms classical cross-correlation techniques, achieving a median error of ~1.4 mm for noise and ~4-8.5 mm for vocalizations. Using this algorithm in combination with a level criterion, we can improve the assignment of USVs to interacting mice. We report significant differences in mean USV properties between CBA mice of different sexes during social interaction. The improved attribution of USVs to individuals thus lays the basis for a deeper understanding of social vocalizations, in particular sequences of USVs.
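
    The underlying idea can be sketched with plain cross-correlation time-delay estimation on a synthetic chirp (the paper's envelope-weighted generalized cross-correlation adds a frequency-dependent weighting that is not reproduced here; all signal parameters below are illustrative):

```python
import numpy as np

def tdoa(a, b, fs):
    """Estimate, in seconds, how much signal `a` lags signal `b` from
    the peak of their full cross-correlation."""
    cc = np.correlate(a, b, mode="full")
    lag = int(cc.argmax()) - (len(b) - 1)
    return lag / fs

fs = 250_000                                          # 250 kHz sampling, toy value
t = np.arange(0, 0.01, 1 / fs)
chirp = np.sin(2 * np.pi * (60_000 + 2e6 * t) * t)    # USV-like upward sweep
delay = 20                                            # samples of extra travel time
mic1 = np.concatenate([chirp, np.zeros(delay)])
mic2 = np.concatenate([np.zeros(delay), chirp])       # farther microphone
est = tdoa(mic2, mic1, fs)
```

    The recovered delay (20 samples at 250 kHz, i.e. 80 µs) would then feed a geometric solver that converts pairwise delays into a source position.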

  17. Testing for changes in spatial relative risk.

    Science.gov (United States)

    Hazelton, Martin L

    2017-07-30

    The spatial relative risk function is a useful tool for describing geographical variation in disease incidence. We consider the problem of comparing relative risk functions between two time periods, with the idea of detecting alterations in the spatial pattern of disease risk irrespective of whether there has been a change in the overall incidence rate. Using case-control datasets for each period, we use kernel smoothing methods to derive a test statistic based on the difference between the log-relative risk functions, which we term the log-relative risk ratio. For testing a null hypothesis of an unchanging spatial pattern of risk, we show how p-values can be computed using both randomization methods and an asymptotic normal approximation. The methodology is applied to data on campylobacteriosis from 2006 to 2013 in a region of New Zealand. We find clear evidence of a change in the spatial pattern of risk between those years, which can be explained by differences between urban and rural communities in their response to a public health initiative. Copyright © 2017 John Wiley & Sons, Ltd.
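
    A toy one-dimensional analogue of such a randomization test (Gaussian-kernel risk surfaces on synthetic data; the statistic, bandwidth, and data are illustrative choices, not the paper's):

```python
import numpy as np

def log_rr(cases, controls, grid, h):
    """Kernel-smoothed log relative risk on a 1-D grid (Gaussian kernel,
    bandwidth h): a toy analogue of the 2-D spatial version."""
    def kde(pts):
        d = grid[:, None] - pts[None, :]
        f = np.exp(-0.5 * (d / h) ** 2).sum(axis=1)
        return f / (len(pts) * h * np.sqrt(2 * np.pi)) + 1e-12  # guard log(0)
    return np.log(kde(cases) / kde(controls))

def permutation_test(c1, k1, c2, k2, grid, h, n_perm=199, seed=0):
    """Randomization p-value for H0: no change in the spatial pattern of
    risk between periods. Statistic: mean squared difference of the two
    log-relative-risk curves; period labels are shuffled under H0."""
    rng = np.random.default_rng(seed)
    def stat(a1, b1, a2, b2):
        return np.mean((log_rr(a1, b1, grid, h) - log_rr(a2, b2, grid, h)) ** 2)
    obs = stat(c1, k1, c2, k2)
    pool_cases = np.concatenate([c1, c2])
    pool_ctrls = np.concatenate([k1, k2])
    count = 1                                   # "add one" convention
    for _ in range(n_perm):
        rng.shuffle(pool_cases)
        rng.shuffle(pool_ctrls)
        count += stat(pool_cases[:len(c1)], pool_ctrls[:len(k1)],
                      pool_cases[len(c1):], pool_ctrls[len(k1):]) >= obs
    return count / (n_perm + 1)

rng = np.random.default_rng(1)
grid = np.linspace(0, 1, 50)
controls1 = rng.uniform(0, 1, 100); controls2 = rng.uniform(0, 1, 100)
cases1 = rng.uniform(0, 0.5, 100)     # risk concentrated on the left...
cases2 = rng.uniform(0.5, 1, 100)     # ...then shifted to the right
p = permutation_test(cases1, controls1, cases2, controls2, grid, h=0.1)
```

    Because the synthetic risk surface genuinely moves between periods, the randomization p-value comes out small.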

  18. Commissioning and proof of functionality of the OPERA precision tracker, especially of the time measuring system

    International Nuclear Information System (INIS)

    Janutta, Benjamin

    2008-10-01

    The commissioning and the proof of functionality of the Precision Tracker of the OPERA experiment are the subject of this thesis, with the timing system of the precision tracker being of major concern. First, the time resolution of the timing electronics was characterized; additionally, general running parameters were studied. Afterwards the installation and commissioning were carried out. The precision tracker is supposed to determine the momentum of through-going muons with an accuracy of Δp/p<0.25 as well as the sign of their charge. The commissioning is now finished, and it was shown that the data acquisition system runs very reliably, with only 1.5% showing a slightly higher number of hits. The nominal spatial track resolution of σ<600 μm was also reached. (orig.)

  19. Estimating chlorophyll with thermal and broadband multispectral high resolution imagery from an unmanned aerial system using relevance vector machines for precision agriculture

    Science.gov (United States)

    Elarab, Manal; Ticlavilca, Andres M.; Torres-Rua, Alfonso F.; Maslova, Inga; McKee, Mac

    2015-12-01

    Precision agriculture requires high-resolution information to enable greater precision in the management of inputs to production. Actionable information about crop and field status must be acquired at high spatial resolution and at a temporal frequency appropriate for timely responses. In this study, high spatial resolution imagery was obtained through the use of a small, unmanned aerial system called AggieAir™. Simultaneously with the AggieAir flights, intensive ground sampling for plant chlorophyll was conducted at precisely determined locations. This study reports the application of a relevance vector machine coupled with cross validation and backward elimination to a dataset composed of reflectance from high-resolution multi-spectral imagery (VIS-NIR), thermal infrared imagery, and vegetative indices, in conjunction with in situ SPAD measurements from which chlorophyll concentrations were derived, to estimate chlorophyll concentration from remotely sensed data at 15-cm resolution. The results indicate that a relevance vector machine with a thin-plate spline kernel type and kernel width of 5.4, having LAI, NDVI, thermal and red bands as the selected set of inputs, can be used to spatially estimate chlorophyll concentration with a root-mean-square error of 5.31 μg cm⁻², efficiency of 0.76, and 9 relevance vectors.

  20. Accounting for regional background and population size in the detection of spatial clusters and outliers using geostatistical filtering and spatial neutral models: the case of lung cancer in Long Island, New York

    Directory of Open Access Journals (Sweden)

    Goovaerts Pierre

    2004-07-01

    Full Text Available Abstract Background Complete Spatial Randomness (CSR) is the null hypothesis employed by many statistical tests for spatial pattern, such as local cluster or boundary analysis. CSR is, however, not a relevant null hypothesis for highly complex and organized systems such as those encountered in the environmental and health sciences in which underlying spatial pattern is present. This paper presents a geostatistical approach to filter the noise caused by spatially varying population size and to generate spatially correlated neutral models that account for regional background obtained by geostatistical smoothing of observed mortality rates. These neutral models were used in conjunction with the local Moran statistics to identify spatial clusters and outliers in the geographical distribution of male and female lung cancer in Nassau, Queens, and Suffolk counties, New York, USA. Results We developed a typology of neutral models that progressively relaxes the assumptions of null hypotheses, allowing for the presence of spatial autocorrelation, non-uniform risk, and incorporation of spatially heterogeneous population sizes. Incorporation of spatial autocorrelation led to fewer significant ZIP codes than found in previous studies, confirming earlier claims that CSR can lead to over-identification of the number of significant spatial clusters or outliers. Accounting for population size through geostatistical filtering increased the size of clusters while removing most of the spatial outliers. Integration of regional background into the neutral models yielded substantially different spatial clusters and outliers, leading to the identification of ZIP codes where SMR values significantly depart from their regional background. Conclusion The approach presented in this paper enables researchers to assess geographic relationships using appropriate null hypotheses that account for the background variation extant in real-world systems. In particular, this new
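
    The local Moran statistic used above can be sketched in a few lines (a toy one-dimensional neighbourhood with hypothetical rates; in the paper, significance is assessed against geostatistical neutral models rather than analytically):

```python
import numpy as np

def local_moran(z, W):
    """Local Moran statistic I_i for a vector of rates z and a
    row-standardized spatial weights matrix W."""
    zc = z - z.mean()
    m2 = (zc ** 2).mean()
    return zc * (W @ zc) / m2

# toy example: 5 regions on a line, neighbours = adjacent regions
n = 5
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
W = W / W.sum(axis=1, keepdims=True)          # row-standardize

rates = np.array([10.0, 11.0, 10.5, 2.0, 10.0])  # region 3 is a low outlier
I = local_moran(rates, W)
```

    Strongly negative I_i flags a spatial outlier (a value unlike its neighbours, as for region 3 here); strongly positive I_i flags a cluster of similar values.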

  1. Gender differences in the use of external landmarks versus spatial representations updated by self-motion.

    Science.gov (United States)

    Lambrey, Simon; Berthoz, Alain

    2007-09-01

    Numerous data in the literature provide evidence for gender differences in spatial orientation. In particular, it has been suggested that spatial representations of large-scale environments are more accurate in terms of metric information in men than in women but are richer in landmark information in women than in men. One explanatory hypothesis is that men and women differ in terms of the navigational processes they use in daily life. The present study investigated this hypothesis by distinguishing two navigational processes: spatial updating by self-motion and landmark-based orientation. Subjects were asked to perform a pointing task in three experimental conditions, which differed in terms of the reliability of the external landmarks that could be used. Two groups of subjects were distinguished, a mobile group and an immobile group, in which spatial updating of environmental locations did not have the same degree of importance for the correct performance of the pointing task. We found that men readily relied on an internal egocentric representation of where landmarks were expected to be in order to perform the pointing task, a representation that could be updated during self-motion (spatial updating). In contrast, women seemed to take their bearings more readily on the basis of the stable landmarks of the external world. We suggest that this gender difference in spatial orientation is not due to differences in information processing abilities but rather to differences in higher-level strategies.

  2. Spatial manipulation with microfluidics

    Directory of Open Access Journals (Sweden)

    Benjamin eLin

    2015-04-01

    Full Text Available Biochemical gradients convey information through space, time, and concentration, and are ultimately capable of spatially resolving distinct cellular phenotypes, such as differentiation, proliferation, and migration. How these gradients develop, evolve, and function during development, homeostasis, and various disease states is a subject of intense interest across a variety of disciplines. Microfluidic technologies have become essential tools for investigating gradient sensing in vitro due to their ability to precisely manipulate fluids on demand in well controlled environments at cellular length scales. This minireview will highlight their utility for studying gradient sensing along with relevant applications to biology.

  3. FROM UNEMPLOYMENT TO WORK: AN ECONOMETRIC ANALYSIS WITH SPATIAL CONSTRAINTS

    Directory of Open Access Journals (Sweden)

    Oana Calavrezo

    2009-03-01

    Full Text Available The aim of our research is to analyze how urban organization affects unemployment-to-work transitions by considering several spatial indicators. This makes it possible to capture two separate effects: "spatial mismatch" and "neighbourhood effects". To study the unemployment-to-work transitions, we implement survival models. They are applied to a sample obtained by merging three French databases: the "Trajectoires des demandeurs d'emplois" survey, the 1999 French census and, finally, a database containing town inventory information. More precisely, in this paper we analyze the duration of the first observed employment episode by using spatial indicators and by controlling for three potential biases (endogeneity bias, selection bias and attrition bias).

  4. The Focus of Spatial Attention Determines the Number and Precision of Face Representations in Working Memory.

    Science.gov (United States)

    Towler, John; Kelly, Maria; Eimer, Martin

    2016-06-01

    The capacity of visual working memory for faces is extremely limited, but the reasons for these limitations remain unknown. We employed event-related brain potential measures to demonstrate that individual faces have to be focally attended in order to be maintained in working memory, and that attention is allocated to only a single face at a time. When 2 faces have to be memorized simultaneously in a face identity-matching task, the focus of spatial attention during encoding predicts which of these faces can be successfully maintained in working memory and matched to a subsequent test face. We also show that memory representations of attended faces are maintained in a position-dependent fashion. These findings demonstrate that the limited capacity of face memory is directly linked to capacity limits of spatial attention during the encoding and maintenance of individual face representations. We suggest that the capacity and distribution of selective spatial attention is a dynamic resource that constrains the capacity and fidelity of working memory for faces. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Development of Neutron Interferometer with Wide-Gapped "BSE"s for Precision Measurements

    International Nuclear Information System (INIS)

    Seki, Y.; Kitaguchi, M.; Hino, M.; Funahashi, H.; Taketani, K.; Otake, Y.; Shimizu, H. M.

    2007-01-01

    We are developing large-dimensional cold-neutron interferometers with multilayer mirrors in order to investigate small interactions. In particular, Jamin-type interferometers composed of wide-gapped "BSE"s, which divide the beam completely, can realize a precision measurement of the topological Aharonov-Casher effect. We have made a prototype with 200 μm gapped BSEs and confirmed the spatial separation of its two paths at the monochromatic cold-neutron beamline MINE2 at the JRR-3M reactor at JAEA.

  6. Is a matrix exponential specification suitable for the modeling of spatial correlation structures?

    Science.gov (United States)

    Strauß, Magdalena E; Mezzetti, Maura; Leorato, Samantha

    2017-05-01

    This paper investigates the adequacy of the matrix exponential spatial specifications (MESS) as an alternative to the widely used spatial autoregressive models (SAR). To provide as complete a picture as possible, we extend the analysis to all the main spatial models governed by matrix exponentials comparing them with their spatial autoregressive counterparts. We propose a new implementation of Bayesian parameter estimation for the MESS model with vague prior distributions, which is shown to be precise and computationally efficient. Our implementations also account for spatially lagged regressors. We further allow for location-specific heterogeneity, which we model by including spatial splines. We conclude by comparing the performances of the different model specifications in applications to a real data set and by running simulations. Both the applications and the simulations suggest that the spatial splines are a flexible and efficient way to account for spatial heterogeneities governed by unknown mechanisms.
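
    A minimal simulation sketch of the MESS idea (ring-lattice weights and all parameters chosen purely for illustration; the paper's Bayesian implementation with vague priors is not reproduced):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 60
# toy row-standardized weights: each unit's neighbours are the two
# adjacent units on a ring
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta, alpha = np.array([1.0, 2.0]), -0.7
# MESS model: exp(alpha * W) y = X beta + eps, so y = exp(-alpha * W)(X beta + eps)
y = expm(-alpha * W) @ (X @ beta + 0.1 * rng.normal(size=n))

# Because W has a zero diagonal, det(exp(alpha*W)) = exp(alpha * tr(W)) = 1,
# so maximum likelihood reduces to minimizing the residual sum of squares
# over alpha, with beta given by OLS at each candidate alpha.
def sse(a):
    z = expm(a * W) @ y
    b = np.linalg.lstsq(X, z, rcond=None)[0]
    r = z - X @ b
    return float(r @ r)

grid = np.linspace(-1.5, 0.5, 81)
alpha_hat = float(grid[np.argmin([sse(a) for a in grid])])
beta_hat = np.linalg.lstsq(X, expm(alpha_hat * W) @ y, rcond=None)[0]
```

    Unlike the SAR model, no log-determinant needs to be evaluated during estimation, which is one of the computational attractions of the MESS specification.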

  7. The Stem Cell Hypothesis of Aging

    Directory of Open Access Journals (Sweden)

    Anna Meiliana

    2010-04-01

Full Text Available BACKGROUND: There is probably no single way to age. Indeed, so far there is no single accepted explanation or mechanism of aging (although more than 300 theories have been proposed). There is an overall decline in tissue regenerative potential with age, and the question arises as to whether this is due to the intrinsic aging of stem cells or rather to the impairment of stem cell function in the aged tissue environment. CONTENT: Recent data suggest that we age, in part, because our self-renewing stem cells grow old as a result of heritable intrinsic events, such as DNA damage, as well as extrinsic forces, such as changes in their supporting niches. Mechanisms that suppress the development of cancer, such as senescence and apoptosis, which rely on telomere shortening and the activities of p53 and p16INK4a, may also induce an unwanted consequence: a decline in the replicative function of certain stem cell types with advancing age. This decreased regenerative capacity points to the stem cell hypothesis of aging. SUMMARY: Recent evidence suggests that we grow old partly because our stem cells grow old as a result of mechanisms that suppress the development of cancer over a lifetime. We believe that a further, more precise mechanistic understanding of this process will be required before this knowledge can be translated into human anti-aging therapies. KEYWORDS: stem cells, senescence, telomere, DNA damage, epigenetic, aging.

  8. Rejecting the equilibrium-point hypothesis.

    Science.gov (United States)

    Gottlieb, G L

    1998-01-01

    The lambda version of the equilibrium-point (EP) hypothesis as developed by Feldman and colleagues has been widely used and cited with insufficient critical understanding. This article offers a small antidote to that lack. First, the hypothesis implicitly, unrealistically assumes identical transformations of lambda into muscle tension for antagonist muscles. Without that assumption, its definitions of command variables R, C, and lambda are incompatible and an EP is not defined exclusively by R nor is it unaffected by C. Second, the model assumes unrealistic and unphysiological parameters for the damping properties of the muscles and reflexes. Finally, the theory lacks rules for two of its three command variables. A theory of movement should offer insight into why we make movements the way we do and why we activate muscles in particular patterns. The EP hypothesis offers no unique ideas that are helpful in addressing either of these questions.

  9. Stereological analysis of spatial structures

    DEFF Research Database (Denmark)

    Hansen, Linda Vadgård

The thesis deals with stereological analysis of spatial structures. One area of focus has been to improve the precision of well-known stereological estimators by including information that is available via automatic image analysis. Furthermore, the thesis presents a stochastic model for star-shaped three-dimensional objects using the radial function. The model is highly flexible in the sense that it can be used to describe objects with arbitrarily irregular surfaces. Results on the distribution of well-known local stereological volume estimators are provided.

  10. Practical precision measurement

    International Nuclear Information System (INIS)

    Kwak, Ho Chan; Lee, Hui Jun

    1999-01-01

This book introduces the basics of precision measurement: measurement of length, precision measurement of minor diameters, measurement of angles, surface roughness, three-dimensional measurement, measurement of locations and shapes, screw measurement, gear testing, cutting-tool testing, rolling-bearing testing, and digital measurement. It covers height gauges, surface roughness testing, flatness and straightness measurement, external and internal thread testing, gear tooth measurement, milling cutters, taps, rotational precision measurement, and optical transducers.

  11. Linguistic and Perceptual Mapping in Spatial Representations: An Attentional Account.

    Science.gov (United States)

    Valdés-Conroy, Berenice; Hinojosa, José A; Román, Francisco J; Romero-Ferreiro, Verónica

    2018-03-01

Building on evidence for embodied representations, we investigated whether Spanish spatial terms map onto the NEAR/FAR perceptual division of space. Using a long horizontal display, we measured congruency effects during the processing of spatial terms presented in NEAR or FAR space. Across three experiments, we manipulated the task demands in order to investigate the role of endogenous attention in linguistic and perceptual space mapping. We predicted congruency effects only when spatial properties were relevant for the task (reaching estimation task, Experiment 1) but not when attention was allocated to other features (lexical decision, Experiment 2; and color, Experiment 3). Results showed faster responses for words presented in NEAR space in all experiments. Consistent with our hypothesis, congruency effects were observed only when a reaching estimate was requested. Our results add important evidence for the role of top-down processing in congruency effects arising from embodied representations of spatial terms. Copyright © 2017 Cognitive Science Society, Inc.

  12. Hypothesis Testing in the Real World

    Science.gov (United States)

    Miller, Jeff

    2017-01-01

    Critics of null hypothesis significance testing suggest that (a) its basic logic is invalid and (b) it addresses a question that is of no interest. In contrast to (a), I argue that the underlying logic of hypothesis testing is actually extremely straightforward and compelling. To substantiate that, I present examples showing that hypothesis…

  13. Does the stress-gradient hypothesis hold water? Disentangling spatial and temporal variation in plant effects on soil moisture in dryland systems

    Science.gov (United States)

    Butterfield, Bradley J.; Bradford, John B.; Armas, Cristina; Prieto, Ivan; Pugnaire, Francisco I.

    2016-01-01

    The nature of the relationship between water limitation and facilitation has been one of the most contentious debates surrounding the stress-gradient hypothesis (SGH), which states that plant-plant interactions shift from competition to facilitation with increasing environmental stress.

  14. Precision of tibial cartilage morphometry with a coronal water-excitation MR sequence

    Energy Technology Data Exchange (ETDEWEB)

    Hyhlik-Duerr, A. [Musculoskeletal Research Group, Institute of Anatomy, Ludwig-Maximilians-Universitaet, Muenchen (Germany); Klinik fuer Orthopaedie und Sportorthopaedie der Technischen Universitaet, Muenchen (Germany); Faber, S.; Reiser, M. [Klinik fuer Orthopaedie und Sportorthopaedie der Technischen Universitaet, Muenchen (Germany); Burgkart, R. [Institut fuer Medizinische Informatik und Systemforschung (MEDIS), GSF-Forschungszentrum fuer Umwelt und Gesundheit, Neuherberg, Oberschleissheim (Germany); Stammberger, T.; Englmeier, K.H. [Institut fuer Medizinische Informationsverarbeitung, Biometrie und Epidemiologie, Klinikum Grosshadern, Marchioninistrasse 15, D-81377 Munich (Germany); Maag, K.P. [Institut fuer Radiologische Diagnostik, Klinikum der Ludwig-Maximilians-Universitaet, Muenchen (Germany); Eckstein, F. [Musculoskeletal Research Group, Institute of Anatomy, Ludwig-Maximilians-Universitaet, Muenchen (Germany)

    2000-02-01

The aim of this study was to analyze the precision of tibial cartilage morphometry, by using a fast, coronal water-excitation sequence with high spatial resolution, to compare the reproducibility of 3D thickness vs volume estimates, and to test the technique in patients with severe osteoarthritis. The tibiae of 8 healthy volunteers and 3 patients selected for total knee arthroplasty were imaged repeatedly with a water-excitation sequence (image time 6 h 19 min, resolution 1.2 x 0.31 x 0.31 mm³), with the knee being repositioned between each replicate acquisition. After 3D reconstruction, the cartilage volume, the mean, and the maximal tibial cartilage thickness were determined by 3D Euclidean distance transformation. In the volunteers, the precision of the volume measurements was 2.3 % (CV%) in the medial and 2.6 % in the lateral tibia. The reproducibility of the mean cartilage thickness was similar (2.6 and 2.5 %, respectively), and that of the maximal thickness lower (6.5 and 4.4 %). The patients showed a considerable reduction in volume and thickness, the precision being comparable with that in the volunteers. We find that, using a new imaging protocol and computational algorithm, it is possible to determine tibial cartilage morphometry with high precision in healthy individuals as well as in patients with osteoarthritis. (orig.)
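
The precision figures in this record are coefficients of variation over replicate scans; a conventional way to summarize them across subjects is the root-mean-square CV%. A minimal sketch with hypothetical replicate volumes (not the study's data):

```python
import math

def cv_percent(replicates):
    """Coefficient of variation (%) of repeated measurements on one subject."""
    n = len(replicates)
    mean = sum(replicates) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))
    return 100.0 * sd / mean

def rms_cv(per_subject_replicates):
    """Root-mean-square CV% across subjects, the usual precision summary
    for test-retest imaging studies."""
    cvs = [cv_percent(r) for r in per_subject_replicates]
    return math.sqrt(sum(cv ** 2 for cv in cvs) / len(cvs))

# Hypothetical cartilage-volume replicates (ml) for two subjects,
# each scanned four times with repositioning between acquisitions.
volumes = [[2.10, 2.15, 2.08, 2.12],
           [1.80, 1.76, 1.83, 1.79]]
precision = rms_cv(volumes)   # around 1.5 % for these invented numbers
```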

  15. Optimal Audiovisual Integration in the Ventriloquism Effect But Pervasive Deficits in Unisensory Spatial Localization in Amblyopia.

    Science.gov (United States)

    Richards, Michael D; Goltz, Herbert C; Wong, Agnes M F

    2018-01-01

    Classically understood as a deficit in spatial vision, amblyopia is increasingly recognized to also impair audiovisual multisensory processing. Studies to date, however, have not determined whether the audiovisual abnormalities reflect a failure of multisensory integration, or an optimal strategy in the face of unisensory impairment. We use the ventriloquism effect and the maximum-likelihood estimation (MLE) model of optimal integration to investigate integration of audiovisual spatial information in amblyopia. Participants with unilateral amblyopia (n = 14; mean age 28.8 years; 7 anisometropic, 3 strabismic, 4 mixed mechanism) and visually normal controls (n = 16, mean age 29.2 years) localized brief unimodal auditory, unimodal visual, and bimodal (audiovisual) stimuli during binocular viewing using a location discrimination task. A subset of bimodal trials involved the ventriloquism effect, an illusion in which auditory and visual stimuli originating from different locations are perceived as originating from a single location. Localization precision and bias were determined by psychometric curve fitting, and the observed parameters were compared with predictions from the MLE model. Spatial localization precision was significantly reduced in the amblyopia group compared with the control group for unimodal visual, unimodal auditory, and bimodal stimuli. Analyses of localization precision and bias for bimodal stimuli showed no significant deviations from the MLE model in either the amblyopia group or the control group. Despite pervasive deficits in localization precision for visual, auditory, and audiovisual stimuli, audiovisual integration remains intact and optimal in unilateral amblyopia.
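
The MLE model referenced in this record predicts the bimodal estimate as a reliability-weighted average of the unimodal cues, with bimodal variance never worse than the best single cue. A minimal sketch with illustrative localization parameters (not the study's fitted values):

```python
def mle_integration(mu_v, sigma_v, mu_a, sigma_a):
    """Maximum-likelihood audiovisual integration.

    Each cue is weighted by its reliability (inverse variance); the
    predicted bimodal variance is the product over the sum of the
    unimodal variances, so it is <= the smaller unimodal variance.
    """
    w_v = (1 / sigma_v ** 2) / (1 / sigma_v ** 2 + 1 / sigma_a ** 2)
    w_a = 1 - w_v
    mu_av = w_v * mu_v + w_a * mu_a
    var_av = (sigma_v ** 2 * sigma_a ** 2) / (sigma_v ** 2 + sigma_a ** 2)
    return mu_av, var_av ** 0.5

# Illustrative ventriloquism trial: discrepant sources (visual at +2 deg,
# auditory at -4 deg), with vision three times more precise than audition.
mu_av, sigma_av = mle_integration(mu_v=2.0, sigma_v=1.0, mu_a=-4.0, sigma_a=3.0)
```

With these numbers the visual weight is 0.9, so the fused percept (about +1.4 deg) is captured toward the visual source, which is the ventriloquism effect in MLE terms.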

  16. The atomic hypothesis: physical consequences

    International Nuclear Information System (INIS)

    Rivas, Martin

    2008-01-01

The hypothesis that matter is made of some ultimate and indivisible objects, together with the restricted relativity principle, establishes a constraint on the kind of variables we are allowed to use for the variational description of elementary particles. We consider that the atomic hypothesis not only states the indivisibility of elementary particles, but also that these ultimate objects, if not annihilated, cannot be modified by any interaction, so that all allowed states of an elementary particle are only kinematical modifications of any one of them. Therefore, an elementary particle cannot have excited states. In this way, the kinematical group of spacetime symmetries not only defines the symmetries of the system, but also determines the variables in terms of which the mathematical description of elementary particles can be expressed, in either the classical or the quantum mechanical description. When considering the interaction of two Dirac particles, the atomic hypothesis restricts the interaction Lagrangian to a kind of minimal coupling interaction.

  17. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting

    Directory of Open Access Journals (Sweden)

    Ozonoff Al

    2010-07-01

Full Text Available Abstract Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM

  18. A power comparison of generalized additive models and the spatial scan statistic in a case-control setting.

    Science.gov (United States)

    Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F

    2010-07-19

A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log-odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression
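
As a simplified stand-in for the permutation tests these two records describe (not the papers' GAM/LOESS machinery), one can permute case labels to break any spatial association while preserving the number of cases, and recompute a local-cluster statistic each time. The tiny data set below is invented, with both cases placed near one candidate center:

```python
import random

def local_rate(points, labels, center, radius):
    """Case proportion among points within `radius` of `center`."""
    nearby = [lab for (x, y), lab in zip(points, labels)
              if (x - center[0]) ** 2 + (y - center[1]) ** 2 <= radius ** 2]
    return sum(nearby) / len(nearby) if nearby else 0.0

def max_local_rate(points, labels, grid, radius):
    """Scan-statistic stand-in: max local case rate over candidate centers."""
    return max(local_rate(points, labels, g, radius) for g in grid)

def permutation_p_value(points, labels, grid, radius, n_perm=999, seed=7):
    """Monte Carlo permutation p-value under the null of no spatial effect."""
    rng = random.Random(seed)
    observed = max_local_rate(points, labels, grid, radius)
    perm, hits = list(labels), 0
    for _ in range(n_perm):
        rng.shuffle(perm)   # random relabeling, same number of cases
        if max_local_rate(points, perm, grid, radius) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Tiny illustrative data: both cases cluster near the origin.
points = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (1.0, 0.9)]
labels = [1, 1, 0, 0]               # 1 = case, 0 = control
grid = [(0.0, 0.0), (1.0, 1.0)]     # candidate cluster centers
p = permutation_p_value(points, labels, grid, radius=0.3)
```

With only four points the p-value cannot be small (about one relabeling in three reproduces the observed clustering), but the same skeleton scales to the simulated point patterns the records evaluate.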

  19. Disentangling the effects of spatial inconsistency of targets and distractors when searching in realistic scenes.

    Science.gov (United States)

    Spotorno, Sara; Malcolm, George L; Tatler, Benjamin W

    2015-02-10

    Previous research has suggested that correctly placed objects facilitate eye guidance, but also that objects violating spatial associations within scenes may be prioritized for selection and subsequent inspection. We analyzed the respective eye guidance of spatial expectations and target template (precise picture or verbal label) in visual search, while taking into account any impact of object spatial inconsistency on extrafoveal or foveal processing. Moreover, we isolated search disruption due to misleading spatial expectations about the target from the influence of spatial inconsistency within the scene upon search behavior. Reliable spatial expectations and precise target template improved oculomotor efficiency across all search phases. Spatial inconsistency resulted in preferential saccadic selection when guidance by template was insufficient to ensure effective search from the outset and the misplaced object was bigger than the objects consistently placed in the same scene region. This prioritization emerged principally during early inspection of the region, but the inconsistent object also tended to be preferentially fixated overall across region viewing. These results suggest that objects are first selected covertly on the basis of their relative size and that subsequent overt selection is made considering object-context associations processed in extrafoveal vision. Once the object was fixated, inconsistency resulted in longer first fixation duration and longer total dwell time. As a whole, our findings indicate that observed impairment of oculomotor behavior when searching for an implausibly placed target is the combined product of disruption due to unreliable spatial expectations and prioritization of inconsistent objects before and during object fixation. © 2015 ARVO.

  20. Action compatibility in spatial knowledge developed through virtual navigation.

    Science.gov (United States)

    Wang, Qi; Taylor, Holly A; Brunyé, Tad T

    2018-01-09

    Action-compatibility effects (ACEs) arise due to incongruity between perceptuo-motor traces stored in memory and the perceptuo-motor demands of a retrieval task. Recent research has suggested that ACEs arising during spatial memory retrieval are additionally modulated by individual differences in how experienced participants are with a college campus environment. However, the extent and nature of experience with a real-world environment is difficult to assess and control, and characteristics of the retrieval task itself might modulate ACEs during spatial memory retrieval. The present study provides a more controlled and in-depth examination of how individual differences and task-based factors interact to shape ACEs when participants retrieve spatial memories. In two experiments, participants with varied video game experience learned a virtual environment and then used the computer mouse to verify spatial relationships from different perspectives. Mouse trajectories demonstrated ACEs, differing by retrieval perspective and video game experience. Videogame experts demonstrated the ACE based on learned spatial relationships during egocentric retrieval only, whereas videogame novices showed the ACE based on semantic processing of directional terms only. Specifically, gaming experts invoke perspective-specific perceptuo-motor associations to retrieve spatial knowledge, whereas non-experts are influenced by semantically based associations specific to the retrieval task. Results are discussed in the context of action-compatibility effects, the intentional weighting hypothesis, and the flexible encoding and retrieval of spatial information.

  1. Visual working memory and number sense: Testing the double deficit hypothesis in mathematics.

    Science.gov (United States)

    Toll, Sylke W M; Kroesbergen, Evelyn H; Van Luit, Johannes E H

    2016-09-01

Evidence exists that there are two main underlying cognitive factors in mathematical difficulties: working memory and number sense. It is suggested that real math difficulties appear when both working memory and number sense are weak, here referred to as the double deficit (DD) hypothesis. The aim of this study was to test the DD hypothesis within a longitudinal time span of 2 years. A total of 670 children participated. The mean age was 4.96 years at the start of the study and 7.02 years at the end of the study. At the end of the first year of kindergarten, both visual-spatial working memory and number sense were measured by two different tasks. At the end of first grade, mathematical performance was measured with two tasks, one for math facts and one for math problems. Multiple regressions revealed that both visual working memory and symbolic number sense are predictors of mathematical performance in first grade. Symbolic number sense appears to be the strongest predictor for both math areas (math facts and math problems). Non-symbolic number sense only predicts performance in math problems. Multivariate analyses of variance showed that a combination of visual working memory and number sense deficits (NSDs) leads to the lowest performance on mathematics. Our DD hypothesis was confirmed. Both visual working memory and symbolic number sense in kindergarten are related to mathematical performance 2 years later, and a combination of visual working memory deficits and NSDs leads to low mathematical performance. © 2016 The British Psychological Society.

  2. Effective Connectivity Reveals Right-Hemisphere Dominance in Audiospatial Perception: Implications for Models of Spatial Neglect

    Science.gov (United States)

    Friston, Karl J.; Mattingley, Jason B.; Roepstorff, Andreas; Garrido, Marta I.

    2014-01-01

    Detecting the location of salient sounds in the environment rests on the brain's ability to use differences in sounds arriving at both ears. Functional neuroimaging studies in humans indicate that the left and right auditory hemispaces are coded asymmetrically, with a rightward attentional bias that reflects spatial attention in vision. Neuropsychological observations in patients with spatial neglect have led to the formulation of two competing models: the orientation bias and right-hemisphere dominance models. The orientation bias model posits a symmetrical mapping between one side of the sensorium and the contralateral hemisphere, with mutual inhibition of the ipsilateral hemisphere. The right-hemisphere dominance model introduces a functional asymmetry in the brain's coding of space: the left hemisphere represents the right side, whereas the right hemisphere represents both sides of the sensorium. We used Dynamic Causal Modeling of effective connectivity and Bayesian model comparison to adjudicate between these alternative network architectures, based on human electroencephalographic data acquired during an auditory location oddball paradigm. Our results support a hemispheric asymmetry in a frontoparietal network that conforms to the right-hemisphere dominance model. We show that, within this frontoparietal network, forward connectivity increases selectively in the hemisphere contralateral to the side of sensory stimulation. We interpret this finding in light of hierarchical predictive coding as a selective increase in attentional gain, which is mediated by feedforward connections that carry precision-weighted prediction errors during perceptual inference. This finding supports the disconnection hypothesis of unilateral neglect and has implications for theories of its etiology. PMID:24695717

  3. A Future Vertex Locator with Precise Timing for the LHCb Experiment

    CERN Multimedia

    Mitreska, Biljana

    2017-01-01

The LHCb experiment is designed to perform high-precision measurements of matter-antimatter asymmetries and searches for rare and forbidden decays, with the aim of discovering new and unexpected particles and forces. In 2030 the LHC beam intensity will increase by a factor of 50 compared to current operations. This means larger samples of the particles we need to study, but it also presents experimental challenges. In particular, with current technology it becomes impossible to differentiate the many (>50) separate proton-proton collisions which occur for each bunch crossing. A Monte Carlo simulation was developed to model the operation of a silicon pixel vertex detector surrounding the collision region at LHCb, under the conditions expected after 2030, after the second upgrade of the Vertex Locator (VELO). The main goal was to study the effect of adding '4D' detectors which record high-precision timing information, in addition to the usual three spatial coordinates, as charged particles pass through them. W...

  4. Participatory boat tracking reveals spatial fishing patterns in an Indonesian artisanal fishery

    DEFF Research Database (Denmark)

    Forero, Gabriela Navarrete; Miñarro, Sara; Mildenberger, Tobias

    2017-01-01

for the coral reef ecosystem, contributing to its overall degradation. Estimates of the ecological impacts of different levels of fishing pressure, as well as fisheries stock assessments and marine resource management, require precise information on the spatial distribution of fishing effort, which involves...

  5. Precise positioning method for multi-process connecting based on binocular vision

    Science.gov (United States)

    Liu, Wei; Ding, Lichao; Zhao, Kai; Li, Xiao; Wang, Ling; Jia, Zhenyuan

    2016-01-01

With the rapid development of aviation and aerospace, the demand for metal-coated parts such as antenna reflectors, eddy-current sensors and signal transmitters is more and more urgent. Such parts, with varied feature dimensions, complex three-dimensional structures, and high geometric accuracy, are generally fabricated by a combination of different manufacturing technologies. However, it is difficult to ensure the machining precision because of the connection error between different processing methods. Therefore, a precise positioning method based on binocular micro stereo vision is proposed in this paper. Firstly, a novel and efficient camera calibration method for the stereoscopic microscope is presented to solve the problems of narrow field of view, small depth of focus and numerous nonlinear distortions. Secondly, extraction algorithms for regular and free-form curves are given, and the spatial position relationship between the micro vision system and the machining system is determined accurately. Thirdly, a precise positioning system based on micro stereo vision is set up and then embedded in a CNC machining experiment platform. Finally, a verification experiment of the positioning accuracy is conducted; the experimental results indicate that the average errors of the proposed method in the X and Y directions are 2.250 μm and 1.777 μm, respectively.
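
The positioning accuracy this record reports rests on stereo triangulation. For the idealized case of two rectified pinhole cameras (a deliberate simplification of the paper's calibrated stereo microscope; all numbers below are invented), depth follows from disparity as Z = f·B/d:

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth of a point seen by two rectified pinhole cameras.

    f_px:         focal length in pixels
    baseline_m:   distance between the camera centers (meters)
    disparity_px: x_left - x_right in pixels (must be > 0)
    """
    if disparity_px <= 0:
        raise ValueError("point at infinity or behind the cameras")
    return f_px * baseline_m / disparity_px

def triangulate(f_px, baseline_m, x_left_px, x_right_px, y_px):
    """Back-project a matched pixel pair to (X, Y, Z) in the left camera frame."""
    z = depth_from_disparity(f_px, baseline_m, x_left_px - x_right_px)
    return (x_left_px * z / f_px, y_px * z / f_px, z)

# Illustrative rig: f = 1000 px, 10 cm baseline, 50 px disparity.
point = triangulate(1000.0, 0.1, x_left_px=100.0, x_right_px=50.0, y_px=25.0)
```

The inverse relation between depth and disparity is also why micro-vision systems like the one described need long focal lengths and careful calibration: at small working distances a sub-pixel matching error translates directly into a positioning error.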

  6. Water developments and canids in two North American deserts: a test of the indirect effect of water hypothesis.

    Directory of Open Access Journals (Sweden)

    Lucas K Hall

Full Text Available Anthropogenic modifications to landscapes intended to benefit wildlife may negatively influence wildlife communities. Anthropogenic provisioning of free water (water developments) to enhance the abundance and distribution of wildlife is a common management practice in arid regions where water is limiting. Despite the long-term and widespread use of water developments, little is known about how they influence native species. Water developments may negatively influence arid-adapted species (e.g., kit fox, Vulpes macrotis) by enabling water-dependent competitors (e.g., coyote, Canis latrans) to expand their distribution in arid landscapes (i.e., the indirect effect of water hypothesis). We tested the two predictions of the indirect effect of water hypothesis (i.e., coyotes will visit areas with free water more frequently, and kit foxes will spatially and temporally avoid coyotes) and evaluated relative use of free water by canids in the Great Basin and Mojave Deserts from 2010 to 2012. We established scent stations in areas with (wet) and without (dry) free water and monitored visitation by canids to these sites, and visitation to water sources, using infrared-triggered cameras. There was no difference in the proportions of visits to scent stations in wet or dry areas by coyotes or kit foxes at either study area. We did not detect spatial (no negative correlation between visits to scent stations) or temporal (no difference between times when stations were visited) segregation between coyotes and kit foxes. Visitation to water sources was not different for coyotes between study areas, but kit foxes visited water sources more in the Mojave than in the Great Basin. Our results did not support the indirect effect of water hypothesis in the Great Basin or Mojave Deserts for these two canids.

  7. Toward accurate and precise estimates of lion density.

    Science.gov (United States)

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2017-08-01

Reliable estimates of animal density are fundamental to understanding ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation because wildlife authorities rely on estimates to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores, such as lions (Panthera leo). Although abundance indices for lions may produce poor inferences, they continue to be used to estimate density and inform management and policy. We used sighting data from a 3-month survey and adapted a Bayesian spatially explicit capture-recapture (SECR) model to estimate spatial lion density in the Maasai Mara National Reserve and surrounding conservancies in Kenya. Our unstructured spatial capture-recapture sampling design incorporated search effort to explicitly estimate detection probability and density on a fine spatial scale, making our approach robust in the context of varying detection probabilities. Overall posterior mean lion density was estimated to be 17.08 (posterior SD 1.310) lions >1 year old/100 km², and the sex ratio was estimated at 2.2 females to 1 male. Our modeling framework and narrow posterior SD demonstrate that SECR methods can produce statistically rigorous and precise estimates of population parameters, and we argue that they should be favored over less reliable abundance indices. Furthermore, our approach is flexible enough to incorporate different data types, which enables robust population estimates over relatively short survey periods in a variety of systems. Trend analyses are essential to guide conservation decisions but are frequently based on surveys of differing reliability. We therefore call for a unified framework to assess lion numbers in key populations to improve management and
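
SECR models like the one this record adapts describe detection probability as a declining function of distance between a trap (or search location) and an animal's latent activity center, commonly half-normal. A minimal sketch with invented trap locations and parameters (not the study's fitted values):

```python
import math

def halfnormal_detection(d, g0, sigma):
    """Half-normal SECR detection function: per-occasion capture probability
    at distance d from an activity center (g0 = detection at d = 0,
    sigma = spatial scale of movement)."""
    return g0 * math.exp(-d ** 2 / (2 * sigma ** 2))

def expected_captures(traps, center, g0, sigma, occasions):
    """Expected number of detections of one animal over all traps/occasions."""
    total = 0.0
    for tx, ty in traps:
        d = math.hypot(tx - center[0], ty - center[1])
        total += occasions * halfnormal_detection(d, g0, sigma)
    return total

# Illustrative 2x2 detector grid (km) and an activity center at the origin.
traps = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
n_expected = expected_captures(traps, (0.0, 0.0), g0=0.3, sigma=1.0, occasions=10)
```

Integrating such detection probabilities over a fine grid of possible activity centers is what lets SECR turn spatial capture histories into a density estimate with honest uncertainty, rather than an abundance index.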

  8. Text Mining for Precision Medicine: Bringing structure to EHRs and biomedical literature to understand genes and health

    Science.gov (United States)

    Simmons, Michael; Singhal, Ayush; Lu, Zhiyong

    2018-01-01

    The key question of precision medicine is whether it is possible to find clinically actionable granularity in diagnosing disease and classifying patient risk. The advent of next generation sequencing and the widespread adoption of electronic health records (EHRs) have provided clinicians and researchers a wealth of data and made possible the precise characterization of individual patient genotypes and phenotypes. Unstructured text — found in biomedical publications and clinical notes — is an important component of genotype and phenotype knowledge. Publications in the biomedical literature provide essential information for interpreting genetic data. Likewise, clinical notes contain the richest source of phenotype information in EHRs. Text mining can render these texts computationally accessible and support information extraction and hypothesis generation. This chapter reviews the mechanics of text mining in precision medicine and discusses several specific use cases, including database curation for personalized cancer medicine, patient outcome prediction from EHR-derived cohorts, and pharmacogenomic research. Taken as a whole, these use cases demonstrate how text mining enables effective utilization of existing knowledge sources and thus promotes increased value for patients and healthcare systems. Text mining is an indispensable tool for translating genotype-phenotype data into effective clinical care that will undoubtedly play an important role in the eventual realization of precision medicine. PMID:27807747

  9. Text Mining for Precision Medicine: Bringing Structure to EHRs and Biomedical Literature to Understand Genes and Health.

    Science.gov (United States)

    Simmons, Michael; Singhal, Ayush; Lu, Zhiyong

    2016-01-01

    The key question of precision medicine is whether it is possible to find clinically actionable granularity in diagnosing disease and classifying patient risk. The advent of next-generation sequencing and the widespread adoption of electronic health records (EHRs) have provided clinicians and researchers a wealth of data and made possible the precise characterization of individual patient genotypes and phenotypes. Unstructured text-found in biomedical publications and clinical notes-is an important component of genotype and phenotype knowledge. Publications in the biomedical literature provide essential information for interpreting genetic data. Likewise, clinical notes contain the richest source of phenotype information in EHRs. Text mining can render these texts computationally accessible and support information extraction and hypothesis generation. This chapter reviews the mechanics of text mining in precision medicine and discusses several specific use cases, including database curation for personalized cancer medicine, patient outcome prediction from EHR-derived cohorts, and pharmacogenomic research. Taken as a whole, these use cases demonstrate how text mining enables effective utilization of existing knowledge sources and thus promotes increased value for patients and healthcare systems. Text mining is an indispensable tool for translating genotype-phenotype data into effective clinical care that will undoubtedly play an important role in the eventual realization of precision medicine.

  10. Interference between perennial grassland and Lavandula stoechas subsp. pedunculata seedlings: a case of spatial segregation caused by competition

    Science.gov (United States)

    Sánchez, Ana M.; Peco, Begoña

    2004-07-01

    This paper analyses the relationship between Lavandula stoechas subsp. pedunculata, a common Mediterranean scrub species in central Iberia, and perennial grasslands. While Lavandula gives rise to almost monospecific formations in intermediate and upper hill zones, perennial grasses occupy the low areas. The proposed explanatory hypothesis for this spatial distribution is that the scrub is unable to establish itself in grasslands with heavy spatial occupation. We designed two experiments to test this hypothesis, one which analysed the effect of perennial grass cover on Lavandula establishment, and another which focused on its influence on previously implanted seedling survival and growth, distinguishing the effect of shoot and root interference. The results show negative interference during establishment and later in the use of light and nutrients. This results in a very low overall survival probability, with only 1.4% of seedlings surviving the first growth period. This low success rate explains the existence of a clear spatial segregation between scrub patches and perennial-dominated grasslands.

  11. Search for new physics in a precise 20F beta spectrum shape measurement

    Science.gov (United States)

    George, Elizabeth; Voytas, Paul; Chuna, Thomas; Naviliat-Cuncic, Oscar; Gade, Alexandra; Hughes, Max; Huyan, Xueying; Liddick, Sean; Minamisono, Kei; Paulauskas, Stanley; Weisshaar, Dirk; Ban, Gilles; Flechard, Xavier; Lienard, Etienne

    2015-10-01

    We are carrying out a measurement of the shape of the energy spectrum of β particles from 20F decay. We aim to achieve a relative precision below 3%, representing an order of magnitude improvement compared to previous experiments. This level of precision will enable a test of the so-called strong form of the conserved vector current (CVC) hypothesis, and should also enable us to place competitive limits on the contributions of exotic tensor couplings in beta decay. In order to control systematic effects, we are using a technique that takes advantage of high energy radioactive beams at the NSCL to implant the decaying nuclei in a scintillation detector deep enough that the emitted beta particles cannot escape. The β-particle energy is measured with the implantation detector after switching off the beam implantation. Ancillary detectors are used to tag the 1.633-MeV γ-rays following the β decay for coincidence measurements in order to reduce backgrounds. We will give an overview and report on the status of the experiment.

  12. Multiple sclerosis: a geographical hypothesis.

    Science.gov (United States)

    Carlyle, I P

    1997-12-01

    Multiple sclerosis remains a rare neurological disease of unknown aetiology, with a unique distribution, both geographically and historically. Rare in equatorial regions, it becomes increasingly common in higher latitudes; historically, it was first clinically recognized in the early nineteenth century. A hypothesis, based on geographical reasoning, is here proposed: that the disease is the result of a specific vitamin deficiency. Different individuals suffer the deficiency in separate and often unique ways. Evidence to support the hypothesis exists in cultural considerations, in the global distribution of the disease, and in its historical prevalence.

  13. A High Precision 3D Magnetic Field Scanner for Small to Medium Size Magnets

    CERN Document Server

    Bergsma, F; Garnier, F; Giudici, P A

    2016-01-01

    A bench to measure the magnetic field of small- to medium-sized magnets with high precision was built. It uses a small-sized head with three orthogonal Hall probes, supported on a long pole that moves continuously during measurement. The head is calibrated in three dimensions by rotation over the full solid angle in a special device. From 0 to 2.5 T, the precision is ±0.2 mT in all components. The spatial range is 1 × 1 × 2 m with a precision of ±0.02 mm. The bench and its controls are lightweight and easy to transport. The head can penetrate through small apertures and measure as close as 0.5 mm from the surface of a magnet. The bench can scan complicated grids in Cartesian or cylindrical coordinates, steered by a simple text file on an accompanying PC. The raw data are converted online to magnetic units and stored in a text file.

  14. Ultrathin conformal devices for precise and continuous thermal characterization of human skin

    Science.gov (United States)

    Webb, R. Chad; Bonifas, Andrew P.; Behnaz, Alex; Zhang, Yihui; Yu, Ki Jun; Cheng, Huanyu; Shi, Mingxing; Bian, Zuguang; Liu, Zhuangjian; Kim, Yun-Soung; Yeo, Woon-Hong; Park, Jae Suk; Song, Jizhou; Li, Yuhang; Huang, Yonggang; Gorbach, Alexander M.; Rogers, John A.

    2013-10-01

    Precision thermometry of the skin can, together with other measurements, provide clinically relevant information about cardiovascular health, cognitive state, malignancy and many other important aspects of human physiology. Here, we introduce an ultrathin, compliant skin-like sensor/actuator technology that can pliably laminate onto the epidermis to provide continuous, accurate thermal characterizations that are unavailable with other methods. Examples include non-invasive spatial mapping of skin temperature with millikelvin precision, and simultaneous quantitative assessment of tissue thermal conductivity. Such devices can also be implemented in ways that reveal the time-dynamic influence of blood flow and perfusion on these properties. Experimental and theoretical studies establish the underlying principles of operation, and define engineering guidelines for device design. Evaluation of subtle variations in skin temperature associated with mental activity, physical stimulation and vasoconstriction/dilation along with accurate determination of skin hydration through measurements of thermal conductivity represent some important operational examples.

  15. Precision Tiltmeter as a Reference for Slope Measuring Instruments

    Energy Technology Data Exchange (ETDEWEB)

    Kirschman, Jonathan L.; Domning, Edward E.; Morrison, Gregory Y.; Smith, Brian V.; Yashchuk, Valeriy V.

    2007-08-01

    The next generation of synchrotrons and free electron lasers require extremely high-performance x-ray optical systems for proper focusing. The necessary optics cannot be fabricated without the use of precise optical metrology instrumentation. In particular, the Long Trace Profiler (LTP) based on the pencil-beam interferometer is a valuable tool for low-spatial-frequency slope measurement with x-ray optics. The limitations of such a device are set by the level of systematic errors and noise. A significant improvement of LTP performance was the addition of an optical reference channel, which made it possible to partially account for systematic errors associated with wiggling and wobbling of the LTP carriage. However, the optical reference is affected by changing optical path length, non-homogeneous optics, and air turbulence. In the present work, we experimentally investigate the questions related to the use of a precision tiltmeter as a reference channel. Dependence of the tiltmeter performance on horizontal acceleration, temperature drift, motion regime, and kinematical scheme of the translation stage has been investigated. It is shown that, with an appropriate experimental arrangement, the tiltmeter provides a slope reference for the LTP system with accuracy on the level of 0.1 µrad (rms).

  16. Precision Tiltmeter as a Reference for Slope Measuring Instruments

    International Nuclear Information System (INIS)

    Kirschman, Jonathan L.; Domning, Edward E.; Morrison, Gregory Y.; Smith, Brian V.; Yashchuk, Valeriy V.

    2007-01-01

    The next generation of synchrotrons and free electron lasers require extremely high-performance x-ray optical systems for proper focusing. The necessary optics cannot be fabricated without the use of precise optical metrology instrumentation. In particular, the Long Trace Profiler (LTP) based on the pencil-beam interferometer is a valuable tool for low-spatial-frequency slope measurement with x-ray optics. The limitations of such a device are set by the level of systematic errors and noise. A significant improvement of LTP performance was the addition of an optical reference channel, which made it possible to partially account for systematic errors associated with wiggling and wobbling of the LTP carriage. However, the optical reference is affected by changing optical path length, non-homogeneous optics, and air turbulence. In the present work, we experimentally investigate the questions related to the use of a precision tiltmeter as a reference channel. Dependence of the tiltmeter performance on horizontal acceleration, temperature drift, motion regime, and kinematical scheme of the translation stage has been investigated. It is shown that, with an appropriate experimental arrangement, the tiltmeter provides a slope reference for the LTP system with accuracy on the level of 0.1 µrad (rms).

  17. Imprecision in waggle dances of the honeybee (Apis mellifera) for nearby food sources: error or adaptation?

    OpenAIRE

    Weidenmüller, Anja; Seeley, Thomas

    1999-01-01

    A curious feature of the honeybee's waggle dance is the imprecision in the direction indication for nearby food sources. One hypothesis for the function of this imprecision is that it serves to spread recruits over a certain area and thus is an adaptation to the typical spatial configuration of the bees' food sources, i.e., flowers in sizable patches. We report an experiment that tests this tuned-error hypothesis. We measured the precision of direction indication in waggle dances advertising ...

  18. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Gang Li

    2016-09-01

    Full Text Available The spatial–temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most of the existing works based on the spatial–temporal correlation can be divided into two parts: redundancy reduction and anomaly detection. These two parts are pursued separately in existing works. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS can reduce data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long and linear cluster structure in the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring for analyzing the situation of every ring, TDSS is implemented in a cooperative way in the cluster. To maintain the precision of sensor data, spatial data-driven anomaly detection based on the spatial correlation and the Kriging method is used to generate an anomaly indicator. The experimental results show that cooperative TDSS can realize non-uniform sensing effectively to reduce the energy consumption. In addition, spatial data-driven anomaly detection is quite significant for maintaining and improving the precision of sensor data.
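The idea of a spatial anomaly indicator (a sensor's reading compared against a prediction from its neighbours) can be illustrated with a minimal sketch. For simplicity, this uses inverse-distance weighting as a stand-in for the Kriging interpolation described in the abstract; the function name and data are hypothetical:

```python
import math

def idw_anomaly(coords, values, target_idx, power=2.0):
    """Anomaly indicator for one sensor: predict its reading from all
    other sensors by inverse-distance weighting (a simplified stand-in
    for Kriging) and return the absolute residual between the actual
    measurement and the spatial prediction."""
    tx, ty = coords[target_idx]
    num = den = 0.0
    for i, ((x, y), v) in enumerate(zip(coords, values)):
        if i == target_idx:
            continue
        d = math.hypot(x - tx, y - ty)
        w = 1.0 / (d ** power)  # closer sensors weigh more
        num += w * v
        den += w
    predicted = num / den
    return abs(values[target_idx] - predicted)
```

A large residual relative to the other sensors flags a spatially inconsistent reading; full Kriging additionally models the spatial covariance structure instead of assuming a fixed distance decay.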

  19. Planning of spatial development of tourism based on the example of spatial plan of Subotica municipality

    Directory of Open Access Journals (Sweden)

    Šećerov Velimir

    2008-01-01

    Full Text Available Planning of tourism development and its spatial disposition is today, in Europe and worldwide, an important segment of overall economic development. Bearing in mind its important economic and social functions, as well as its capacity to intensify other economic branches (agricultural and economic potentials, services of various types, transport and others) in a given territory, it is necessary to carry out a precise valorization of tourist values in the municipal spatial plan and to determine at what point and in which places tourism can represent one of the development components of the entire economy of the territory being planned. The example of the spatial plan of Subotica and the main guidelines, concept and planning priorities that can be expected in the forthcoming period are presented in this paper. Without any doubt, the municipality of Subotica, with its geostrategic position, proximity to the EU and important natural and cultural tourist potentials, is a suitable space for the application of contemporary principles of tourism development planning and their correlation with other segments of integral development for the whole municipality.

  20. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    OpenAIRE

    Temel, Tugrul T.

    2001-01-01

    This paper adapts an already existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstraped version of the test allows to approximate errors involved in the asymptotic hypothesis test. The paper also develops a Mathematica Code for the test algorithm.
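The bootstrap principle the abstract relies on, resampling to approximate the distribution of a test statistic under the null, can be sketched minimally. The example below tests a simple difference in means rather than the paper's kernel-regression distance measure, and all names are illustrative:

```python
import random

def bootstrap_mean_diff_test(x, y, n_boot=5000, seed=0):
    """Two-sample bootstrap test for a difference in means. Under H0
    the two groups share a common distribution, so we resample from
    the pooled data and count how often the bootstrap statistic is at
    least as extreme as the observed one (two-sided p-value)."""
    rng = random.Random(seed)
    observed = sum(x) / len(x) - sum(y) / len(y)
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_boot):
        sample = [rng.choice(pooled) for _ in pooled]  # resample under H0
        bx, by = sample[:len(x)], sample[len(x):]
        stat = sum(bx) / len(bx) - sum(by) / len(by)
        if abs(stat) >= abs(observed):
            count += 1
    return (count + 1) / (n_boot + 1)  # add-one correction avoids p = 0
```

The returned p-value approximates the error probabilities that an asymptotic test would only characterize in the limit, which is the motivation the abstract gives for bootstrapping.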

  1. Fabrication of high precision metallic freeform mirrors with magnetorheological finishing (MRF)

    Science.gov (United States)

    Beier, Matthias; Scheiding, Sebastian; Gebhardt, Andreas; Loose, Roman; Risse, Stefan; Eberhardt, Ramona; Tünnermann, Andreas

    2013-09-01

    The fabrication of complex-shaped metal mirrors for optical imaging is a classical application area of diamond machining techniques. Aspherical and freeform-shaped optical components up to several 100 mm in diameter can be manufactured with high precision in an acceptable amount of time. However, applications are naturally limited to the infrared spectral region due to scatter losses at shorter wavelengths caused by the remaining periodic diamond-turning structure. Achieving diffraction-limited performance in the visible spectrum demands the application of additional polishing steps. Magnetorheological Finishing (MRF) is a powerful tool to improve both figure and finish of complex-shaped optics in a single processing step. The application of MRF as a figuring tool for precise metal mirrors is a nontrivial task, since the technology was primarily developed for figuring and finishing a variety of other optical materials, such as glasses or glass ceramics. In the presented work, MRF is used as a figuring tool for diamond-turned aluminum lightweight mirrors with electroless nickel plating. It is applied as a direct follow-up process after diamond machining of the mirrors. A high precision measurement setup, composed of an interferometer and an advanced Computer Generated Hologram with additional alignment features, allows for precise metrology of the freeform-shaped optics in short measuring cycles. Shape deviations of less than 150 nm PV / 20 nm rms are achieved reliably for freeform mirrors with apertures of more than 300 mm. Characterization of removable and induced spatial frequencies is carried out by investigating the Power Spectral Density.

  2. An automatic high precision registration method between large area aerial images and aerial light detection and ranging data

    Science.gov (United States)

    Du, Q.; Xie, D.; Sun, Y.

    2015-06-01

    The integration of digital aerial photogrammetry and Light Detection And Ranging (LiDAR) is an inevitable trend in the Surveying and Mapping field. We calculate the exterior orientation elements of images in the LiDAR coordinate frame to realize automatic high-precision registration between aerial images and LiDAR data. There are two ways to calculate the orientation elements. One is single-image spatial resection using image-matched 3D points that are registered to LiDAR. The other is Position and Orientation System (POS) data-supported aerotriangulation, in which the high-precision registration points are selected as Ground Control Points (GCPs) instead of measuring GCPs manually. The registration experiments indicate that this method of registering aerial images and LiDAR points offers higher automation and precision compared with manual registration.

  3. Spatial prevalence of intellectual disability and related socio-demographic factors in Iran, using GWR: Case study (2006)

    Directory of Open Access Journals (Sweden)

    Ali Goli

    2014-01-01

    Full Text Available Background: Although intellectual disability (ID) is a common disability in Iran, there has been no investigation of the spatial distribution pattern of these patients at the national level, and spatial maps for recognizing areas with a higher prevalence of ID, the local neighborhoods of these regions, or the effect of socio-demographic factors on this distribution are still not available. This proposition motivated us to assess the population with ID in our country. Methods: In a cross-sectional study, we applied Moran's Index (Moran's I), which includes information about the strength of the neighboring association between counties, as a global univariate distribution assessment. A geographically weighted regression was used to explore the relation between ID prevalence and some socio-demographic factors (migration and illiteracy rates, physician number (PN)/10,000 people and health-care centers (HCCs)/10,000 people). Results: We found that spatial clusters of ID patients exist among Iran counties (Moran's I = 0.36, P 0.3. Conclusions: According to the results, our initial hypothesis about the existence of spatial clusters in the distribution of people with ID in Iran was proven. Spatial autocorrelation between migration and illiteracy rates and the prevalence of patients with ID was shown, in agreement with our hypothesis. However, our supposition that prevalence should have an inverse relationship with PN and HCC was rejected.
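Global Moran's I, the clustering statistic used in the abstract above, is straightforward to compute from a value vector and a spatial weight matrix. A minimal sketch with binary contiguity weights (the data are illustrative, not the study's):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` given a
    symmetric weight matrix (1 = neighbours, 0 = not). Values near +1
    indicate clustering of similar values among neighbouring units,
    values near 0 indicate spatial randomness, negative values
    indicate a checkerboard-like pattern."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = len(x)
    z = x - x.mean()                      # deviations from the mean
    num = (w * np.outer(z, z)).sum()      # weighted cross-products
    den = (z ** 2).sum()
    return n / w.sum() * num / den
```

For a chain of four counties with values [1, 1, 10, 10] the index is positive (similar neighbours cluster), while the alternating pattern [1, 10, 1, 10] gives a negative index. Significance is then usually assessed by permutation or a normal approximation.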

  4. The thrifty phenotype hypothesis revisited

    DEFF Research Database (Denmark)

    Vaag, A A; Grunnet, L G; Arora, G P

    2012-01-01

    Twenty years ago, Hales and Barker along with their co-workers published some of their pioneering papers proposing the 'thrifty phenotype hypothesis' in Diabetologia (4;35:595-601 and 3;36:62-67). Their postulate that fetal programming could represent an important player in the origin of type 2...... of the underlying molecular mechanisms. Type 2 diabetes is a multiple-organ disease, and developmental programming, with its idea of organ plasticity, is a plausible hypothesis for a common basis for the widespread organ dysfunctions in type 2 diabetes and the metabolic syndrome. Only two among the 45 known type 2...

  5. Disentangling the Role of the MEC and LEC in the Processing of Spatial and Non-Spatial Information: Contribution of Lesion Studies

    Directory of Open Access Journals (Sweden)

    Etienne Save

    2017-10-01

    Full Text Available It is now widely accepted that the entorhinal cortex (EC) plays a pivotal role in the processing of spatial information and episodic memory. The EC is segregated into two sub-regions, the medial EC (MEC) and the lateral EC (LEC), but a comprehensive understanding of their roles across multiple behavioral contexts remains unclear. Considering that it is still useful to investigate the impact of lesions of the EC on behavior, we review the contribution of the lesion approach to our knowledge of EC functions. We show that the MEC and LEC play different roles in the processing of spatial and non-spatial information. The MEC is necessary for the use of distal but not proximal landmarks during navigation and is crucial for path integration, in particular integration of linear movements. Consistent with the predominant hypothesis, the LEC is important for combining the spatial and non-spatial aspects of the environment. However, object exploration studies suggest that the functional segregation between the MEC and the LEC is not as clearly delineated and is dependent on environmental and behavioral factors. Manipulation of environmental complexity and therefore of cognitive demand shows that the MEC and the LEC are not strictly necessary for the processing of spatial and non-spatial information. In addition, we suggest that the involvement of these sub-regions can depend on the kind of behavior, i.e., navigation or exploration, exhibited by the animals. Thus, the MEC and the LEC work in a flexible manner to integrate the “what” and “where” information in episodic memory upstream of the hippocampus.

  6. Spatial cluster detection using dynamic programming

    Directory of Open Access Journals (Sweden)

    Sverchkov Yuriy

    2012-03-01

    Full Text Available Abstract Background The task of spatial cluster detection involves finding spatial regions where some property deviates from the norm or the expected value. In a probabilistic setting this task can be expressed as finding a region where some event is significantly more likely than usual. Spatial cluster detection is of interest in fields such as biosurveillance, mining of astronomical data, military surveillance, and analysis of fMRI images. In almost all such applications we are interested both in the question of whether a cluster exists in the data, and if it exists, we are interested in finding the most accurate characterization of the cluster. Methods We present a general dynamic programming algorithm for grid-based spatial cluster detection. The algorithm can be used for both Bayesian maximum a-posteriori (MAP estimation of the most likely spatial distribution of clusters and Bayesian model averaging over a large space of spatial cluster distributions to compute the posterior probability of an unusual spatial clustering. The algorithm is explained and evaluated in the context of a biosurveillance application, specifically the detection and identification of Influenza outbreaks based on emergency department visits. A relatively simple underlying model is constructed for the purpose of evaluating the algorithm, and the algorithm is evaluated using the model and semi-synthetic test data. Results When compared to baseline methods, tests indicate that the new algorithm can improve MAP estimates under certain conditions: the greedy algorithm we compared our method to was found to be more sensitive to smaller outbreaks, while as the size of the outbreaks increases, in terms of area affected and proportion of individuals affected, our method overtakes the greedy algorithm in spatial precision and recall. The new algorithm performs on-par with baseline methods in the task of Bayesian model averaging. Conclusions We conclude that the dynamic
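The core of grid-based spatial cluster detection, scoring candidate regions by how strongly their counts deviate from expectation, can be sketched with a brute-force Kulldorff-style Poisson scan over axis-aligned rectangles. This is a simplified baseline, not the paper's dynamic programming or Bayesian model averaging algorithm, and all names are illustrative:

```python
import math
import itertools

def poisson_scan(counts, baselines):
    """Brute-force spatial scan over all axis-aligned rectangles of a
    grid: returns the rectangle (r1, c1, r2, c2) with the highest
    Poisson likelihood-ratio score, i.e. the most anomalous cluster
    relative to its expected (baseline) counts."""
    R, C = len(counts), len(counts[0])
    total_c = sum(map(sum, counts))
    total_b = sum(map(sum, baselines))
    best, best_score = None, 0.0
    for r1, r2 in itertools.combinations_with_replacement(range(R), 2):
        for c1, c2 in itertools.combinations_with_replacement(range(C), 2):
            c = sum(counts[i][j] for i in range(r1, r2 + 1)
                    for j in range(c1, c2 + 1))
            b = sum(baselines[i][j] for i in range(r1, r2 + 1)
                    for j in range(c1, c2 + 1))
            if b == 0 or c <= b:          # only over-dense regions
                continue
            cout, bout = total_c - c, total_b - b
            score = c * math.log(c / b)   # log-likelihood ratio terms
            if cout > 0 and bout > 0:
                score += cout * math.log(cout / bout)
            score -= total_c * math.log(total_c / total_b)
            if score > best_score:
                best_score, best = score, (r1, c1, r2, c2)
    return best, best_score
```

Enumerating all rectangles costs O(R²C²) region sums; the dynamic programming formulation in the paper exists precisely to avoid this kind of exhaustive, per-region recomputation while also supporting model averaging.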

  7. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
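The kind of frequentist evaluation described above, asking how often a Bayes factor crosses an evidence threshold when the null is true, can be sketched by simulation. The example below uses a BIC approximation to the Bayes factor for a one-sample design rather than the default Bayes factors studied in the paper; all names, thresholds and parameter values are illustrative:

```python
import math
import random
import statistics

def bic_bayes_factor_onesample(x):
    """Approximate BF10 via the BIC approximation: compares H1 (free
    mean) against H0 (mean fixed at 0) for a one-sample design.
    BIC = n*log(sigma_hat^2) + k*log(n), shared constants cancel."""
    n = len(x)
    var1 = statistics.pvariance(x)        # MLE variance under H1
    var0 = sum(v * v for v in x) / n      # MLE variance with mean = 0
    bic1 = n * math.log(var1) + 2 * math.log(n)  # 2 free parameters
    bic0 = n * math.log(var0) + 1 * math.log(n)  # 1 free parameter
    return math.exp((bic0 - bic1) / 2)

def type1_error_rate(n=30, sims=2000, threshold=3.0, seed=1):
    """Monte-Carlo estimate of the classical type I error rate: how
    often BF10 exceeds `threshold` when the null (mean 0) is true."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        x = [rng.gauss(0.0, 1.0) for _ in range(n)]
        if bic_bayes_factor_onesample(x) > threshold:
            hits += 1
    return hits / sims
```

Because the BIC penalty grows with n, the implied type I error rate of a fixed Bayes-factor threshold shrinks as the sample size grows, which is one of the behaviors such error-probability studies quantify.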

  8. Aminoglycoside antibiotics and autism: a speculative hypothesis

    Directory of Open Access Journals (Sweden)

    Manev Hari

    2001-10-01

    Full Text Available Abstract Background Recently, it has been suspected that there is a relationship between therapy with some antibiotics and the onset of autism; but even more curious, some children benefited transiently from a subsequent treatment with a different antibiotic. Here, we speculate how aminoglycoside antibiotics might be associated with autism. Presentation We hypothesize that aminoglycoside antibiotics could (a) trigger the autism syndrome in susceptible infants by causing stop codon readthrough, i.e., a misreading of the genetic code of a hypothetical critical gene, and/or (b) improve autism symptoms by correcting a premature stop codon mutation in a hypothetical polymorphic gene linked to autism. Testing Investigate, retrospectively, whether a link exists between aminoglycoside use (which is not extensive in children) and the onset of autism symptoms (hypothesis "a"), or between aminoglycoside use and improvement of these symptoms (hypothesis "b"). Whereas a prospective study to test hypothesis "a" is not ethically justifiable, a study could be designed to test hypothesis "b". Implications It should be stressed that at this stage no direct evidence supports our speculative hypothesis and that its main purpose is to initiate the development of new ideas that, eventually, would improve our understanding of the pathobiology of autism.

  9. Biostatistics series module 2: Overview of hypothesis testing

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2016-01-01

    Full Text Available Hypothesis testing (or statistical inference) is one of the major applications of biostatistics. Much of medical research begins with a research question that can be framed as a hypothesis. Inferential statistics begins with a null hypothesis that reflects the conservative position of no change or no difference in comparison to baseline or between groups. Usually, the researcher has reason to believe that there is some effect or some difference, which is the alternative hypothesis. The researcher therefore proceeds to study samples and measure outcomes in the hope of generating evidence strong enough for the statistician to be able to reject the null hypothesis. The concept of the P value is almost universally used in hypothesis testing. It denotes the probability of obtaining by chance a result at least as extreme as that observed, even when the null hypothesis is true and no real difference exists. Usually, if P is < 0.05 the null hypothesis is rejected and sample results are deemed statistically significant. With the increasing availability of computers and access to specialized statistical software, the drudgery involved in statistical calculations is now a thing of the past, once the learning curve of the software has been traversed. The life sciences researcher is therefore free to devote themselves to optimally designing the study, carefully selecting the hypothesis tests to be applied, and taking care in conducting the study well. Unfortunately, selecting the right test seems difficult initially. Thinking of the research hypothesis as addressing one of five generic research questions helps in selection of the right hypothesis test. In addition, it is important to be clear about the nature of the variables (e.g., numerical vs. categorical; parametric vs. nonparametric) and the number of groups or data sets being compared (e.g., two or more than two) at a time. The same research question may be explored by more than one type of hypothesis test
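The definition of the P value given above, the probability under the null hypothesis of a result at least as extreme as the one observed, can be made concrete with an exact binomial test. The coin-bias example below is hypothetical and not tied to any study in this list:

```python
from math import comb

def binomial_two_sided_p(k, n, p0=0.5):
    """Exact two-sided binomial P value: the total probability, under
    the null hypothesis (success probability p0), of every outcome
    whose point probability is no larger than that of the observed
    count k (the 'at least as extreme' outcomes)."""
    pmf = [comb(n, i) * p0 ** i * (1 - p0) ** (n - i) for i in range(n + 1)]
    return sum(p for p in pmf if p <= pmf[k] + 1e-12)
```

Observing 9 heads in 10 tosses of a putatively fair coin gives P = 22/1024 ≈ 0.021, so at the conventional 0.05 threshold the null hypothesis of fairness would be rejected, whereas 5 heads in 10 gives P = 1 and provides no evidence against it.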

  10. Rapid subsidence in damaging sinkholes: Measurement by high-precision leveling and the role of salt dissolution

    Science.gov (United States)

    Desir, G.; Gutiérrez, F.; Merino, J.; Carbonel, D.; Benito-Calvo, A.; Guerrero, J.; Fabregat, I.

    2018-02-01

    Investigations dealing with subsidence monitoring in active sinkholes are very scarce, especially when compared with other ground instability phenomena like landslides. This is largely related to the catastrophic behaviour that typifies most sinkholes in carbonate karst areas. Active subsidence in five sinkholes up to ca. 500 m across has been quantitatively characterised by means of high-precision differential leveling. The sinkholes occur on poorly indurated alluvium underlain by salt-bearing evaporites and cause severe damage on various human structures. The leveling data have provided accurate information on multiple features of the subsidence phenomena with practical implications: (1) precise location of the vaguely-defined edges of the subsidence zones and their spatial relationships with surveyed surface deformation features; (2) spatial deformation patterns and relative contribution of subsidence mechanisms (sagging versus collapse); (3) accurate subsidence rates and their spatial variability with maximum and mean vertical displacement rates ranging from 1.0 to 11.8 cm/yr and 1.9 to 26.1 cm/yr, respectively; (4) identification of sinkholes that experience continuous subsidence at constant rates or with significant temporal changes; and (5) rates of volumetric surface changes as an approximation to rates of dissolution-induced volumetric depletion in the subsurface, reaching as much as 10,900 m³/yr in the largest sinkhole. The high subsidence rates as well as the annual volumetric changes are attributed to rapid dissolution of high-solubility salts.

  11. [Precision and personalized medicine].

    Science.gov (United States)

    Sipka, Sándor

    2016-10-01

    The author describes the concept of "personalized medicine" and the newly introduced "precision medicine". "Precision medicine" applies the terms of "phenotype", "endotype" and "biomarker" in order to characterize more precisely the various diseases. Using "biomarkers" the homogeneous type of a disease (a "phenotype") can be divided into subgroups called "endotypes" requiring different forms of treatment and financing. The good results of "precision medicine" have become especially apparent in relation with allergic and autoimmune diseases. The application of this new way of thinking is going to be necessary in Hungary, too, in the near future for participants, controllers and financing boards of healthcare. Orv. Hetil., 2016, 157(44), 1739-1741.

  12. The newest precision measurement

    International Nuclear Information System (INIS)

    Lee, Jing Gu; Lee, Jong Dae

    1974-05-01

    This book introduces the basics of precision measurement: measurement of length, limit gauges, measurement of angles, measurement of surface roughness, measurement of shapes and locations, measurement of outlines, measurement of external and internal threads, gear testing, accuracy inspection of machine tools, three-dimensional coordinate measuring machines, digitalisation of precision measurement, automation of precision measurement, measurement of cutting tools, measurement using lasers, and points to consider when choosing a length measuring instrument.

  13. Mastery Learning and the Decreasing Variability Hypothesis.

    Science.gov (United States)

    Livingston, Jennifer A.; Gentile, J. Ronald

    1996-01-01

    This report results from studies that tested two variations of Bloom's decreasing variability hypothesis using performance on successive units of achievement in four graduate classrooms that used mastery learning procedures. Data do not support the decreasing variability hypothesis; rather, they show no change over time. (SM)

  14. The linear hypothesis and radiation carcinogenesis

    International Nuclear Information System (INIS)

    Roberts, P.B.

    1981-10-01

    An assumption central to most estimations of the carcinogenic potential of low levels of ionising radiation is that the risk always increases in direct proportion to the dose received. This assumption (the linear hypothesis) has been both strongly defended and attacked on several counts. It appears unlikely that conclusive, direct evidence on the validity of the hypothesis will be forthcoming. We review the major indirect arguments used in the debate. All of them are subject to objections that can seriously weaken their case. In the present situation, retention of the linear hypothesis as the basis of extrapolations from high to low dose levels can lead to excessive fears, over-regulation and unnecessarily expensive protection measures. To offset these possibilities, support is given to suggestions urging a cut-off dose, probably some fraction of natural background, below which risks can be deemed acceptable.
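
    The two positions under debate can be stated as dose-response functions. The risk coefficient and cut-off dose below are invented placeholders, chosen only to make the contrast concrete.

    ```python
    # Illustrative coefficients only; not actual radiological risk figures.
    RISK_PER_SV = 0.05    # hypothetical excess risk per sievert
    CUTOFF_SV = 0.001     # hypothetical cut-off dose (a fraction of background)

    def linear_risk(dose_sv):
        """Linear hypothesis: risk is always proportional to dose."""
        return RISK_PER_SV * dose_sv

    def cutoff_risk(dose_sv):
        """Cut-off variant: doses below the cut-off are deemed acceptable."""
        return 0.0 if dose_sv < CUTOFF_SV else RISK_PER_SV * dose_sv
    ```

    Above the cut-off the two models agree; the policy difference lies entirely in how sub-cut-off doses are treated.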

  15. Precision Agriculture Technologies Positively Contributing to GHG Emissions Mitigation, Farm Productivity and Economics

    Directory of Open Access Journals (Sweden)

    Athanasios Balafoutis

    2017-07-01

    Full Text Available Agriculture is one of the economic sectors that affect climate change, contributing to greenhouse gas emissions directly and indirectly. There is a trend towards reducing agricultural greenhouse gas emissions, but any practice in this direction should not negatively affect farm productivity and economics, because this would limit its implementation, due to the high global food and feed demand and the competitive environment in this sector. Precision agriculture practices using high-tech equipment have the ability to reduce agricultural inputs through site-specific applications, as they better target inputs to the spatial and temporal needs of fields, which can result in lower greenhouse gas emissions. Precision agriculture can also have a positive impact on farm productivity and economics, as it provides higher or equal yields at lower production cost than conventional practices. In this work, precision agriculture technologies that have the potential to mitigate greenhouse gas emissions are presented, providing a short description of each technology and the impacts that have been reported in the literature on greenhouse gas reduction and the associated impacts on farm productivity and economics. The technologies presented span all agricultural practices, including variable rate sowing/planting, fertilizing, spraying, weeding and irrigation.

  16. Spatial and temporal clustering of dengue virus transmission in Thai villages.

    Science.gov (United States)

    Mammen, Mammen P; Pimgate, Chusak; Koenraadt, Constantianus J M; Rothman, Alan L; Aldstadt, Jared; Nisalak, Ananda; Jarman, Richard G; Jones, James W; Srikiatkhachorn, Anon; Ypil-Butac, Charity Ann; Getis, Arthur; Thammapalo, Suwich; Morrison, Amy C; Libraty, Daniel H; Green, Sharone; Scott, Thomas W

    2008-11-04

    Transmission of dengue viruses (DENV), the leading cause of arboviral disease worldwide, is known to vary through time and space, likely owing to a combination of factors related to the human host, virus, mosquito vector, and environment. An improved understanding of variation in transmission patterns is fundamental to conducting surveillance and implementing disease prevention strategies. To test the hypothesis that DENV transmission is spatially and temporally focal, we compared geographic and temporal characteristics within Thai villages where DENV are and are not being actively transmitted. Cluster investigations were conducted within 100 m of homes where febrile index children with (positive clusters) and without (negative clusters) acute dengue lived during two seasons of peak DENV transmission. Data on human infection and mosquito infection/density were examined to precisely (1) define the spatial and temporal dimensions of DENV transmission, (2) correlate these factors with variation in DENV transmission, and (3) determine the burden of inapparent and symptomatic infections. Among 556 village children enrolled as neighbors of 12 dengue-positive and 22 dengue-negative index cases, all 27 DENV infections (4.9% of enrollees) occurred in positive clusters. Cluster types differed in the availability of piped water, which was greater in negative clusters (p < 0.01), and in the number of Ae. aegypti pupae per person, which was greater in positive clusters (p = 0.04). During primarily DENV-4 transmission seasons, the ratio of inapparent to symptomatic infections was nearly 1:1 among child enrollees. Study limitations included the inability to sample all children and mosquitoes within each cluster and our reliance on serologic rather than virologic evidence of interval infections in enrollees, given restrictions on the frequency of blood collections in children. Our data reveal the remarkably focal nature of DENV transmission within a hyperendemic rural area of Thailand.
These data suggest that active school-based dengue case detection

  17. A robust null hypothesis for the potential causes of megadrought in western North America

    Science.gov (United States)

    Ault, T.; St George, S.; Smerdon, J. E.; Coats, S.; Mankin, J. S.; Cruz, C. C.; Cook, B.; Stevenson, S.

    2017-12-01

    The western United States was affected by several megadroughts during the last 1200 years, most prominently during the Medieval Climate Anomaly (MCA: 800 to 1300 CE). A null hypothesis is developed to test the possibility that, given a sufficiently long period of time, these events are inevitable and occur purely as a consequence of internal climate variability. The null distribution of this hypothesis is populated by a linear inverse model (LIM) constructed from global sea-surface temperature anomalies and self-calibrated Palmer Drought Severity Index data for North America. Despite being trained only on seasonal data from the late 20th century, the LIM produces megadroughts that are comparable in their duration, spatial scale, and magnitude to the most severe events of the last 12 centuries. The null hypothesis therefore cannot be rejected with much confidence when considering these features of megadrought, meaning that similar events are possible today, even without any changes to boundary conditions. In contrast, the observed clustering of megadroughts in the MCA, as well as the change in mean hydroclimate between the MCA and the 1500-2000 period, are more likely to have been caused by either external forcing or by internal climate variability not well sampled during the latter half of the twentieth century. Finally, the results demonstrate that the LIM is a viable tool for determining whether events in paleoclimate reconstructions should be ascribed to external forcings or "out of sample" climate mechanisms, or whether they are consistent with the variability observed during the recent period.
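
    The null-model idea can be caricatured in one dimension: a damped linear system driven by noise (an AR(1) process, the scalar cousin of a linear inverse model) produces long dry spells with no external forcing at all. The parameters below are illustrative and not fit to any climate data.

    ```python
    import random

    # Toy AR(1) "climate": x[t+1] = damping * x[t] + noise.
    random.seed(1)
    damping = 0.9          # hypothetical lag-1 autocorrelation, |damping| < 1
    n_years = 5000

    x = 0.0
    series = []
    for _ in range(n_years):
        x = damping * x + random.gauss(0.0, 1.0)
        series.append(x)

    # Long unbroken runs below zero play the role of "megadroughts": they
    # arise purely from internal variability in this unforced model.
    longest_run = current = 0
    for value in series:
        current = current + 1 if value < 0 else 0
        longest_run = max(longest_run, current)
    ```

    Comparing observed event durations against the run-length distribution of such a null model is the essence of the test described above, though the actual LIM is multivariate and trained on observations.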

  18. Is PMI the Hypothesis or the Null Hypothesis?

    Science.gov (United States)

    Tarone, Aaron M; Sanford, Michelle R

    2017-09-01

    Over the past several decades, there have been several strident exchanges regarding whether forensic entomologists estimate the postmortem interval (PMI), minimum PMI, or something else. During that time, there has been a proliferation of terminology reflecting this concern regarding "what we do." This has been a frustrating conversation for some in the community because much of this debate appears to be centered on what assumptions are acknowledged directly and which are embedded within a list of assumptions (or ignored altogether) in the literature and in case reports. An additional component of the conversation centers on a concern that moving away from the use of certain terminology like PMI acknowledges limitations and problems that would make the application of entomology appear less useful in court-a problem for lawyers, but one that should not be problematic for scientists in the forensic entomology community, as uncertainty is part of science that should and can be presented effectively in the courtroom (e.g., population genetic concepts in forensics). Unfortunately, a consequence of the way this conversation is conducted is that even as all involved in the debate acknowledge the concerns of their colleagues, parties continue to talk past one another advocating their preferred terminology. Progress will not be made until the community recognizes that all of the terms under consideration take the form of null hypothesis statements and that thinking about "what we do" as a null hypothesis has useful legal and scientific ramifications that transcend arguments over the usage of preferred terminology. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. The Lehman Sisters Hypothesis

    NARCIS (Netherlands)

    I.P. van Staveren (Irene)

    2014-01-01

    This article explores the Lehman Sisters Hypothesis. It reviews empirical literature about gender differences in behavioral, experimental, and neuro-economics as well as in other fields of behavioral research. It discusses gender differences along three dimensions of

  20. High-precision half-life measurements of the T =1 /2 mirror β decays 17F and 33Cl

    Science.gov (United States)

    Grinyer, J.; Grinyer, G. F.; Babo, M.; Bouzomita, H.; Chauveau, P.; Delahaye, P.; Dubois, M.; Frigot, R.; Jardin, P.; Leboucher, C.; Maunoury, L.; Seiffert, C.; Thomas, J. C.; Traykov, E.

    2015-10-01

    Background: Measurements of the ft values for T = 1/2 mirror β+ decays offer a method to test the conserved vector current hypothesis and to determine Vud, the up-down matrix element of the Cabibbo-Kobayashi-Maskawa matrix. In most mirror decays used for these tests, uncertainties in the ft values are dominated by the uncertainties in the half-lives. Purpose: Two precision half-life measurements were performed for the T = 1/2 β+ emitters 17F and 33Cl, in order to eliminate the half-life as the leading source of uncertainty in their ft values. Method: Half-lives of 17F and 33Cl were determined using β counting of implanted radioactive ion beam samples on a moving tape transport system at the Système de Production d'Ions Radioactifs Accélérés en Ligne low-energy identification station at the Grand Accélérateur National d'Ions Lourds. Results: The 17F half-life result, 64.347(35) s, precise to ±0.05%, is a factor of 5 more precise than the previous world average. The half-life of 33Cl was determined to be 2.5038(22) s. The current precision of ±0.09% is nearly 2 times better than the previous world average. Conclusions: The precision achieved in the present measurements implies that the half-life no longer dominates the uncertainty of the ft values for both T = 1/2 mirror decays, 17F and 33Cl.
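
    The core of such a measurement is recovering a decay constant from count rates. The sketch below uses idealized, noise-free data built from the 17F half-life quoted in the abstract; a real analysis would include background subtraction, dead-time corrections, and uncertainty propagation not modeled here.

    ```python
    import math

    # "True" half-life taken from the abstract; count rates are synthetic.
    TRUE_HALF_LIFE_S = 64.347
    decay_const = math.log(2) / TRUE_HALF_LIFE_S

    times = [10.0 * i for i in range(20)]                       # seconds
    rates = [1000.0 * math.exp(-decay_const * t) for t in times]

    # ln(rate) = ln(R0) - lambda * t, so an ordinary least-squares line
    # through (t, ln rate) recovers the decay constant as -slope.
    n = len(times)
    sx = sum(times)
    sy = sum(math.log(r) for r in rates)
    sxx = sum(t * t for t in times)
    sxy = sum(t * math.log(r) for t, r in zip(times, rates))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)

    fitted_half_life = -math.log(2) / slope
    ```

    With noise-free input the fit returns the input half-life to floating-point precision; the experimental challenge is entirely in controlling the systematic effects.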

  1. Whiplash and the compensation hypothesis.

    Science.gov (United States)

    Spearing, Natalie M; Connelly, Luke B

    2011-12-01

    Review article. To explain why the evidence that compensation-related factors lead to worse health outcomes is not compelling, either in general, or in the specific case of whiplash. There is a common view that compensation-related factors lead to worse health outcomes ("the compensation hypothesis"), despite the presence of important, and unresolved sources of bias. The empirical evidence on this question has ramifications for the design of compensation schemes. Using studies on whiplash, this article outlines the methodological problems that impede attempts to confirm or refute the compensation hypothesis. Compensation studies are prone to measurement bias, reverse causation bias, and selection bias. Errors in measurement are largely due to the latent nature of whiplash injuries and health itself, a lack of clarity over the unit of measurement (specific factors, or "compensation"), and a lack of appreciation for the heterogeneous qualities of compensation-related factors and schemes. There has been a failure to acknowledge and empirically address reverse causation bias, or the likelihood that poor health influences the decision to pursue compensation: it is unclear if compensation is a cause or a consequence of poor health, or both. Finally, unresolved selection bias (and hence, confounding) is evident in longitudinal studies and natural experiments. In both cases, between-group differences have not been addressed convincingly. The nature of the relationship between compensation-related factors and health is unclear. Current approaches to testing the compensation hypothesis are prone to several important sources of bias, which compromise the validity of their results. Methods that explicitly test the hypothesis and establish whether or not a causal relationship exists between compensation factors and prolonged whiplash symptoms are needed in future studies.

  2. Component deficits of visual neglect: "Magnetic" attraction of attention vs. impaired spatial working memory.

    Science.gov (United States)

    Toba, Monica N; Rabuffetti, Marco; Duret, Christophe; Pradat-Diehl, Pascale; Gainotti, Guido; Bartolomeo, Paolo

    2018-01-31

    Visual neglect is a disabling consequence of right hemisphere damage, whereby patients fail to detect left-sided objects. Its precise mechanisms are debated, but there is some consensus that distinct component deficits may variously associate and interact in different patients. Here we used a touch-screen-based procedure to study two putative component deficits of neglect, rightward "magnetic" attraction of attention and impaired spatial working memory, in a group of 47 right brain-damaged patients, of whom 33 had signs of left neglect. Patients performed a visual search task under three distinct conditions, whereby touched targets could (1) be tagged, (2) disappear or (3) show no change. Magnetic attraction of attention was defined as more left neglect in the tag condition than in the disappear condition, where right-sided disappeared targets could not capture patients' attention. Impaired spatial working memory should instead produce more neglect in the no change condition, where no external cue indicated that a target had already been explored, than in the tag condition. Using a specifically developed analysis algorithm, we identified significant differences of performance between the critical conditions. Neglect patients as a group performed better in the disappear condition than in the no change condition, and also better in the tag condition compared with the no change condition. No difference was found between the tag condition and the disappear condition. Some of our neglect patients had dissociated patterns of performance, with predominant magnetic attraction or impaired spatial working memory. Anatomical results issued from both grey matter analysis and fiber tracking were consistent with the typical patterns of fronto-parietal and occipito-frontal disconnection in neglect, but did not identify lesional patterns specifically associated with one or another deficit, thus suggesting the possible co-localization of attentional and working memory processes in

  3. A prospective study of severe hypoglycemia and long-term spatial memory in children with type 1 diabetes.

    Science.gov (United States)

    Hershey, Tamara; Lillie, Rema; Sadler, Michelle; White, Neil H

    2004-06-01

    In a previous retrospective study, severe hypoglycemia (SH) was associated with decreased long-term spatial memory in children with type 1 diabetes mellitus (T1DM). In this study, we tested the hypothesis that prospectively ascertained SH would also be associated with decreased spatial long-term memory over time. Children with T1DM (n = 42) and sibling controls (n = 25) performed a spatial delayed response (SDR) task with short and long delays and other neuropsychological tests at baseline and after 15 months of monitoring. Extreme glycemic events and other medical complications were recorded prospectively during follow-up. Fourteen T1DM children experienced at least one episode of SH during the follow-up period (range = 1-5). After controlling for long-delay SDR performance at baseline, age, gender, and age of onset, the presence of SH during the prospective period was statistically associated with decreased long-delay SDR performance at follow-up (semipartial r = -0.38, p = 0.017). This relationship was not seen with short-delay SDR or with verbal or object memory, attention, or motor speed. These results, together with previously reported data, support the hypothesis that SH has specific, negative effects on spatial memory skills in T1DM children.

  4. Precise and Arbitrary Deposition of Biomolecules onto Biomimetic Fibrous Matrices for Spatially Controlled Cell Distribution and Functions.

    Science.gov (United States)

    Jia, Chao; Luo, Bowen; Wang, Haoyu; Bian, Yongqian; Li, Xueyong; Li, Shaohua; Wang, Hongjun

    2017-09-01

    Advances in nano-/microfabrication allow the fabrication of biomimetic substrates for various biomedical applications. In particular, it would be beneficial to control the distribution of cells and relevant biomolecules on an extracellular matrix (ECM)-like substrate with arbitrary micropatterns. In this regard, the possibilities of patterning biomolecules and cells on nanofibrous matrices are explored here by combining inkjet printing and electrospinning. Upon investigation of key parameters for patterning accuracy and reproducibility, three independent studies are performed to demonstrate the potential of this platform for: i) transforming growth factor (TGF)-β1-induced spatial differentiation of fibroblasts, ii) spatiotemporal interactions between breast cancer cells and stromal cells, and iii) cancer-regulated angiogenesis. The results show that TGF-β1 induces local fibroblast-to-myofibroblast differentiation in a dose-dependent fashion, and breast cancer clusters recruit activated stromal cells and guide the sprouting of endothelial cells in a spatially resolved manner. The established platform not only provides strategies to fabricate ECM-like interfaces for medical devices, but also offers the capability of spatially controlling cell organization for fundamental studies, and for high-throughput screening of various biomolecules for stem cell differentiation and cancer therapeutics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. The Hypothesis-Driven Physical Examination.

    Science.gov (United States)

    Garibaldi, Brian T; Olson, Andrew P J

    2018-05-01

    The physical examination remains a vital part of the clinical encounter. However, physical examination skills have declined in recent years, in part because of decreased time at the bedside. Many clinicians question the relevance of the physical examination in the age of technology. A hypothesis-driven approach to teaching and practicing the physical examination emphasizes the performance of maneuvers that can alter the likelihood of disease. Likelihood ratios are diagnostic weights that allow clinicians to estimate the post-test probability of disease. This hypothesis-driven approach to the physical examination increases its value and efficiency, while preserving its cultural role in the patient-physician relationship. Copyright © 2017 Elsevier Inc. All rights reserved.
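
    The odds form of Bayes' theorem that underlies likelihood ratios takes only a few lines. The likelihood-ratio values and pre-test probability below are illustrative, not drawn from the article.

    ```python
    def post_test_probability(pre_p, likelihood_ratio):
        """Convert a pre-test probability to a post-test probability
        via odds: post-test odds = pre-test odds * likelihood ratio."""
        pre_odds = pre_p / (1.0 - pre_p)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    # A positive finding with LR+ = 9 applied to a 25% pre-test probability:
    p_after_positive = post_test_probability(0.25, 9.0)
    # A negative finding with LR- = 0.2 applied to the same prior:
    p_after_negative = post_test_probability(0.25, 0.2)
    ```

    Here the positive maneuver raises the probability of disease from 25% to 75%, while the negative one lowers it to about 6%; this is the quantitative sense in which examination maneuvers "alter the likelihood of disease".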

  6. Discussion of the Porter hypothesis

    International Nuclear Information System (INIS)

    1999-11-01

    In reaction to the long-range vision of RMNO, published in 1996, the Dutch government posed the question whether a far-reaching and progressive modernization policy will lead to competitive advantages for high-quality products on partly new markets. Such a question is connected to the so-called Porter hypothesis: 'By stimulating innovation, strict environmental regulations can actually enhance competitiveness', from which it can be concluded that environment and economy can work together quite well. A literature study has been carried out in order to determine under which conditions that hypothesis is endorsed in the scientific literature and policy documents. Recommendations are given for further studies.

  7. The (not so) Immortal Strand Hypothesis

    OpenAIRE

    Tomasetti, Cristian; Bozic, Ivana

    2015-01-01

    Background: Non-random segregation of DNA strands during stem cell replication has been proposed as a mechanism to minimize accumulated genetic errors in stem cells of rapidly dividing tissues. According to this hypothesis, an "immortal" DNA strand is passed to the stem cell daughter and not to the more differentiated cell, keeping the stem cell lineage replication error-free. After it was introduced, experimental evidence both for and against the hypothesis has been presented. Principal...

  8. Revisiting the Dutch hypothesis

    NARCIS (Netherlands)

    Postma, Dirkje S.; Weiss, Scott T.; van den Berge, Maarten; Kerstjens, Huib A. M.; Koppelman, Gerard H.

    The Dutch hypothesis was first articulated in 1961, when many novel and advanced scientific techniques were not available, such as genomics techniques for pinpointing genes, gene expression, lipid and protein profiles, and the microbiome. In addition, computed tomographic scans and advanced analysis

  9. Precision Medicine in Cardiovascular Diseases

    Directory of Open Access Journals (Sweden)

    Yan Liu

    2017-02-01

    Full Text Available Since President Obama announced the Precision Medicine Initiative in the United States, more and more attention has been paid to precision medicine. However, clinicians have already used it to treat conditions such as cancer. Many cardiovascular diseases have a familial presentation, and genetic variants are associated with the prevention, diagnosis, and treatment of cardiovascular diseases, which are the basis for providing precise care to patients with cardiovascular diseases. Large-scale cohorts and multiomics are critical components of precision medicine. Here we summarize the application of precision medicine to cardiovascular diseases based on cohort and omic studies, and hope to elicit discussion about future health care.

  10. Understanding Galactic planetary nebulae with precise/reliable nebular abundances

    Science.gov (United States)

    García-Hernández, D. A.; Ventura, P.; Delgado-Inglada, G.; Dell'Agli, F.; di Criscienzo, M.; Yagüe, A.

    2017-10-01

    We compare recent precise/reliable nebular abundances - as derived from high-quality optical spectra and the most recent ICFs - in a sample of Galactic planetary nebulae (PNe) with nucleosynthesis predictions (HeCNOCl) from asymptotic giant branch (AGB) ATON models in the metallicity range Z⊙/4 < Z < 2Z⊙. The observed nebular abundances are consistent with massive (M > 3.5 M⊙) solar/supersolar metallicity AGBs that experience hot bottom burning (HBB), but other formation channels in low-mass AGBs like extra mixing, stellar rotation, binary interaction, or He pre-enrichment cannot be disregarded until more accurate C/O ratios can be obtained. Two DC PNe show the imprint of advanced CNO processing and deep second dredge-up, suggesting progenitor masses close to the limit to evolve as core-collapse supernovae (above 6 M⊙). Their actual C/O ratios, if confirmed, indicate contamination from the third dredge-up, rejecting the hypothesis that the chemical composition of such high-metallicity massive AGBs is modified exclusively by HBB.

  11. Non-retinotopic feature processing in the absence of retinotopic spatial layout and the construction of perceptual space from motion.

    Science.gov (United States)

    Ağaoğlu, Mehmet N; Herzog, Michael H; Oğmen, Haluk

    2012-10-15

    The spatial representation of a visual scene in the early visual system is well known. The optics of the eye map the three-dimensional environment onto two-dimensional images on the retina. These retinotopic representations are preserved in the early visual system. Retinotopic representations and processing are among the most prevalent concepts in visual neuroscience. However, it has long been known that a retinotopic representation of the stimulus is neither sufficient nor necessary for perception. The Saccadic Stimulus Presentation Paradigm and Ternus-Pikler displays have been used to investigate non-retinotopic processes with and without eye movements, respectively. However, neither of these paradigms eliminates the retinotopic representation of the spatial layout of the stimulus. Here, we investigated how stimulus features are processed in the absence of a retinotopic layout and in the presence of retinotopic conflict. We used anorthoscopic viewing (slit viewing) and pitted a retinotopic feature-processing hypothesis against a non-retinotopic feature-processing hypothesis. Our results support the predictions of the non-retinotopic feature-processing hypothesis and demonstrate the ability of the visual system to operate non-retinotopically at a fine feature processing level in the absence of a retinotopic spatial layout. Our results suggest that perceptual space is actively constructed from the perceptual dimension of motion. The implications of these findings for normal ecological viewing conditions are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. The Buffering Hypothesis: Growing Diversity and Declining Black-White Segregation in America’s Cities, Suburbs, and Small Towns?

    Directory of Open Access Journals (Sweden)

    Domenico Parisi

    2015-03-01

    Full Text Available The conventional wisdom is that racial diversity promotes positive race relations and reduces racial residential segregation between blacks and whites. We use data from the 1990–2010 decennial censuses and 2007–2011 ACS to test this so-called “buffering hypothesis.” We identify cities, suburbs, and small towns that are virtually all white, all black, all Asian, all Hispanic, and everything in between. The results show that the most racially diverse places—those with all four racial groups (white, black, Hispanic, and Asian present—had the lowest black-white levels of segregation in 2010. Black-white segregation also declined most rapidly in the most racially diverse places and in places that experienced the largest recent increases in diversity. Support for the buffering hypothesis, however, is counterbalanced by continuing high segregation across cities and communities and by rapid white depopulation in the most rapidly diversifying communities. We argue for a new, spatially inclusive perspective on racial residential segregation.
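
    One common measure behind "black-white levels of segregation" is the index of dissimilarity, D (the abstract does not state which index the study uses, and the tract counts below are invented):

    ```python
    # Index of dissimilarity: D = 0.5 * sum_i |b_i/B - w_i/W| over tracts i.
    # (black residents, white residents) per hypothetical census tract:
    tracts = [
        (120, 880),
        (900, 100),
        (300, 700),
        (680, 320),
    ]
    B = sum(b for b, w in tracts)   # total black population
    W = sum(w for b, w in tracts)   # total white population

    D = 0.5 * sum(abs(b / B - w / W) for b, w in tracts)
    # D ranges from 0 (perfectly even distribution) to 1 (complete segregation),
    # and is read as the share of either group that would need to move to
    # equalize the distribution.
    ```

    Declining D across increasingly diverse places is the kind of pattern the buffering hypothesis predicts.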

  13. Spatial Tuning of a RF Frequency Selective Surface through Origami (Postprint)

    Science.gov (United States)

    2016-05-12

    The precise mapping of origami presents a novel method to spatially tune radio frequency (RF) devices, including adaptive antennas, through folding motions. The report describes an experimental study and motivates the development of computational tools to systematically predict optimal fold patterns for targeted frequency response. Subject terms: origami, frequency selective surface, tuning, radio frequency.

  14. Probabilistic disaggregation of a spatial portfolio of exposure for natural hazard risk assessment

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    2018-01-01

    In natural hazard risk assessment, situations are encountered where information on the portfolio of exposure is only available in a spatially aggregated form, hindering a precise risk assessment. Recourse might be found in the spatial disaggregation of the portfolio of exposure to the resolution … of a portfolio of buildings in two communes in Switzerland, and the results are compared to sample observations. The relevance of probabilistic disaggregation uncertainty in natural hazard risk assessment is illustrated with the example of a simple flood risk assessment.
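
    A minimal sketch of the disaggregation idea: spread an aggregate exposure value over individual buildings in proportion to a weight, then sample perturbed allocations to represent disaggregation uncertainty. The totals, footprint weights, and perturbation scheme below are all invented; the paper's actual method is more elaborate.

    ```python
    import random

    random.seed(42)

    # Aggregate exposure (e.g., total insured value) known only per commune.
    total_value = 1_000_000.0
    footprints_m2 = [120.0, 300.0, 80.0, 500.0]   # hypothetical building areas
    total_area = sum(footprints_m2)
    weights = [a / total_area for a in footprints_m2]

    # Deterministic proportional allocation to buildings:
    expected_alloc = [total_value * w for w in weights]

    # Probabilistic variant: perturb the weights, renormalize, reallocate.
    # Repeating this yields an ensemble that carries disaggregation
    # uncertainty into the downstream risk assessment.
    def sample_allocation(rel_sd=0.1):
        noisy = [max(w * (1.0 + random.gauss(0.0, rel_sd)), 0.0) for w in weights]
        s = sum(noisy)
        return [total_value * v / s for v in noisy]

    samples = [sample_allocation() for _ in range(200)]
    ```

    Every sampled allocation still sums to the known aggregate, so the constraint from the aggregated data is preserved while building-level uncertainty is made explicit.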

  15. Nitrogen Oxide Emission, Economic Growth and Urbanization in China: a Spatial Econometric Analysis

    Science.gov (United States)

    Zhou, Zhimin; Zhou, Yanli; Ge, Xiangyu

    2018-01-01

    This research studies the nexus of nitrogen oxide emissions and economic development/urbanization. Under the environmental Kuznets curve (EKC) hypothesis, we apply spatial panel data analysis in the STIRPAT framework, and thus systematically estimate the impacts of income/urbanization on nitrogen oxide emissions. The empirical findings suggest that spatial dependence in the distribution of nitrogen oxide emissions exists at the provincial level, and that an inverse N-shaped EKC describes both the income-nitrogen oxide and urbanization-nitrogen oxide nexuses. In addition, some targeted policy recommendations are made for reducing nitrogen oxide emissions in the future.
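
    An inverse-N-shaped EKC corresponds to a cubic income term whose curve falls, rises, then falls again. The coefficients below are invented purely to illustrate the shape, not estimates from the study.

    ```python
    # Cubic EKC form: ln(E) = b0 + b1*x + b2*x^2 + b3*x^3, x = ln(income).
    # Signs b1 < 0, b2 > 0, b3 < 0 produce an inverse-N shape.
    b0, b1, b2, b3 = 2.0, -3.0, 0.9, -0.06   # hypothetical coefficients

    def log_emission(log_income):
        return b0 + b1 * log_income + b2 * log_income**2 + b3 * log_income**3
    ```

    With these coefficients, emissions decline at low income, rise over intermediate income, and decline again at high income, which is the turning-point pattern the abstract's "inverse N-shape" refers to.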

  16. The use of proximal soil sensor data fusion and digital soil mapping for precision agriculture

    OpenAIRE

    Ji, Wenjun; Adamchuk, Viacheslav; Chen, Songchao; Biswas, Asim; Leclerc, Maxime; Viscarra Rossel, Raphael

    2017-01-01

    Proximal soil sensing (PSS) is a promising approach when it comes to detailed characterization of spatial soil heterogeneity. Since none of the existing PSS systems can measure all the soil information needed to implement precision agriculture, sensor data fusion can provide a reasonable alternative to characterize the complexity of soils. In this study, we fused data measured using a gamma-ray sensor, an apparent electrical conductivity (ECa) sensor, and a commercial Veris MS...

  17. Manipulating Crop Density to Optimize Nitrogen and Water Use: An Application of Precision Agroecology

    Science.gov (United States)

    Brown, T. T.; Huggins, D. R.; Smith, J. L.; Keller, C. K.; Kruger, C.

    2011-12-01

    Rising levels of reactive nitrogen (Nr) in the environment, coupled with an increasing population, position agriculture as a major contributor for supplying food and ecosystem services to the world. The concept of Precision Agroecology (PA) explicitly recognizes the importance of time and place by combining the principles of precision farming with ecology, creating a framework that can lead to improvements in Nr use efficiency. In the Palouse region of the Pacific Northwest, USA, relationships between productivity, N dynamics and cycling, water availability, and environmental impacts result from intricate spatial and temporal variations in soil, ecosystem processes, and socioeconomic factors. Our research goal is to investigate N use efficiency (NUE) in the context of factors that regulate site-specific environmental and economic conditions and to develop the concept of PA for use in sustainable agroecosystems and science-based Nr policy. Nitrogen and plant density field trials with winter wheat (Triticum aestivum L.) were conducted at the Washington State University Cook Agronomy Farm near Pullman, WA under long-term no-tillage management in 2010 and 2011. Treatments were imposed across environmentally heterogeneous field conditions to assess soil, crop, and environmental interactions. Microplots with a split N application using 15N-labeled fertilizer were established in 2011 to examine the impact of N timing on uptake of fertilizer and soil N throughout the growing season for two plant density treatments. Preliminary data show that plant density manipulation combined with precision N applications regulated water and N use and resulted in greater wheat yield with less seed and N inputs. These findings indicate that improvements to NUE and agroecosystem sustainability should consider landscape-scale patterns driving productivity (e.g., spatial and temporal dynamics of water availability and N transformations) and would benefit from policy incentives that promote a PA

  18. Precise positioning with current multi-constellation Global Navigation Satellite Systems: GPS, GLONASS, Galileo and BeiDou.

    Science.gov (United States)

    Li, Xingxing; Zhang, Xiaohong; Ren, Xiaodong; Fritsche, Mathias; Wickert, Jens; Schuh, Harald

    2015-02-09

    The world of satellite navigation is undergoing dramatic changes with the rapid development of multi-constellation Global Navigation Satellite Systems (GNSSs). At the moment more than 70 satellites are already in view, and about 120 satellites will be available once all four systems (BeiDou + Galileo + GLONASS + GPS) are fully deployed in the next few years. This will bring great opportunities and challenges for both scientific and engineering applications. In this paper we develop a four-system positioning model to make full use of all available observations from different GNSSs. The significant improvements in satellite visibility, spatial geometry, dilution of precision, convergence, accuracy, continuity, and reliability that the combined use of multi-GNSS brings to precise positioning are carefully analyzed and evaluated, especially in constrained environments.
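Dilution of precision (DOP), mentioned in this abstract, quantifies how satellite geometry amplifies ranging error into position error: the design matrix G stacks unit line-of-sight vectors plus a receiver-clock column, and GDOP is the square root of the trace of (GᵀG)⁻¹. A minimal sketch with hypothetical azimuth/elevation geometries (not real ephemerides), showing that adding satellites improves GDOP:

```python
import numpy as np

# Hypothetical azimuth/elevation geometries (degrees), not real ephemerides,
# to illustrate how extra satellites improve dilution of precision.
def gdop(az_el_deg):
    az, el = np.radians(np.asarray(az_el_deg, dtype=float)).T
    # Design matrix: unit line-of-sight vectors (ENU) plus a clock-bias column.
    G = np.column_stack([np.cos(el) * np.sin(az),
                         np.cos(el) * np.cos(az),
                         np.sin(el),
                         np.ones(az.size)])
    # GDOP = sqrt(trace((G^T G)^-1))
    return float(np.sqrt(np.trace(np.linalg.inv(G.T @ G))))

sparse = [(0, 60), (90, 50), (180, 55), (270, 45)]            # 4 satellites
dense = sparse + [(45, 20), (135, 25), (225, 15), (315, 30)]  # 8 satellites
print(f"GDOP with 4 satellites: {gdop(sparse):.2f}")
print(f"GDOP with 8 satellites: {gdop(dense):.2f}")
```

Because each extra satellite adds a positive semi-definite term to GᵀG, the trace of its inverse (and hence GDOP) can only decrease or stay the same as constellations are combined.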

  19. The linear hypothesis - an idea whose time has passed

    International Nuclear Information System (INIS)

    Tschaeche, A.N.

    1995-01-01

    The linear no-threshold hypothesis is the basis for radiation protection standards in the United States. In the words of the National Council on Radiation Protection and Measurements (NCRP), the hypothesis is: "In the interest of estimating effects in humans conservatively, it is not unreasonable to follow the assumption of a linear relationship between dose and effect in the low dose regions for which direct observational data are not available." The International Commission on Radiological Protection (ICRP) stated the hypothesis in a slightly different manner: "One such basic assumption ... is that ... there is ... a linear relationship without threshold between dose and the probability of an effect." The hypothesis was necessary 50 yr ago when it was first enunciated because the dose-effect curve for ionizing radiation for effects in humans was not known. The ICRP and NCRP needed a model to extrapolate high-dose effects to low-dose effects. So the linear no-threshold hypothesis was born. Certain details of the history of the development and use of the linear hypothesis are presented. In particular, use of the hypothesis by the U.S. regulatory agencies is examined. Over time, the sense of the hypothesis has been corrupted. The corruption of the hypothesis into the current paradigm of "a little radiation, no matter how small, can and will harm you" is presented. The reasons the corruption occurred are proposed. The effects of the corruption are enumerated, specifically, the use of the corruption by the antinuclear forces in the United States and some of the huge costs to U.S. taxpayers due to the corruption. An alternative basis for radiation protection standards to assure public safety, based on the weight of scientific evidence on radiation health effects, is proposed

  20. Multiple hypothesis tracking for the cyber domain

    Science.gov (United States)

    Schwoegler, Stefan; Blackman, Sam; Holsopple, Jared; Hirsch, Michael J.

    2011-09-01

    This paper discusses how methods used for conventional multiple hypothesis tracking (MHT) can be extended to domain-agnostic tracking of entities from non-kinematic constraints, such as those imposed by cyber attacks in a potentially dense false-alarm background. MHT is widely recognized as the premier method for avoiding the corruption of tracks with spurious data in the kinematic domain, but it has not been extensively applied to other problem domains. The traditional approach is to tightly couple track maintenance (prediction, gating, filtering, probabilistic pruning, and target confirmation) with hypothesis management (clustering, incompatibility maintenance, hypothesis formation, and N-association pruning). However, by separating the domain-specific track maintenance portion from the domain-agnostic hypothesis management piece, we can begin to apply the wealth of knowledge gained from ground and air tracking solutions to the cyber (and other) domains. These realizations led to the creation of Raytheon's Multiple Hypothesis Extensible Tracking Architecture (MHETA). In this paper, we showcase MHETA for the cyber domain, plugging in a well-established method, CUBRC's INFormation Engine for Real-time Decision making (INFERD), for the association portion of the MHT. The result is a CyberMHT. We demonstrate the power of MHETA-INFERD using simulated data. Using metrics from both the tracking and cyber domains, we show that, while no tracker is perfect, by applying MHETA-INFERD advanced non-kinematic tracks can be captured in an automated way, performing better than non-MHT approaches and decreasing analyst response time to cyber threats.
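The gating step listed under track maintenance admits a compact illustration. A hedged sketch of generic Mahalanobis gating (not MHETA's or INFERD's actual logic): a measurement is associated with a track hypothesis only if it falls inside a chi-square gate around the predicted measurement; measurements outside the gate spawn new-track or false-alarm branches instead.

```python
import numpy as np

# Generic Mahalanobis gating (illustrative; not MHETA's or INFERD's logic):
# a measurement z is compatible with a track hypothesis only if its
# Mahalanobis distance to the predicted measurement z_pred is inside the gate.
def in_gate(z, z_pred, S, gate=9.21):   # 9.21 ~ 99% chi-square gate, 2 dof
    d = z - z_pred
    return float(d @ np.linalg.solve(S, d)) <= gate

S = np.eye(2) * 4.0  # innovation covariance
print(in_gate(np.array([1.0, 2.0]), np.array([0.0, 0.0]), S))    # nearby measurement
print(in_gate(np.array([10.0, 10.0]), np.array([0.0, 0.0]), S))  # distant measurement
```

In a full MHT, each surviving gate pass contributes a branch to the hypothesis tree, which the hypothesis-management layer then clusters and prunes.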

  1. Calcium Hypothesis of Alzheimer's disease and brain aging: A framework for integrating new evidence into a comprehensive theory of pathogenesis.

    Science.gov (United States)

    2017-02-01

    This article updates the Calcium Hypothesis of Alzheimer's disease and brain aging on the basis of emerging evidence since 1994 (The present article, with the subtitle "New evidence for a central role of Ca 2+ in neurodegeneration," includes three appendices that provide context and further explanations for the rationale for the revisions in the updated hypothesis-the three appendices are as follows: Appendix I "Emerging concepts on potential pathogenic roles of [Ca 2+ ]," Appendix II "Future studies to validate the central role of dysregulated [Ca 2+ ] in neurodegeneration," and Appendix III "Epilogue: towards a comprehensive hypothesis.") (Marx J. Fresh evidence points to an old suspect: calcium. Science 2007; 318:384-385). The aim is not only to re-evaluate the original key claims of the hypothesis with a critical eye but also to identify gaps in knowledge required to validate relevant claims and delineate additional studies and/or data that are needed. Some of the key challenges for this effort included examination of questions regarding (1) the temporal and spatial relationships of molecular mechanisms that regulate neuronal calcium ion (Ca 2+ ), (2) the role of changes in concentration of calcium ion [Ca 2+ ] in various subcellular compartments of neurons, (3) how alterations in Ca 2+ signaling affect the performance of neurons under various conditions, ranging from optimal functioning in a healthy state to conditions of decline and deterioration in performance during aging and in disease, and (4) new ideas about the contributions of aging, genetic, and environmental factors to the causal relationships between dysregulation of [Ca 2+ ] and the functioning of neurons (see Appendices I and II). The updated Calcium Hypothesis also includes revised postulates that are intended to promote further crucial experiments to confirm or reject the various predictions of the hypothesis (see Appendix III). Copyright © 2016 the Alzheimer's Association. 
All rights reserved.

  2. Autonomous Precision Spraying Trials Using a Novel Cell Spray Implement Mounted on an Armadillo Tool Carrier

    DEFF Research Database (Denmark)

    Jensen, Kjeld; Laursen, Morten Stigaard; Midtiby, Henrik

    with an Armadillo robotic tool carrier consisting of two battery-powered track modules mounted on each side of the implement. This paper focuses on the cell sprayer implement design, including the camera system, sprayer module, and integration with the service robot and the robot software. The FroboMind software platform...... and Armadillo robot are used, and it is hypothesized that, utilizing FroboMind, the cell sprayer can drive smoothly through a test field with a lateral positioning accuracy better than 50 mm. A precision spraying trial in a 1 ha maize field using different treatment methods was used for testing the hypothesis...

  3. Deadlines in space: Selective effects of coordinate spatial processing in multitasking.

    Science.gov (United States)

    Todorov, Ivo; Del Missier, Fabio; Konke, Linn Andersson; Mäntylä, Timo

    2015-11-01

    Many everyday activities require coordination and monitoring of multiple deadlines. One way to handle these temporal demands might be to represent future goals and deadlines as a pattern of spatial relations. We examined the hypothesis that spatial ability, in addition to executive functioning, contributes to individual differences in multitasking. In two studies, participants completed a multitasking session in which they monitored four digital clocks running at different rates. In Study 1, we found that individual differences in spatial ability and executive functions were independent predictors of multiple-task performance. In Study 2, we found that individual differences in specific spatial abilities were selectively related to multiple-task performance, as only coordinate spatial processing, but not categorical, predicted multitasking, even beyond executive functioning and numeracy. In both studies, males outperformed females in spatial ability and multitasking and in Study 2 these sex differences generalized to a simulation of everyday multitasking. Menstrual changes moderated the effects on multitasking, in that sex differences in coordinate spatial processing and multitasking were observed between males and females in the luteal phase of the menstrual cycle, but not between males and females at menses. Overall, these findings suggest that multiple-task performance reflects independent contributions of spatial ability and executive functioning. Furthermore, our results support the distinction of categorical versus coordinate spatial processing, and suggest that these two basic relational processes are selectively affected by female sex hormones and differentially effective in transforming and handling temporal patterns as spatial relations in the context of multitasking.

  4. Precision Airdrop (Largage de precision)

    Science.gov (United States)

    2005-12-01


  5. Effects of cue types on sex differences in human spatial memory.

    Science.gov (United States)

    Chai, Xiaoqian J; Jacobs, Lucia F

    2010-04-02

    We examined the effects of cue types on human spatial memory in 3D virtual environments adapted from classical animal and human tasks. Two classes of cues of different functions were investigated: those that provide directional information, and those that provide positional information. Adding a directional cue (geographical slant) to the spatial delayed-match-to-sample task improved performance in males but not in females. When the slant directional cue was removed in a hidden-target location task, male performance was impaired but female performance was unaffected. The removal of positional cues, on the other hand, impaired female performance but not male performance. These results are consistent with results from laboratory rodents and thus support the hypothesis that sex differences in spatial memory arise from the dissociation between a preferential reliance on directional cues in males and on positional cues in females. Copyright 2009 Elsevier B.V. All rights reserved.

  6. Histone deacetylase inhibition abolishes stress-induced spatial memory impairment.

    Science.gov (United States)

    Vargas-López, Viviana; Lamprea, Marisol R; Múnera, Alejandro

    2016-10-01

    Acute stress induced before spatial training impairs memory consolidation. Although non-epigenetic underpinning of such effect has been described, the epigenetic mechanisms involved have not yet been studied. Since spatial training and intense stress have opposite effects on histone acetylation balance, it is conceivable that disruption of such balance may underlie acute stress-induced spatial memory consolidation impairment and that inhibiting histone deacetylases prevents such effect. Trichostatin-A (TSA, a histone deacetylase inhibitor) was used to test its effectiveness in preventing stress' deleterious effect on memory. Male Wistar rats were trained in a spatial task in the Barnes maze; 1-h movement restraint was applied to half of them before training. Immediately after training, stressed and non-stressed animals were randomly assigned to receive either TSA (1mg/kg) or vehicle intraperitoneal injection. Twenty-four hours after training, long-term spatial memory was tested; plasma and brain tissue were collected immediately after the memory test to evaluate corticosterone levels and histone H3 acetylation in several brain areas. Stressed animals receiving vehicle displayed memory impairment, increased plasma corticosterone levels and markedly reduced histone H3 acetylation in prelimbic cortex and hippocampus. Such effects did not occur in stressed animals treated with TSA. The aforementioned results support the hypothesis that acute stress induced-memory impairment is related to histone deacetylation. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Precision medicine for nurses: 101.

    Science.gov (United States)

    Lemoine, Colleen

    2014-05-01

    To introduce the key concepts and terms associated with precision medicine and support understanding of future developments in the field by providing an overview and history of precision medicine, related ethical considerations, and nursing implications. Current nursing, medical and basic science literature. Rapid progress in understanding the oncogenic drivers associated with cancer is leading to a shift toward precision medicine, where treatment is based on targeting specific genetic and epigenetic alterations associated with a particular cancer. Nurses will need to embrace the paradigm shift to precision medicine, expend the effort necessary to learn the essential terminology, concepts and principles, and work collaboratively with physician colleagues to best position our patients to maximize the potential that precision medicine can offer. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Precision genome editing

    DEFF Research Database (Denmark)

    Steentoft, Catharina; Bennett, Eric P; Schjoldager, Katrine Ter-Borch Gram

    2014-01-01

    Precise and stable gene editing in mammalian cell lines has until recently been hampered by the lack of efficient targeting methods. While different gene silencing strategies have had tremendous impact on many biological fields, they have generally not been applied with wide success in the field...... of glycobiology, primarily due to their low efficiencies, with resultant failure to impose substantial phenotypic consequences upon the final glycosylation products. Here, we review novel nuclease-based precision genome editing techniques enabling efficient and stable gene editing, including gene disruption...... by introducing single or double-stranded breaks at a defined genomic sequence. We here compare and contrast the different techniques and summarize their current applications, highlighting cases from the field of glycobiology as well as pointing to future opportunities. The emerging potential of precision gene...

  9. Universality hypothesis breakdown at one-loop order

    Science.gov (United States)

    Carvalho, P. R. S.

    2018-05-01

    We probe the universality hypothesis by analytically computing, through at least two-loop order, the corrections to the critical exponents for q-deformed O(N) self-interacting λφ⁴ scalar field theories using six distinct and independent field-theoretic renormalization group methods and ε-expansion techniques. We show that the effect of q-deformation on the one-loop corrections to the q-deformed critical exponents is null, so the universality hypothesis breaks down at this loop order. Such an effect emerges only at the two-loop and higher levels, where the validity of the universality hypothesis is restored. The q-deformed critical exponents obtained through the six methods are the same and, furthermore, reduce to their nondeformed values in the appropriate limit.

  10. Null but not void: considerations for hypothesis testing.

    Science.gov (United States)

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  11. RDF SKETCH MAPS - KNOWLEDGE COMPLEXITY REDUCTION FOR PRECISION MEDICINE ANALYTICS.

    Science.gov (United States)

    Thanintorn, Nattapon; Wang, Juexin; Ersoy, Ilker; Al-Taie, Zainab; Jiang, Yuexu; Wang, Duolin; Verma, Megha; Joshi, Trupti; Hammer, Richard; Xu, Dong; Shin, Dmitriy

    2016-01-01

    Sketch Map of the top 30% paths retained important information about signaling cascades leading to activation of proto-oncogene BRAF, which is usually associated with a different cancer, melanoma. Recent reports of successful treatments of HCL patients by the BRAF-targeted drug vemurafenib support the validity of the RDF Sketch Maps findings. We therefore believe that RDF Sketch Maps will be invaluable for hypothesis generation for precision diagnostics and therapeutics as well as drug repurposing studies.

  12. Progressive impairment of directional and spatially precise trajectories by TgF344-AD Rats in the Morris Water Task

    OpenAIRE

    Thompson, Shannon; Harvey, Ryan; Clark, Benjamin; Drake, Emma; Berkowitz, Laura

    2018-01-01

    Spatial navigation is impaired in early stages of Alzheimer's disease (AD) and may be a defining behavioral marker of preclinical AD. Nevertheless, limitations of the diagnostic criteria for AD and of animal models of AD make characterization of preclinical AD difficult. A new rat model (TgF344-AD) of AD overcomes many of these limitations, though spatial navigation has not been comprehensively assessed. Using the hidden and cued platform variants of the Morris water task, a longitudinal asse...

  13. Spatial attention in written word perception.

    Science.gov (United States)

    Montani, Veronica; Facoetti, Andrea; Zorzi, Marco

    2014-01-01

    The role of attention in visual word recognition and reading aloud is a long debated issue. Studies of both developmental and acquired reading disorders provide growing evidence that spatial attention is critically involved in word reading, in particular for the phonological decoding of unfamiliar letter strings. However, studies on healthy participants have produced contrasting results. The aim of this study was to investigate how the allocation of spatial attention may influence the perception of letter strings in skilled readers. High frequency words (HFWs), low frequency words and pseudowords were briefly and parafoveally presented either in the left or the right visual field. Attentional allocation was modulated by the presentation of a spatial cue before the target string. Accuracy in reporting the target string was modulated by the spatial cue, but this effect varied with the type of string. For unfamiliar strings, processing was facilitated when attention was focused on the string location and hindered when it was diverted from the target. This finding is consistent with the assumptions of the CDP+ model of reading aloud, as well as with familiarity-sensitivity models that argue for a flexible use of attention according to the specific requirements of the string. Moreover, we found that processing of HFWs was facilitated by an extra-large focus of attention. The latter result is consistent with the hypothesis that a broad distribution of attention is the default mode during reading of familiar words because it might optimally engage the broad receptive fields of the highest detectors in the hierarchical system for visual word recognition.

  14. Spatial attention in written word perception

    Directory of Open Access Journals (Sweden)

    Veronica eMontani

    2014-02-01

    Full Text Available The role of attention in visual word recognition and reading aloud is a long debated issue. Studies of both developmental and acquired reading disorders provide growing evidence that spatial attention is critically involved in word reading, in particular for the phonological decoding of unfamiliar letter strings. However, studies on healthy participants have produced contrasting results. The aim of this study was to investigate how the allocation of spatial attention may influence the perception of letter strings in skilled readers. High frequency words, low frequency words and pseudowords were briefly and parafoveally presented either in the left or the right visual field. Attentional allocation was modulated by the presentation of a spatial cue before the target string. Accuracy in reporting the target string was modulated by the spatial cue, but this effect varied with the type of string. For unfamiliar strings, processing was facilitated when attention was focused on the string location and hindered when it was diverted from the target. This finding is consistent with the assumptions of the CDP+ model of reading aloud, as well as with familiarity-sensitivity models that argue for a flexible use of attention according to the specific requirements of the string. Moreover, we found that processing of high-frequency words was facilitated by an extra-large focus of attention. The latter result is consistent with the hypothesis that a broad distribution of attention is the default mode during reading of familiar words because it might optimally engage the broad receptive fields of the highest detectors in the hierarchical system for visual word recognition.

  15. SETI in vivo: testing the we-are-them hypothesis

    Science.gov (United States)

    Makukov, Maxim A.; Shcherbak, Vladimir I.

    2018-04-01

    After it was proposed that life on Earth might descend from seeding by an earlier extraterrestrial civilization motivated to secure and spread life, some authors noted that this alternative offers a testable implication: microbial seeds could be intentionally supplied with a durable signature that might be found in extant organisms. In particular, it was suggested that the optimal location for such an artefact is the genetic code, as the least evolving part of cells. However, as the mainstream view goes, this scenario is too speculative and cannot be meaningfully tested because encoding/decoding a signature within the genetic code is something ill-defined, so any retrieval attempt is doomed to guesswork. Here we refresh the seeded-Earth hypothesis in light of recent observations, and discuss the motivation for inserting a signature. We then show that `biological SETI' involves even weaker assumptions than traditional SETI and admits a well-defined methodological framework. After assessing the possibility in terms of molecular and evolutionary biology, we formalize the approach and, adopting the standard guideline of SETI that encoding/decoding should follow from first principles and be convention-free, develop a universal retrieval strategy. Applied to the canonical genetic code, it reveals a non-trivial precision structure of interlocked logical and numerical attributes of systematic character (previously we found these heuristically). To assess this result in view of the initial assumption, we perform statistical, comparison, interdependence and semiotic analyses. Statistical analysis reveals no causal connection of the result to evolutionary models of the genetic code, interdependence analysis precludes overinterpretation, and comparison analysis shows that known variations of the code lack any precision-logic structures, in agreement with these variations being post-LUCA (i.e. post-seeding) evolutionary deviations from the canonical code. Finally, semiotic

  16. Questioning the social intelligence hypothesis.

    Science.gov (United States)

    Holekamp, Kay E

    2007-02-01

    The social intelligence hypothesis posits that complex cognition and enlarged "executive brains" evolved in response to challenges that are associated with social complexity. This hypothesis has been well supported, but some recent data are inconsistent with its predictions. It is becoming increasingly clear that multiple selective agents, and non-selective constraints, must have acted to shape cognitive abilities in humans and other animals. The task now is to develop a larger theoretical framework that takes into account both inter-specific differences and similarities in cognition. This new framework should facilitate consideration of how selection pressures that are associated with sociality interact with those that are imposed by non-social forms of environmental complexity, and how both types of functional demands interact with phylogenetic and developmental constraints.

  17. Spatial attraction in migrants' settlement patterns in the city of Catania

    Directory of Open Access Journals (Sweden)

    Angelo Mazza

    2016-07-01

    Full Text Available Background: In broad terms, and apart from ethnic discriminatory rules enforced in some places and at some times, residential segregation may be ascribed both to economic inhomogeneities in the urban space (e.g., in the cost of rents or in occupation opportunities) and to spatial attraction among individuals sharing the same group identity and culture. Objective: Traditional indices of spatial segregation do not distinguish between these two sources of clustering. Furthermore, they typically rely on census tracts, a scale that does not allow for fine-grained analysis. Also, the use of alternative zoning often leads to conflicting results. The aim of this paper is to measure spatial attraction among groups of foreign migrants in Catania (Italy) using individual household data. Methods: We apply a version of Ripley's K-function specially conceived for assessing spatial attraction while adjusting for the effects of spatial inhomogeneity. To avoid the risk of confounding the two sources of clustering, spatial inhomogeneity is estimated following a case-control approach. Results: Different parts of the city exhibit different suitabilities for migrants of different nationalities, with groups mainly involved in housekeeping and caregiving being more dispersed than those specialized in peddling and retailing. A significant spatial attraction has been found for Sri Lankans, Mauritians, Senegalese, and Chinese. Conversely, the settlement patterns of Tunisians and Moroccans comply with random allocation. These results seem consistent with the hypothesis of a relevant correlation between chain migration and spatial attraction.
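Ripley's K-function, the core method here, compares the average number of neighbours within radius r to the expectation πr² under complete spatial randomness (CSR): values above πr² indicate attraction, values below indicate repulsion. A minimal sketch on synthetic point patterns (not the Catania household data, and without the edge correction or inhomogeneity adjustment the study applies):

```python
import numpy as np

# Illustrative sketch on synthetic patterns in the unit square, without the
# edge correction or inhomogeneity adjustment a real analysis would need.
def ripley_k(points, r, area=1.0):
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = int((d < r).sum()) - n      # drop the n zero self-distances
    return area * pairs / (n * (n - 1))

rng = np.random.default_rng(1)
csr = rng.random((200, 2))                       # roughly CSR (uniform random)
clustered = (rng.random((20, 2))[:, None, :]     # 20 cluster centres...
             + rng.normal(0, 0.01, (20, 10, 2))).reshape(-1, 2)  # ...10 points each

r = 0.05
print(f"K(r) under CSR: {ripley_k(csr, r):.4f}")
print(f"K(r) clustered: {ripley_k(clustered, r):.4f}")
print(f"CSR expectation pi*r^2: {np.pi * r**2:.4f}")
```

The clustered pattern yields K(r) well above πr², which is the signature of spatial attraction the study tests for after controlling for inhomogeneity with the case-control design.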

  18. The Younger Dryas impact hypothesis: A critical review

    NARCIS (Netherlands)

    van Hoesel, A.; Hoek, W.Z.; Pennock, G.M.; Drury, Martyn

    2014-01-01

    The Younger Dryas impact hypothesis suggests that multiple extraterrestrial airbursts or impacts resulted in the Younger Dryas cooling, extensive wildfires, megafaunal extinctions and changes in human population. After the hypothesis was first published in 2007, it gained much criticism, as the

  19. Advanced bioanalytics for precision medicine.

    Science.gov (United States)

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics, and new imaging technique applications suitable for implementing precision medicine. Graphical abstract Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  20. Spatial Complementarity and the Coexistence of Species

    Science.gov (United States)

    Velázquez, Jorge; Garrahan, Juan P.; Eichhorn, Markus P.

    2014-01-01

    Coexistence of apparently similar species remains an enduring paradox in ecology. Spatial structure has been predicted to enable coexistence even when population-level models predict competitive exclusion if it causes each species to limit its own population more than that of its competitor. Nevertheless, existing hypotheses conflict with regard to whether clustering favours or precludes coexistence. The spatial segregation hypothesis predicts that in clustered populations the frequency of intra-specific interactions will be increased, causing each species to be self-limiting. Alternatively, individuals of the same species might compete over greater distances, known as heteromyopia, breaking down clusters and opening space for a second species to invade. In this study we create an individual-based model in homogeneous two-dimensional space for two putative sessile species differing only in their demographic rates and the range and strength of their competitive interactions. We fully characterise the parameter space within which coexistence occurs beyond population-level predictions, thereby revealing a region of coexistence generated by a previously-unrecognised process which we term the triadic mechanism. Here coexistence occurs due to the ability of a second generation of offspring of the rarer species to escape competition from their ancestors. We diagnose the conditions under which each of three spatial coexistence mechanisms operates and their characteristic spatial signatures. Deriving insights from a novel metric — ecological pressure — we demonstrate that coexistence is not solely determined by features of the numerically-dominant species. This results in a common framework for predicting, given any pair of species and knowledge of the relevant parameters, whether they will coexist, the mechanism by which they will do so, and the resultant spatial pattern of the community. Spatial coexistence arises from complementary combinations of traits in each

  1. Flight control and landing precision in the nocturnal bee Megalopta is robust to large changes in light intensity.

    Science.gov (United States)

    Baird, Emily; Fernandez, Diana C; Wcislo, William T; Warrant, Eric J

    2015-01-01

Like their diurnal relatives, Megalopta genalis use visual information to control flight. Unlike their diurnal relatives, however, they do this at extremely low light intensities. Although Megalopta has developed optical specializations to increase visual sensitivity, theoretical studies suggest that this enhanced sensitivity does not enable them to capture enough light to use visual information to reliably control flight in the rainforest at night. It has been proposed that Megalopta gain extra sensitivity by summing visual information over time. While enhancing the reliability of vision, this strategy would decrease the accuracy with which they can detect image motion, a crucial cue for flight control. Here, we test this temporal summation hypothesis by investigating how Megalopta's flight control and landing precision are affected by light intensity and compare our findings with the results of similar experiments performed on the diurnal bumblebee Bombus terrestris, to explore the extent to which Megalopta's adaptations to dim light affect their precision. We find that, unlike Bombus, light intensity does not affect flight and landing precision in Megalopta. Overall, we find little evidence that Megalopta uses a temporal summation strategy in dim light, while we find strong support for the use of this strategy in Bombus.

  2. Cross-Industry Spatially Localized Innovation Networks

    Directory of Open Access Journals (Sweden)

    Aleksandr Evseevich Karlik

    2016-12-01

Full Text Available This article develops a conceptual approach to studying the key decision-making factors and regularities of cross-industry spatially localized innovation networks, drawing on quantitative and qualitative data from the St. Petersburg Innovation and Technology Cluster of Machinery Manufacturing and Metalworking. The paper builds on previous research findings that such networks present both opportunities and constraints for innovation. The hypothesis is that in clusters, which represent a special type of these networks, spatial proximity partly offsets the negative impact of industrial distance. The authors propose a structural and logical model of strategic decision-making to analyze these effects on innovation, specifying the network influences on performance: cognitive diversity; knowledge and expertise; structural autonomy and equivalence. The model is applied to a spatially localized cross-industry cluster and then refined in light of the results to account for resource flows. This made it possible to capture the dynamics of innovation activity and to develop practical implications for the particular business context. The analysis identified the peculiarities of spatially localized cross-industry innovation cooperation in terms of combining tangible resources, information, and other intangible resources for the renewal of mature industries. The research results can be used in business as well as in industrial and regional economic policy. In conclusion, the article outlines future research directions: a comprehensive empirical study analyzing data on the factors of cross-industry cooperation identified in this paper and testing their causal relations, and the development of an approach to studying spatially localized networks based on the exchange of primary resources within a framework of economic system stability.

  3. Getting nowhere fast: trade-off between speed and precision in training to execute image-guided hand-tool movements

    Directory of Open Access Journals (Sweden)

    Anil Ufuk Batmaz

    2016-11-01

Full Text Available Abstract Background The speed and precision with which objects are moved by hand or by hand-tool interaction under image guidance depend on a specific type of visual and spatial sensorimotor learning. Novices have to learn to optimally control what their hands are doing in a real-world environment while looking at an image representation of the scene on a video monitor. Previous research has shown slower task execution times and lower performance scores under image guidance compared with situations of direct action viewing. The cognitive processes for overcoming this drawback through training are not yet understood. Methods We investigated the effects of training on the time and precision of direct-view versus image-guided object positioning on targets of a Real-world Action Field (RAF). Two men and two women had to learn to perform the task as swiftly and as precisely as possible with their dominant hand, with or without a tool and with or without wearing a glove. Individuals were trained in sessions of mixed trial blocks with no feedback. Results As predicted, image guidance produced significantly slower times and lower precision in all trainees and sessions compared with direct viewing. With training, all trainees got faster in all conditions, but only one of them became reliably more precise in the image-guided conditions. Speed-accuracy trade-offs in the individual performance data show that the highest precision scores and the steepest learning curves, for both time and precision, were produced by the slowest starter. Fast starters produced consistently poorer precision scores in all sessions. The fastest starter showed no sign of stable precision learning, even after extended training. Conclusions Performance evolution towards optimal precision is compromised when novices start by going as fast as they can. The findings have direct implications for individual skill monitoring in training programmes for image-guided technology applications with human operators.

  4. Performance of the ATLAS Precision Muon Chambers under LHC Operating Conditions

    CERN Document Server

    Deile, M.; Dubbert, J; Horvat, S; Kortner, O; Kroha, H; Manz, A; Mohrdieck, S; Rauscher, F; Richter, Robert; Staude, A

    2004-01-01

For the muon spectrometer of the ATLAS detector at the Large Hadron Collider (LHC), large drift chambers consisting of 6 to 8 layers of pressurized drift tubes are used for precision tracking covering an active area of 5000 m2 in the toroidal field of superconducting air-core magnets. The chambers have to provide a spatial resolution of 41 microns with an Ar:CO2 (93:7) gas mixture at an absolute pressure of 3 bar and a gas gain of 2×10^4. The environment in which the chambers will be operated is characterized by high neutron and photon background with counting rates of up to 100 per square cm and second. The resolution and efficiency of a chamber from the serial production for ATLAS have been investigated in a 100 GeV muon beam at photon irradiation rates as expected during LHC operation. A silicon strip detector telescope was used as external reference in the beam. The spatial resolution of a chamber is degraded by 4 μm at the highest background rate. The detection efficiency of the drift tubes is unchanged under irradiation...

  5. High precision during food recruitment of experienced (reactivated) foragers in the stingless bee Scaptotrigona mexicana (Apidae, Meliponini)

    Science.gov (United States)

    Sánchez, Daniel; Nieh, James C.; Hénaut, Yann; Cruz, Leopoldo; Vandame, Rémy

Several studies have examined the existence of recruitment communication mechanisms in stingless bees. However, the spatial accuracy of location-specific recruitment has not been examined. Moreover, the location-specific recruitment of reactivated foragers, i.e., foragers that have previously experienced the same food source at a different location and time, has not been explicitly examined. However, such foragers may also play a significant role in colony foraging, particularly in small colonies. Here we report that reactivated Scaptotrigona mexicana foragers can recruit with high precision to a specific food location. The recruitment precision of reactivated foragers was evaluated by placing control feeders to the left and the right of the training feeder (direction-precision tests) and between the nest and the training feeder and beyond it (distance-precision tests). Reactivated foragers arrived at the correct location with high precision: 98.44% arrived at the training feeder in the direction trials (five-feeder fan-shaped array, accuracy of at least ±6° of azimuth at 50 m from the nest), and 88.62% arrived at the training feeder in the distance trials (five-feeder linear array, accuracy of at least ±5 m or ±10% at 50 m from the nest). Thus, S. mexicana reactivated foragers can find the indicated food source at a specific distance and direction with high precision, higher than that shown by honeybees, Apis mellifera, which do not communicate food location at such close distances to the nest.
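The reported angular accuracy can be related to a lateral distance at the feeder array. As a rough consistency check (a sketch, not a calculation from the paper), an azimuthal error of ±6° at 50 m from the nest corresponds to a lateral offset of roughly 50·tan(6°) ≈ 5.3 m, the same order as the ±5 m (±10%) spacing used in the distance trials:

```python
import math

def lateral_offset(distance_m, angle_deg):
    """Lateral offset (m) implied by an angular error at a given range."""
    return distance_m * math.tan(math.radians(angle_deg))

offset = lateral_offset(50, 6)  # about 5.25 m for +/-6 degrees at 50 m
```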

  6. Anosognosia as motivated unawareness: the 'defence' hypothesis revisited.

    Science.gov (United States)

    Turnbull, Oliver H; Fotopoulou, Aikaterini; Solms, Mark

    2014-12-01

    Anosognosia for hemiplegia has seen a century of almost continuous research, yet a definitive understanding of its mechanism remains elusive. Essentially, anosognosic patients hold quasi-delusional beliefs about their paralysed limbs, in spite of all the contrary evidence, repeated questioning, and logical argument. We review a range of findings suggesting that emotion and motivation play an important role in anosognosia. We conclude that anosognosia involves (amongst other things) a process of psychological defence. This conclusion stems from a wide variety of clinical and experimental investigations, including data on implicit awareness of deficit, fluctuations in awareness over time, and dramatic effects upon awareness of psychological interventions such as psychotherapy, reframing of the emotional consequences of the paralysis, and first versus third person perspectival manipulations. In addition, we review and refute the (eight) arguments historically raised against the 'defence' hypothesis, including the claim that a defence-based account cannot explain the lateralised nature of the disorder. We argue that damage to a well-established right-lateralised emotion regulation system, with links to psychological processes that appear to underpin allocentric spatial cognition, plays a key role in anosognosia (at least in some patients). We conclude with a discussion of implications for clinical practice. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. A Dopamine Hypothesis of Autism Spectrum Disorder.

    Science.gov (United States)

    Pavăl, Denis

    2017-01-01

    Autism spectrum disorder (ASD) comprises a group of neurodevelopmental disorders characterized by social deficits and stereotyped behaviors. While several theories have emerged, the pathogenesis of ASD remains unknown. Although studies report dopamine signaling abnormalities in autistic patients, a coherent dopamine hypothesis which could link neurobiology to behavior in ASD is currently lacking. In this paper, we present such a hypothesis by proposing that autistic behavior arises from dysfunctions in the midbrain dopaminergic system. We hypothesize that a dysfunction of the mesocorticolimbic circuit leads to social deficits, while a dysfunction of the nigrostriatal circuit leads to stereotyped behaviors. Furthermore, we discuss 2 key predictions of our hypothesis, with emphasis on clinical and therapeutic aspects. First, we argue that dopaminergic dysfunctions in the same circuits should associate with autistic-like behavior in nonautistic subjects. Concerning this, we discuss the case of PANDAS (pediatric autoimmune neuropsychiatric disorder associated with streptococcal infections) which displays behaviors similar to those of ASD, presumed to arise from dopaminergic dysfunctions. Second, we argue that providing dopamine modulators to autistic subjects should lead to a behavioral improvement. Regarding this, we present clinical studies of dopamine antagonists which seem to have improving effects on autistic behavior. Furthermore, we explore the means of testing our hypothesis by using neuroreceptor imaging, which could provide comprehensive evidence for dopamine signaling dysfunctions in autistic subjects. Lastly, we discuss the limitations of our hypothesis. Along these lines, we aim to provide a dopaminergic model of ASD which might lead to a better understanding of the ASD pathogenesis. © 2017 S. Karger AG, Basel.

  8. The challenge of building large area, high precision small-strip Thin Gap Trigger Chambers for the upgrade of the ATLAS experiment

    CERN Document Server

    Maleev, Victor; The ATLAS collaboration

    2015-01-01

The current innermost stations of the ATLAS muon endcap system must be upgraded in 2018 and 2019 to retain the good precision tracking and trigger capabilities in the high background environment expected with the upcoming luminosity increase of the LHC. Large area small-strip Thin Gap Chambers (sTGC) up to 2 m2 in size and totaling an active area of 1200 m2 will be employed for fast and precise triggering. The precision reconstruction of tracks requires a spatial resolution of about 100 μm to allow the Level-1 trigger track segments to be reconstructed with an angular resolution of 1 mrad. The upgraded detector will consist of eight layers each of Micromegas and sTGC detectors, together forming the ATLAS New Small Wheels. The position of each strip must be known with an accuracy of 30 µm along the precision coordinate and 80 µm along the beam. On such large area detectors, the mechanical precision is a key point and must therefore be controlled and monitored throughout the process of construction and integrati...

  9. The Challenge of Building Large Area, High Precision Small-Strip Thin Gap Trigger Chambers for the Upgrade of the ATLAS Experiment

    CERN Document Server

    Maleev, Victor; The ATLAS collaboration

    2015-01-01

The current innermost stations of the ATLAS muon end-cap system must be upgraded in 2018 and 2019 to retain the good precision tracking and trigger capabilities in the high background environment expected with the upcoming luminosity increase of the LHC. Large area small-strip Thin Gap Chambers (sTGC) up to 2 $m^2$ in size and totaling an active area of 1200 $m^2$ will be employed for fast and precise triggering. The precision reconstruction of tracks requires a spatial resolution of about 100 $\mu m$ while the Level-1 trigger track segments need to be reconstructed with an angular resolution of 1 mrad. The upgraded detector will consist of eight layers each of Micromegas and sTGC detectors, together forming the ATLAS New Small Wheels. The position of each strip must be known with an accuracy of 40 $\mu m$ along the precision coordinate and 80 $\mu m$ along the beam. On such large area detectors, the mechanical precision is a key point and must therefore be controlled and monitored throughout the process of cons...

  10. High-precision half-life measurements of the T=1/2 mirror beta decays F-17 and Cl-33

    OpenAIRE

    Grinyer, J; Grinyer, G. F; Babo, Mathieu; Bouzomita, H; Chauveau, P; Delahaye, P; Dubois, M; Frigot, R; Jardin, P; Leboucher, C; Maunoury, L; Seiffert, C; Thomas, J. C; Traykov, E

    2015-01-01

Background: Measurements of the ft values for T=1/2 mirror β+ decays offer a method to test the conserved vector current hypothesis and to determine Vud, the up-down matrix element of the Cabibbo-Kobayashi-Maskawa matrix. In most mirror decays used for these tests, uncertainties in the ft values are dominated by the uncertainties in the half-lives. Purpose: Two precision half-life measurements were performed for the T=1/2 β+ emitters, F-17 and Cl-33, in order to eliminate the half-life as th...
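A half-life measurement of this kind ultimately reduces to fitting an exponential decay to time-binned counts. A minimal sketch (illustrative only, with a made-up half-life value; real analyses model background, dead time, and use maximum-likelihood fits) extracts t1/2 from the slope of log counts versus time:

```python
import numpy as np

def fit_half_life(times, counts):
    """Estimate a half-life from decay-curve data.

    Fits log(counts) = log(N0) - lambda*t by least squares, then
    converts the decay constant to t_1/2 = ln(2) / lambda.
    """
    slope, _ = np.polyfit(times, np.log(counts), 1)
    return np.log(2) / -slope

# Synthetic, noiseless decay curve with a hypothetical 64.5 s half-life
t = np.linspace(0, 300, 50)
lam = np.log(2) / 64.5
counts = 1e6 * np.exp(-lam * t)
```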

  11. High-precision half-life measurements of the T=1/2 mirror β decays F-17 and Cl-33

    OpenAIRE

    Grinyer, J; Grinyer, G F; Babo, M; Bouzomita, H; Chauveau, P; Delahaye, P; Dubois, M; Frigot, R; Jardin, P; Leboucher, C; Maunoury, L; Seiffert, C; Thomas, J C; Traykov, E

    2015-01-01

Background: Measurements of the ft values for T=1/2 mirror β+ decays offer a method to test the conserved vector current hypothesis and to determine Vud, the up-down matrix element of the Cabibbo-Kobayashi-Maskawa matrix. In most mirror decays used for these tests, uncertainties in the ft values are dominated by the uncertainties in the half-lives. Purpose: Two precision half-life measurements were performed for the T=1/2 β+ emitters, F-17 and Cl-33, in order to eliminate the half-life as the le...

  12. A precision measurement of the mass of the top quark

    International Nuclear Information System (INIS)

    Abazov, V.M.

    2004-01-01

The standard model of particle physics contains parameters -- such as particle masses -- whose origins are still unknown and which cannot be predicted, but whose values are constrained through their interactions. In particular, the masses of the top quark (Mt) and W boson (MW) constrain the mass of the long-hypothesized, but thus far not observed, Higgs boson. A precise measurement of Mt can therefore indicate where to look for the Higgs, and indeed whether the hypothesis of a standard model Higgs is consistent with experimental data. As top quarks are produced in pairs and decay in only about 10^-24 s into various final states, reconstructing their masses from their decay products is very challenging. Here we report a technique that extracts more information from each top-quark event and yields a greatly improved precision (of ±5.3 GeV/c^2) when compared to previous measurements. When our new result is combined with our published measurement in a complementary decay mode and with the only other measurements available, the new world average for Mt becomes 178.0 ± 4.3 GeV/c^2. As a result, the most likely Higgs mass increases from the experimentally excluded value of 96 to 117 GeV/c^2, which is beyond current experimental sensitivity. The upper limit on the Higgs mass at the 95% confidence level is raised from 219 to 251 GeV/c^2.
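A world average of this kind is produced by an inverse-variance weighted combination of independent measurements. The sketch below illustrates the mechanics with made-up inputs (it ignores the correlated systematic uncertainties that the real top-mass combination must treat carefully):

```python
import math

def combine(measurements):
    """Inverse-variance weighted average of (value, uncertainty) pairs.

    Assumes independent Gaussian uncertainties; correlated systematics
    would require a full covariance treatment.
    """
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    mean = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    error = math.sqrt(1.0 / sum(weights))
    return mean, error

# Hypothetical top-mass-like inputs in GeV/c^2 (not the actual measurements)
m, e = combine([(180.1, 5.3), (176.1, 6.6)])
```

Note that the combined uncertainty is always smaller than the smallest individual one, which is why adding even a less precise measurement sharpens the average.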

  13. Testing competing forms of the Milankovitch hypothesis

    DEFF Research Database (Denmark)

    Kaufmann, Robert K.; Juselius, Katarina

    2016-01-01

    We test competing forms of the Milankovitch hypothesis by estimating the coefficients and diagnostic statistics for a cointegrated vector autoregressive model that includes 10 climate variables and four exogenous variables for solar insolation. The estimates are consistent with the physical...... ice volume and solar insolation. The estimated adjustment dynamics show that solar insolation affects an array of climate variables other than ice volume, each at a unique rate. This implies that previous efforts to test the strong form of the Milankovitch hypothesis by examining the relationship...... that the latter is consistent with a weak form of the Milankovitch hypothesis and that it should be restated as follows: Internal climate dynamics impose perturbations on glacial cycles that are driven by solar insolation. Our results show that these perturbations are likely caused by slow adjustment between land...

  14. Dynamical agents' strategies and the fractal market hypothesis

    Czech Academy of Sciences Publication Activity Database

    Vácha, Lukáš; Vošvrda, Miloslav

    2005-01-01

    Roč. 14, č. 2 (2005), s. 172-179 ISSN 1210-0455 Grant - others:GA UK(CZ) 454/2004/A EK/FSV Institutional research plan: CEZ:AV0Z10750506 Keywords : efficient market hypothesis * fractal market hypothesis * agent's investment horizons Subject RIV: AH - Economics

  15. Multivariate Receptor Models for Spatially Correlated Multipollutant Data

    KAUST Repository

    Jun, Mikyoung

    2013-08-01

    The goal of multivariate receptor modeling is to estimate the profiles of major pollution sources and quantify their impacts based on ambient measurements of pollutants. Traditionally, multivariate receptor modeling has been applied to multiple air pollutant data measured at a single monitoring site or measurements of a single pollutant collected at multiple monitoring sites. Despite the growing availability of multipollutant data collected from multiple monitoring sites, there has not yet been any attempt to incorporate spatial dependence that may exist in such data into multivariate receptor modeling. We propose a spatial statistics extension of multivariate receptor models that enables us to incorporate spatial dependence into estimation of source composition profiles and contributions given the prespecified number of sources and the model identification conditions. The proposed method yields more precise estimates of source profiles by accounting for spatial dependence in the estimation. More importantly, it enables predictions of source contributions at unmonitored sites as well as when there are missing values at monitoring sites. The method is illustrated with simulated data and real multipollutant data collected from eight monitoring sites in Harris County, Texas. Supplementary materials for this article, including data and R code for implementing the methods, are available online on the journal web site. © 2013 Copyright Taylor and Francis Group, LLC.
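At its core, a (non-spatial) multivariate receptor model approximates the data matrix X (samples by pollutants) as a product of nonnegative source contributions and source profiles, X ≈ W H. A bare-bones sketch using Lee-Seung multiplicative updates follows; this is generic nonnegative matrix factorization, not the spatially dependent estimator proposed in the article:

```python
import numpy as np

def receptor_nmf(X, n_sources, iters=1000, seed=0):
    """Factor X (samples x pollutants) as W @ H with W, H >= 0.

    W holds source contributions per sample, H the source profiles.
    Multiplicative updates minimize the Frobenius reconstruction error.
    """
    rng = np.random.default_rng(seed)
    W = rng.random((X.shape[0], n_sources)) + 0.1
    H = rng.random((n_sources, X.shape[1])) + 0.1
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H
```

As in receptor modeling generally, the factorization is only identifiable up to scaling and rotation, which is why the article's prespecified identification conditions matter.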

  16. Investigating the environmental Kuznets curve hypothesis in Vietnam

    International Nuclear Information System (INIS)

    Al-Mulali, Usama; Saboori, Behnaz; Ozturk, Ilhan

    2015-01-01

    This study investigates the existence of the environmental Kuznets curve (EKC) hypothesis in Vietnam during the period 1981–2011. To realize the goals of this study, a pollution model was established applying the Autoregressive Distributed Lag (ARDL) methodology. The results revealed that the pollution haven hypothesis does exist in Vietnam because capital increases pollution. In addition, imports also increase pollution which indicates that most of Vietnam's imported products are energy intensive and highly polluted. However, exports have no effect on pollution which indicates that the level of exports is not significant enough to affect pollution. Moreover, fossil fuel energy consumption increases pollution while renewable energy consumption has no significant effect in reducing pollution. Furthermore, labor force reduces pollution since most of Vietnam's labor force is in the agricultural and services sectors which are less energy intensive than the industrial sector. Based on the obtained results, the EKC hypothesis does not exist because the relationship between GDP and pollution is positive in both the short and long run. - Highlights: • The environmental Kuznets curve (EKC) hypothesis in Vietnam is investigated. • The Autoregressive Distributed Lag (ARDL) methodology was utilized. • The EKC hypothesis does not exist
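The EKC test ultimately comes down to the sign pattern of a quadratic income term: an inverted U requires a positive coefficient on income and a negative one on income squared, with the turning point inside the sample. A toy version of that check on synthetic data (the article itself uses an ARDL model with additional regressors, which this sketch does not reproduce):

```python
import numpy as np

def ekc_coefficients(income, emissions):
    """OLS fit of emissions = b0 + b1*income + b2*income^2.

    An inverted-U (EKC) pattern needs b1 > 0 and b2 < 0; the implied
    turning point of emissions is at income = -b1 / (2*b2).
    """
    X = np.column_stack([np.ones_like(income), income, income ** 2])
    b, *_ = np.linalg.lstsq(X, emissions, rcond=None)
    return b, -b[1] / (2 * b[2])

# Synthetic inverted-U data with a known peak at income = 5
income = np.linspace(1, 10, 50)
emissions = 2 + 4 * income - 0.4 * income ** 2
b, peak = ekc_coefficients(income, emissions)
```

For Vietnam, the article's finding of a positive income-pollution relation in both the short and long run corresponds to the b2 < 0 condition failing.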

  17. Social learning and evolution: the cultural intelligence hypothesis

    Science.gov (United States)

    van Schaik, Carel P.; Burkart, Judith M.

    2011-01-01

    If social learning is more efficient than independent individual exploration, animals should learn vital cultural skills exclusively, and routine skills faster, through social learning, provided they actually use social learning preferentially. Animals with opportunities for social learning indeed do so. Moreover, more frequent opportunities for social learning should boost an individual's repertoire of learned skills. This prediction is confirmed by comparisons among wild great ape populations and by social deprivation and enculturation experiments. These findings shaped the cultural intelligence hypothesis, which complements the traditional benefit hypotheses for the evolution of intelligence by specifying the conditions in which these benefits can be reaped. The evolutionary version of the hypothesis argues that species with frequent opportunities for social learning should more readily respond to selection for a greater number of learned skills. Because improved social learning also improves asocial learning, the hypothesis predicts a positive interspecific correlation between social-learning performance and individual learning ability. Variation among primates supports this prediction. The hypothesis also predicts that more heavily cultural species should be more intelligent. Preliminary tests involving birds and mammals support this prediction too. The cultural intelligence hypothesis can also account for the unusual cognitive abilities of humans, as well as our unique mechanisms of skill transfer. PMID:21357223

  18. Social learning and evolution: the cultural intelligence hypothesis.

    Science.gov (United States)

    van Schaik, Carel P; Burkart, Judith M

    2011-04-12

    If social learning is more efficient than independent individual exploration, animals should learn vital cultural skills exclusively, and routine skills faster, through social learning, provided they actually use social learning preferentially. Animals with opportunities for social learning indeed do so. Moreover, more frequent opportunities for social learning should boost an individual's repertoire of learned skills. This prediction is confirmed by comparisons among wild great ape populations and by social deprivation and enculturation experiments. These findings shaped the cultural intelligence hypothesis, which complements the traditional benefit hypotheses for the evolution of intelligence by specifying the conditions in which these benefits can be reaped. The evolutionary version of the hypothesis argues that species with frequent opportunities for social learning should more readily respond to selection for a greater number of learned skills. Because improved social learning also improves asocial learning, the hypothesis predicts a positive interspecific correlation between social-learning performance and individual learning ability. Variation among primates supports this prediction. The hypothesis also predicts that more heavily cultural species should be more intelligent. Preliminary tests involving birds and mammals support this prediction too. The cultural intelligence hypothesis can also account for the unusual cognitive abilities of humans, as well as our unique mechanisms of skill transfer.

  19. Hippocampal structure and human cognition: key role of spatial processing and evidence supporting the efficiency hypothesis in females

    Science.gov (United States)

    Colom, Roberto; Stein, Jason L.; Rajagopalan, Priya; Martínez, Kenia; Hermel, David; Wang, Yalin; Álvarez-Linera, Juan; Burgaleta, Miguel; Quiroga, MªÁngeles; Shih, Pei Chun; Thompson, Paul M.

    2014-01-01

    Here we apply a method for automated segmentation of the hippocampus in 3D high-resolution structural brain MRI scans. One hundred and four healthy young adults completed twenty one tasks measuring abstract, verbal, and spatial intelligence, along with working memory, executive control, attention, and processing speed. After permutation tests corrected for multiple comparisons across vertices (p related to hippocampal structural differences. PMID:25632167

  20. Personal Hypothesis Testing: The Role of Consistency and Self-Schema.

    Science.gov (United States)

    Strohmer, Douglas C.; And Others

    1988-01-01

    Studied how individuals test hypotheses about themselves. Examined extent to which Snyder's bias toward confirmation persists when negative or nonconsistent personal hypothesis is tested. Found negativity or positivity did not affect hypothesis testing directly, though hypothesis consistency did. Found cognitive schematic variable (vulnerability…

  1. Evaluating the Stage Learning Hypothesis.

    Science.gov (United States)

    Thomas, Hoben

    1980-01-01

    A procedure for evaluating the Genevan stage learning hypothesis is illustrated by analyzing Inhelder, Sinclair, and Bovet's guided learning experiments (in "Learning and the Development of Cognition." Cambridge: Harvard University Press, 1974). (Author/MP)

  2. Simulation of the K-function in the analysis of spatial clustering for non-randomly distributed locations-Exemplified by bovine virus diarrhoea virus (BVDV) infection in Denmark

    DEFF Research Database (Denmark)

    Ersbøll, Annette Kjær; Ersbøll, Bjarne Kjær

    2009-01-01

    -infected (N-N+)). The differences between the empirical and the estimated null-hypothesis version of the K-function are plotted together with the 95% simulation envelopes versus the distance, h. In this way we test if the spatial distribution of the infected herds differs from the spatial distribution...
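The envelope test described here can be sketched in its simplest form: compute an (uncorrected) Ripley's K for the observed pattern, simulate the null hypothesis many times, and take pointwise quantiles as the 95% simulation envelope. Below is a minimal illustration for complete spatial randomness on a unit square; the article's null hypothesis is the more involved random labelling of fixed herd locations, which this sketch does not implement:

```python
import numpy as np

def ripley_k(points, r, area=1.0):
    """Naive Ripley's K at distance r (no edge correction)."""
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    pairs = (d < r).sum() - n  # drop the zero self-distances
    return area * pairs / (n * (n - 1))

def csr_envelope(n, r, nsim=199, seed=0):
    """Pointwise 95% simulation envelope for K(r) under CSR."""
    rng = np.random.default_rng(seed)
    sims = [ripley_k(rng.random((n, 2)), r) for _ in range(nsim)]
    return np.quantile(sims, 0.025), np.quantile(sims, 0.975)
```

A pattern whose empirical K(r) lies above the upper envelope is more clustered than the null hypothesis predicts; one below the lower envelope is more regular.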

  3. Probing spatial locality in ionic liquids with the grand canonical adaptive resolution molecular dynamics technique

    Science.gov (United States)

    Shadrack Jabes, B.; Krekeler, C.; Klein, R.; Delle Site, L.

    2018-05-01

    We employ the Grand Canonical Adaptive Resolution Simulation (GC-AdResS) molecular dynamics technique to test the spatial locality of the 1-ethyl 3-methyl imidazolium chloride liquid. In GC-AdResS, atomistic details are kept only in an open sub-region of the system while the environment is treated at coarse-grained level; thus, if spatial quantities calculated in such a sub-region agree with the equivalent quantities calculated in a full atomistic simulation, then the atomistic degrees of freedom outside the sub-region play a negligible role. The size of the sub-region fixes the degree of spatial locality of a certain quantity. We show that even for sub-regions whose radius corresponds to the size of a few molecules, spatial properties are reasonably reproduced thus suggesting a higher degree of spatial locality, a hypothesis put forward also by other researchers and that seems to play an important role for the characterization of fundamental properties of a large class of ionic liquids.

  4. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  5. Laser precision microfabrication

    CERN Document Server

    Sugioka, Koji; Pique, Alberto

    2010-01-01

    Miniaturization and high precision are rapidly becoming a requirement for many industrial processes and products. As a result, there is greater interest in the use of laser microfabrication technology to achieve these goals. This book, composed of 16 chapters, covers all the topics of laser precision processing, from fundamental aspects to industrial applications, for both inorganic and biological materials. It reviews the state of the art of research and technological development in the area of laser processing.

  6. Multi-agent sequential hypothesis testing

    KAUST Repository

    Kim, Kwang-Ki K.; Shamma, Jeff S.

    2014-01-01

    incorporate costs of taking private/public measurements, costs of time-difference and disagreement in actions of agents, and costs of false declaration/choices in the sequential hypothesis testing. The corresponding sequential decision processes have well

  7. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed that is the least favorable to H0.
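    The "least favorable to H0" idea can be made concrete in a toy normal-mean problem: H0 fixes μ = 0 while H1 draws μ from N(0, τ²), and the Bayes factor BF01 is minimized over τ². A sketch with invented numbers, not the computation on the slides:

```python
import math

def bf01(xbar, n, sigma2, tau2):
    """Bayes factor for H0: mu = 0 vs H1: mu ~ N(0, tau2), given a
    sample mean xbar ~ N(mu, sigma2 / n): ratio of marginal densities."""
    v0 = sigma2 / n                 # sampling variance under H0
    v1 = v0 + tau2                  # marginal variance under H1
    p0 = math.exp(-xbar ** 2 / (2 * v0)) / math.sqrt(2 * math.pi * v0)
    p1 = math.exp(-xbar ** 2 / (2 * v1)) / math.sqrt(2 * math.pi * v1)
    return p0 / p1

# Scan tau2 for the value least favorable to H0 (the minimum of BF01).
xbar, n, sigma2 = 0.5, 25, 1.0
grid = [k / 100 for k in range(1, 501)]
tau2_min = min(grid, key=lambda t: bf01(xbar, n, sigma2, t))
bound = bf01(xbar, n, sigma2, tau2_min)  # lower bound on evidence for H0
```

    Analytically the minimizer satisfies v0 + τ² = x̄², i.e. τ² = 0.25 − 0.04 = 0.21 in this example, which the grid search recovers.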

  8. TESTING THE HYPOTHESIS THAT METHANOL MASER RINGS TRACE CIRCUMSTELLAR DISKS: HIGH-RESOLUTION NEAR-INFRARED AND MID-INFRARED IMAGING

    International Nuclear Information System (INIS)

    De Buizer, James M.; Bartkiewicz, Anna; Szymczak, Marian

    2012-01-01

    Milliarcsecond very long baseline interferometry maps of regions containing 6.7 GHz methanol maser emission have led to the recent discovery of ring-like distributions of maser spots and the plausible hypothesis that they may be tracing circumstellar disks around forming high-mass stars. We aimed to test this hypothesis by imaging these regions in the near- and mid-infrared at high spatial resolution and comparing the observed emission to the expected infrared morphologies as inferred from the geometries of the maser rings. In the near-infrared we used the Gemini North adaptive optics system of ALTAIR/NIRI, while in the mid-infrared we used the combination of the Gemini South instrument T-ReCS and super-resolution techniques. Resultant images had a resolution of ∼150 mas in both the near-infrared and mid-infrared. We discuss the expected distribution of circumstellar material around young and massive accreting (proto)stars and what infrared emission geometries would be expected for the different maser ring orientations under the assumption that the masers are coming from within circumstellar disks. Based upon the observed infrared emission geometries for the four targets in our sample and the results of spectral energy distribution modeling of the massive young stellar objects associated with the maser rings, we do not find compelling evidence in support of the hypothesis that methanol maser rings reside in circumstellar disks.

  9. Precision digital control systems

    Science.gov (United States)

    Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.

    This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.

  10. Spatial heterogeneity of climate change as an experiential basis for skepticism.

    Science.gov (United States)

    Kaufmann, Robert K; Mann, Michael L; Gopal, Sucharita; Liederman, Jackie A; Howe, Peter D; Pretis, Felix; Tang, Xiaojing; Gilmore, Michelle

    2017-01-03

    We postulate that skepticism about climate change is partially caused by the spatial heterogeneity of climate change, which exposes experiential learners to climate heuristics that differ from the global average. This hypothesis is tested by formalizing an index that measures local changes in climate using station data and comparing this index with survey-based model estimates of county-level opinion about whether global warming is happening. Results indicate that more stations exhibit cooling and warming than predicted by random chance and that spatial variations in these changes can account for spatial variations in the percentage of the population that believes that "global warming is happening." This effect is diminished in areas that have experienced more record low temperatures than record highs since 2005. Together, these results suggest that skepticism about climate change is driven partially by personal experiences; an accurate heuristic for local changes in climate identifies obstacles to communicating ongoing changes in climate to the public and how these communications might be improved.

  11. New Hypothesis for SOFC Ceramic Oxygen Electrode Mechanisms

    DEFF Research Database (Denmark)

    Mogensen, Mogens Bjerg; Chatzichristodoulou, Christodoulos; Graves, Christopher R.

    2016-01-01

    A new hypothesis for the electrochemical reaction mechanism in solid oxide cell ceramic oxygen electrodes is proposed based on literature including our own results. The hypothesis postulates that the observed thin layers of SrO-La2O3 on top of ceramic perovskite and other Ruddlesden-Popper...

  12. The Purchasing Power Parity Hypothesis:

    African Journals Online (AJOL)

    2011-10-02

    Oct 2, 2011 ... reject the unit root hypothesis in real exchange rates may simply be due to the shortness ... Violations of Purchasing Power Parity and Their Implications for Efficient ... Official Intervention in the Foreign Exchange Market: ...

  13. Can short-term oral fine motor training affect precision of task performance and induce cortical plasticity of the jaw muscles?

    DEFF Research Database (Denmark)

    Hong, Zhang; Kumar, Abhishek; Kothari, Mohit

    2016-01-01

    The aim was to test the hypothesis that short-term oral sensorimotor training of the jaw muscles would increase the precision of task performance and induce neuroplastic changes in the corticomotor pathways, related to the masseter muscle. Fifteen healthy volunteers performed six series with ten trials of an oral sensorimotor task. The task was to manipulate and position a spherical chocolate candy in between the anterior teeth and split it into two equal halves. The precision of the task performance was evaluated by comparing the ratio between the two split halves. A series of "hold-and-split" tasks was also performed before and after the training. The hold force and split force along with the electromyographic (EMG) activity of jaw muscles were recorded. Motor-evoked potentials and cortical motor maps of the right masseter muscle were evoked by transcranial magnetic stimulation...

  14. Toward micro-scale spatial modeling of gentrification

    Science.gov (United States)

    O'Sullivan, David

    A simple preliminary model of gentrification is presented. The model is based on an irregular cellular automaton architecture drawing on the concept of proximal space, which is well suited to the spatial externalities present in housing markets at the local scale. The rent gap hypothesis on which the model's cell transition rules are based is discussed. The model's transition rules are described in detail. Practical difficulties in configuring and initializing the model are described and its typical behavior reported. Prospects for further development of the model are discussed. The current model structure, while inadequate, is well suited to further elaboration and the incorporation of other interesting and relevant effects.

  15. Spatial autocorrelation in farmland grasshopper assemblages (Orthoptera: Acrididae) in western France.

    Science.gov (United States)

    Badenhausser, I; Gouat, M; Goarant, A; Cornulier, T; Bretagnolle, V

    2012-10-01

    Agricultural intensification in western Europe has caused a dramatic loss of grassland surfaces in farmlands, which has resulted in strong declines in grassland invertebrates, leading to cascade effects at higher trophic levels among consumers of invertebrates. Grasshoppers are important components of grassland invertebrate assemblages in European agricultural ecosystems, particularly as prey for bird species. Understanding how grasshopper populations are distributed in fragmented landscapes with low grassland availability is critical for both studies in biodiversity conservation and insect management. We assessed the range and strength of spatial autocorrelation for two grasshopper taxa (Gomphocerinae subfamily and Calliptamus italicus L.) across an intensive farmland in western France. Data from surveys carried out over 8 yr in 1,715 grassland fields were analyzed using geostatistics. Weak spatial patterns were observed at small spatial scales, suggesting important local effects of management practices on grasshopper densities. Spatial autocorrelation patterns for both grasshopper taxa were only detected at intermediate scales. For Gomphocerinae, the range of spatial autocorrelation varied from 802 to 2,613 m according to the year, depending both on grasshopper density and on grassland surfaces in the study site, whereas spatial patterns for the Italian locust were more variable and not related to grasshopper density or grassland surfaces. Spatial patterns in the distribution of Gomphocerinae supported our hypothesis that habitat availability was a major driver of grasshopper distribution in the landscape, and suggested it was related to density-dependent processes such as dispersal.
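    The geostatistical tool behind such range estimates is the empirical semivariogram: half the mean squared difference between values at pairs of locations, binned by separation distance, which rises with lag until the autocorrelation range is reached. A minimal sketch on a synthetic field (the sampling design, field, and bin widths are invented, and anisotropy is ignored):

```python
import math
import random

def semivariogram(samples, lags, tol):
    """Empirical semivariogram: for each lag h, average 0.5*(z_i - z_j)^2
    over location pairs whose separation lies within h +/- tol."""
    gammas = []
    for h in lags:
        sq = [0.5 * (zi - zj) ** 2
              for pi, zi in samples for pj, zj in samples
              if pi != pj and abs(math.dist(pi, pj) - h) <= tol]
        gammas.append(sum(sq) / len(sq) if sq else float("nan"))
    return gammas

# Synthetic field with spatial structure in x plus white noise.
rng = random.Random(0)
samples = [((x, y), math.sin(2 * math.pi * x) + rng.gauss(0, 0.1))
           for x, y in ((rng.random(), rng.random()) for _ in range(300))]

lags = [0.05, 0.15, 0.25, 0.35, 0.45]
gam = semivariogram(samples, lags, 0.025)
# Semivariance increases with lag while pairs remain autocorrelated.
```

    Fitting a spherical or exponential model to such a curve is the usual way to read off a range such as the 802-2,613 m values reported above.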

  16. Adaptation hypothesis of biological efficiency of ionizing radiation

    International Nuclear Information System (INIS)

    Kudritskij, Yu.K.; Georgievskij, A.B.; Karpov, V.I.

    1992-01-01

    The adaptation hypothesis of the biological efficiency of ionizing radiation is based on acknowledgement of the invariance of fundamental laws and principles of biology, related to the unity of biota and media, evolution, and adaptation, for radiobiology. The basic arguments for the validity of the adaptation hypothesis and its correspondence to the requirements imposed on scientific hypotheses are presented.

  17. Advancing Open 3D Modelling Standards in National Spatial Information Policy

    OpenAIRE

    Trakas, A.; Janssen, P.; Stoter, J.

    2012-01-01

    Individuals and organisations around the world - facing extraordinary challenges and new opportunities - are together engaged in numerous projects, involving natural and built environments. Spatial information policy is at the heart of these projects. The information technologies available enable individuals to observe, measure, describe, map and portray these environments with increasing ease, flexibility and precision. In our time, individuals create digital geographic objects that reflect ...

  18. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses the method of testing a nonlinear hypothesis using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test the nonlinear hypothesis using the iterative NLLS estimator. An alternative method for testing a nonlinear hypothesis using the iterative NLLS estimator based on nonlinear studentized residuals has been proposed. In this research article an innovative method of testing a nonlinear hypothesis using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained the methods of testing nonlinear hypotheses. This paper uses asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide innovative methods of testing a nonlinear hypothesis using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors, and studied the problem of heteroscedasticity with reference to nonlinear regression models with a suitable illustration. William Grene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
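    A Wald statistic for a scalar nonlinear restriction g(β) = 0 takes the delta-method form W = g(β̂)² / (∇g(β̂)ᵀ V̂ ∇g(β̂)), compared against a χ²(1) critical value. The sketch below illustrates only these mechanics, on a two-regressor linear model fitted by OLS rather than the paper's iterative NLLS estimator; the data, restriction, and names are all invented:

```python
import math
import random

def ols2(X, y):
    """OLS for two regressors (no intercept): coefficient estimates and
    their covariance matrix from the normal equations."""
    s11 = sum(x1 * x1 for x1, _ in X)
    s12 = sum(x1 * x2 for x1, x2 in X)
    s22 = sum(x2 * x2 for _, x2 in X)
    t1 = sum(x1 * yi for (x1, _), yi in zip(X, y))
    t2 = sum(x2 * yi for (_, x2), yi in zip(X, y))
    det = s11 * s22 - s12 * s12
    b = ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)
    resid = [yi - b[0] * x1 - b[1] * x2 for (x1, x2), yi in zip(X, y)]
    s2 = sum(r * r for r in resid) / (len(y) - 2)
    cov = [[s2 * s22 / det, -s2 * s12 / det],
           [-s2 * s12 / det, s2 * s11 / det]]
    return b, cov

def wald(b, cov, g, grad):
    """Wald statistic for the scalar nonlinear restriction g(beta) = 0,
    via the delta method: W = g(b)^2 / (grad' cov grad)."""
    var = sum(grad[i] * cov[i][j] * grad[j]
              for i in range(2) for j in range(2))
    return g(b) ** 2 / var

# Data generated so that the restriction b1 * b2 = 2 holds (b1=1, b2=2).
rng = random.Random(42)
X = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(500)]
y = [x1 + 2.0 * x2 + rng.gauss(0, 0.5) for x1, x2 in X]
b, cov = ols2(X, y)
W = wald(b, cov, lambda b: b[0] * b[1] - 2.0, (b[1], b[0]))
# Under H0, W is asymptotically chi-square(1); W > 3.84 rejects at 5%.
```

    The gradient passed to wald is ∂(b1·b2)/∂b = (b2, b1), evaluated at the estimate.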

  19. Precision Medicine and Men's Health.

    Science.gov (United States)

    Mata, Douglas A; Katchi, Farhan M; Ramasamy, Ranjith

    2017-07-01

    Precision medicine can greatly benefit men's health by helping to prevent, diagnose, and treat prostate cancer, benign prostatic hyperplasia, infertility, hypogonadism, and erectile dysfunction. For example, precision medicine can facilitate the selection of men at high risk for prostate cancer for targeted prostate-specific antigen screening and chemoprevention administration, as well as assist in identifying men who are resistant to medical therapy for prostatic hyperplasia, who may instead require surgery. Precision medicine-trained clinicians can also let couples know whether their specific cause of infertility should be bypassed by sperm extraction and in vitro fertilization to prevent abnormalities in their offspring. Though precision medicine's role in the management of hypogonadism has yet to be defined, it could be used to identify biomarkers associated with individual patients' responses to treatment so that appropriate therapy can be prescribed. Last, precision medicine can improve erectile dysfunction treatment by identifying genetic polymorphisms that regulate response to medical therapies and by aiding in the selection of patients for further cardiovascular disease screening.

  20. Coalescent Simulation and Paleodistribution Modeling for Tabebuia rosealba Do Not Support South American Dry Forest Refugia Hypothesis.

    Directory of Open Access Journals (Sweden)

    Warita Alves de Melo

    Full Text Available Studies based on contemporary plant occurrences and pollen fossil records have proposed that the current disjunct distribution of seasonally dry tropical forests (SDTFs) across South America is the result of fragmentation of a formerly widespread and continuously distributed dry forest during the arid climatic conditions associated with the Last Glacial Maximum (LGM), which is known as the modern-day dry forest refugia hypothesis. We studied the demographic history of Tabebuia rosealba (Bignoniaceae) to understand the disjunct geographic distribution of South American SDTFs based on statistical phylogeography and ecological niche modeling (ENM). We specifically tested the dry forest refugia hypothesis; i.e., if the multiple and isolated patches of SDTFs are current climatic relicts of a widespread and continuously distributed dry forest during the LGM. We sampled 235 individuals across 18 populations in Central Brazil and analyzed the polymorphisms at chloroplast (trnS-trnG, psbA-trnH and ycf6-trnC intergenic spacers) and nuclear (ITS nrDNA) genomes. We performed coalescence simulations of alternative hypotheses under demographic expectations from two a priori biogeographic hypotheses (1. the Pleistocene Arc hypothesis and 2. a range shift to the Amazon Basin) and two other demographic expectations predicted by ENMs (3. expansion throughout Neotropical South America, including the Amazon Basin, and 4. retraction during the LGM). Phylogenetic analyses based on a median-joining network showed haplotype sharing among populations with evidence of incomplete lineage sorting. Coalescent analyses showed smaller effective population sizes for T. roseoalba during the LGM compared to the present-day. Simulations and ENM also showed that its current spatial pattern of genetic diversity is most likely due to a scenario of range retraction during the LGM instead of fragmentation from a once extensive and largely contiguous SDTF across South America, not supporting the South American dry forest refugia hypothesis.

  1. Coalescent Simulation and Paleodistribution Modeling for Tabebuia rosealba Do Not Support South American Dry Forest Refugia Hypothesis.

    Science.gov (United States)

    de Melo, Warita Alves; Lima-Ribeiro, Matheus S; Terribile, Levi Carina; Collevatti, Rosane G

    2016-01-01

    Studies based on contemporary plant occurrences and pollen fossil records have proposed that the current disjunct distribution of seasonally dry tropical forests (SDTFs) across South America is the result of fragmentation of a formerly widespread and continuously distributed dry forest during the arid climatic conditions associated with the Last Glacial Maximum (LGM), which is known as the modern-day dry forest refugia hypothesis. We studied the demographic history of Tabebuia rosealba (Bignoniaceae) to understand the disjunct geographic distribution of South American SDTFs based on statistical phylogeography and ecological niche modeling (ENM). We specifically tested the dry forest refugia hypothesis; i.e., if the multiple and isolated patches of SDTFs are current climatic relicts of a widespread and continuously distributed dry forest during the LGM. We sampled 235 individuals across 18 populations in Central Brazil and analyzed the polymorphisms at chloroplast (trnS-trnG, psbA-trnH and ycf6-trnC intergenic spacers) and nuclear (ITS nrDNA) genomes. We performed coalescence simulations of alternative hypotheses under demographic expectations from two a priori biogeographic hypotheses (1. the Pleistocene Arc hypothesis and 2. a range shift to the Amazon Basin) and two other demographic expectations predicted by ENMs (3. expansion throughout Neotropical South America, including the Amazon Basin, and 4. retraction during the LGM). Phylogenetic analyses based on a median-joining network showed haplotype sharing among populations with evidence of incomplete lineage sorting. Coalescent analyses showed smaller effective population sizes for T. roseoalba during the LGM compared to the present-day. Simulations and ENM also showed that its current spatial pattern of genetic diversity is most likely due to a scenario of range retraction during the LGM instead of fragmentation from a once extensive and largely contiguous SDTF across South America, not supporting the South American dry forest refugia hypothesis.

  2. Precision muonium spectroscopy

    International Nuclear Information System (INIS)

    Jungmann, Klaus P.

    2016-01-01

    The muonium atom is the purely leptonic bound state of a positive muon and an electron. It has a lifetime of 2.2 µs. The absence of any known internal structure provides for precision experiments to test fundamental physics theories and to determine accurate values of fundamental constants. In particular ground state hyperfine structure transitions can be measured by microwave spectroscopy to deliver the muon magnetic moment. The frequency of the 1s–2s transition in the hydrogen-like atom can be determined with laser spectroscopy to obtain the muon mass. With such measurements fundamental physical interactions, in particular quantum electrodynamics, can also be tested at highest precision. The results are important input parameters for experiments on the muon magnetic anomaly. The simplicity of the atom enables further precise experiments, such as a search for muonium–antimuonium conversion for testing charged lepton number conservation and searches for possible antigravity of muons and dark matter. (author)

  3. Spatial Linkage and Urban Expansion: AN Urban Agglomeration View

    Science.gov (United States)

    Jiao, L. M.; Tang, X.; Liu, X. P.

    2017-09-01

    Urban expansion displays different characteristics in each period. From the perspective of the urban agglomeration, studying the spatial and temporal characteristics of urban expansion plays an important role in understanding the complex relationship between urban expansion and network structure of urban agglomeration. We analyze urban expansion in the Yangtze River Delta Urban Agglomeration (YRD) through accessibility to and spatial interaction intensity from core cities as well as accessibility of road network. Results show that: (1) Correlation between urban expansion intensity and spatial indicators such as location and space syntax variables is remarkable and positive, while it decreases after rapid expansion. (2) Urban expansion velocity displays a positive correlation with spatial indicators mentioned above in the first (1980-1990) and second (1990-2000) period. However, it exhibits a negative relationship in the third period (2000-2010), i.e., cities located in the periphery of urban agglomeration developing more quickly. Consequently, the hypothesis of convergence of urban expansion in rapid expansion stage is put forward. (3) Results of Zipf's law and Gibrat's law show urban expansion in YRD displays a convergent trend in rapid expansion stage, small and medium-sized cities growing faster. This study shows that spatial linkage plays an important but evolving role in urban expansion within the urban agglomeration. In addition, it serves as a reference to the planning of Yangtze River Delta Urban Agglomeration and regulation of urban expansion of other urban agglomerations.
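    The Zipf's-law check mentioned in the abstract is usually a rank-size regression: log rank against log size, with slope −q and q ≈ 1 for a classically Zipfian system. A self-contained sketch on synthetic data (not the YRD city sizes):

```python
import math

def zipf_exponent(sizes):
    """Estimate the Zipf rank-size exponent q from the OLS slope of
    log(rank) on log(size): log(rank) = c - q * log(size)."""
    ordered = sorted(sizes, reverse=True)
    xs = [math.log(s) for s in ordered]
    ys = [math.log(r) for r in range(1, len(ordered) + 1)]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return -sxy / sxx

# An exactly Zipfian distribution (size proportional to 1/rank)
# should give an exponent of 1.
cities = [1000.0 / r for r in range(1, 51)]
q = zipf_exponent(cities)
```

    Gibrat's law is the complementary check: growth rates should be independent of initial size, e.g. a near-zero slope when regressing log growth on log initial size.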

  4. A large scale test of the gaming-enhancement hypothesis

    Directory of Open Access Journals (Sweden)

    Andrew K. Przybylski

    2016-11-01

    Full Text Available A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people’s gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.

  5. A large scale test of the gaming-enhancement hypothesis.

    Science.gov (United States)

    Przybylski, Andrew K; Wang, John C

    2016-01-01

    A growing research literature suggests that regular electronic game play and game-based training programs may confer practically significant benefits to cognitive functioning. Most evidence supporting this idea, the gaming-enhancement hypothesis, has been collected in small-scale studies of university students and older adults. This research investigated the hypothesis in a general way with a large sample of 1,847 school-aged children. Our aim was to examine the relations between young people's gaming experiences and an objective test of reasoning performance. Using a Bayesian hypothesis testing approach, evidence for the gaming-enhancement and null hypotheses were compared. Results provided no substantive evidence supporting the idea that having preference for or regularly playing commercially available games was positively associated with reasoning ability. Evidence ranged from equivocal to very strong in support for the null hypothesis over what was predicted. The discussion focuses on the value of Bayesian hypothesis testing for investigating electronic gaming effects, the importance of open science practices, and pre-registered designs to improve the quality of future work.
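    A common large-sample shortcut for this kind of null-versus-alternative comparison is the BIC approximation to the Bayes factor, BF01 ≈ exp((BIC1 − BIC0) / 2). The sketch below applies it to simulated data in which reasoning scores are unrelated to gaming hours; it is illustrative only and is not the default Bayes factor analysis the paper itself uses:

```python
import math
import random

def gauss_loglik(resid):
    """Gaussian log-likelihood at the MLE of the error variance."""
    n = len(resid)
    s2 = sum(r * r for r in resid) / n
    return -0.5 * n * (math.log(2 * math.pi * s2) + 1)

def bic(log_lik, k, n):
    """Bayesian information criterion for k free parameters."""
    return k * math.log(n) - 2 * log_lik

# Simulated sample: reasoning scores independent of weekly gaming hours.
rng = random.Random(7)
n = 1847
hours = [rng.uniform(0, 20) for _ in range(n)]
score = [rng.gauss(100, 15) for _ in range(n)]

# Null model: intercept only.
mean = sum(score) / n
r0 = [s - mean for s in score]
# Alternative model: linear effect of gaming hours (OLS on centred hours).
mh = sum(hours) / n
slope = (sum((h - mh) * (s - mean) for h, s in zip(hours, score))
         / sum((h - mh) ** 2 for h in hours))
r1 = [s - (mean + slope * (h - mh)) for h, s in zip(hours, score)]

bf01 = math.exp((bic(gauss_loglik(r1), 3, n) - bic(gauss_loglik(r0), 2, n)) / 2)
# bf01 > 1 counts as evidence for the null (no gaming effect).
```

    With a true null, the extra parameter of the alternative model is penalized by log(n), so BF01 typically favors the null, matching the paper's qualitative conclusion.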

  6. A Hypothesis-Driven Approach to Site Investigation

    Science.gov (United States)

    Nowak, W.

    2008-12-01

    Variability of subsurface formations and the scarcity of data lead to the notion of aquifer parameters as geostatistical random variables. Given an information need and limited resources for field campaigns, site investigation is often put into the context of optimal design. In optimal design, the types, numbers and positions of samples are optimized under case-specific objectives to meet the information needs. Past studies feature optimal data worth (balancing maximum financial profit in an engineering task versus the cost of additional sampling), or aim at a minimum prediction uncertainty of stochastic models for a prescribed investigation budget. Recent studies also account for other sources of uncertainty outside the hydrogeological range, such as uncertain toxicity, ingestion and behavioral parameters of the affected population when predicting the human health risk from groundwater contaminations. The current study looks at optimal site investigation from a new angle. Answering a yes/no question under uncertainty directly requires recasting the original question as a hypothesis test. Otherwise, false confidence in the resulting answer would be pretended. A straightforward example is whether a recent contaminant spill will cause contaminant concentrations in excess of a legal limit at a nearby drinking water well. This question can only be answered down to a specified chance of error, i.e., based on the significance level used in hypothesis tests. Optimal design is placed into the hypothesis-driven context by using the chance of providing a false yes/no answer as new criterion to be minimized. Different configurations apply for one-sided and two-sided hypothesis tests. If a false answer entails financial liability, the hypothesis-driven context can be re-cast in the context of data worth. The remaining difference is that failure is a hard constraint in the data worth context versus a monetary punishment term in the hypothesis-driven context. 
The basic principle

  7. The (not so) immortal strand hypothesis.

    Science.gov (United States)

    Tomasetti, Cristian; Bozic, Ivana

    2015-03-01

    Non-random segregation of DNA strands during stem cell replication has been proposed as a mechanism to minimize accumulated genetic errors in stem cells of rapidly dividing tissues. According to this hypothesis, an "immortal" DNA strand is passed to the stem cell daughter and not the more differentiated cell, keeping the stem cell lineage replication error-free. After it was introduced, experimental evidence both in favor and against the hypothesis has been presented. Using a novel methodology that utilizes cancer sequencing data we are able to estimate the rate of accumulation of mutations in healthy stem cells of the colon, blood and head and neck tissues. We find that in these tissues mutations in stem cells accumulate at rates strikingly similar to those expected without the protection from the immortal strand mechanism. Utilizing an approach that is fundamentally different from previous efforts to confirm or refute the immortal strand hypothesis, we provide evidence against non-random segregation of DNA during stem cell replication. Our results strongly suggest that parental DNA is passed randomly to stem cell daughters and provides new insight into the mechanism of DNA replication in stem cells. Copyright © 2015. Published by Elsevier B.V.

  8. Precision engineering: an evolutionary perspective.

    Science.gov (United States)

    Evans, Chris J

    2012-08-28

    Precision engineering is a relatively new name for a technology with roots going back over a thousand years; those roots span astronomy, metrology, fundamental standards, manufacturing and money-making (literally). Throughout that history, precision engineers have created links across disparate disciplines to generate innovative responses to society's needs and wants. This review combines historical and technological perspectives to illuminate precision engineering's current character and directions. It first provides a working definition of precision engineering and then reviews the subject's roots. Examples are given showing the contributions of the technology to society, while simultaneously showing the creative tension between the technological convergence that spurs new directions and the vertical disintegration that optimizes manufacturing economics.

  9. Dissimilarities of reduced density matrices and eigenstate thermalization hypothesis

    Science.gov (United States)

    He, Song; Lin, Feng-Li; Zhang, Jia-ju

    2017-12-01

    We calculate various quantities that characterize the dissimilarity of reduced density matrices for a short interval of length ℓ in a two-dimensional (2D) large central charge conformal field theory (CFT). These quantities include the Rényi entropy, entanglement entropy, relative entropy, Jensen-Shannon divergence, as well as the Schatten 2-norm and 4-norm. We adopt the method of operator product expansion of twist operators, and calculate the short interval expansion of these quantities up to order of ℓ9 for the contributions from the vacuum conformal family. The formal forms of these dissimilarity measures and the derived Fisher information metric from contributions of general operators are also given. As an application of the results, we use these dissimilarity measures to compare the excited and thermal states, and examine the eigenstate thermalization hypothesis (ETH) by showing how they behave in high temperature limit. This would help to understand how ETH in 2D CFT can be defined more precisely. We discuss the possibility that all the dissimilarity measures considered here vanish when comparing the reduced density matrices of an excited state and a generalized Gibbs ensemble thermal state. We also discuss ETH for a microcanonical ensemble thermal state in a 2D large central charge CFT, and find that it is approximately satisfied for a small subsystem and violated for a large subsystem.

  10. Rayleigh's hypothesis and the geometrical optics limit.

    Science.gov (United States)

    Elfouhaily, Tanos; Hahn, Thomas

    2006-09-22

    The Rayleigh hypothesis (RH) is often invoked in the theoretical and numerical treatment of rough surface scattering in order to decouple the analytical form of the scattered field. The hypothesis stipulates that the scattered field away from the surface can be extended down onto the rough surface even though it is formed by solely up-going waves. Traditionally this hypothesis is systematically used to derive the Volterra series under the small perturbation method, which is equivalent to the low-frequency limit. In this Letter we demonstrate that the RH also carries the high-frequency or geometrical optics limit, at least to first order. This finding has never been explicitly derived in the literature. Our result supports the idea that the RH might be an exact solution under some constraints in the general case of random rough surfaces, and not only in the case of small-slope deterministic periodic gratings.

  11. A spatial emergy model for Alachua County, Florida

    Science.gov (United States)

    Lambert, James David

    will be directly comparable with the results of this study. The results and conclusions of this study reinforce the hypothesis that an urban landscape will develop a predictable spatial pattern that can be described in terms of a universal energy transformation hierarchy.

  12. Toward precision medicine in Alzheimer's disease.

    Science.gov (United States)

    Reitz, Christiane

    2016-03-01

    In Western societies, Alzheimer's disease (AD) is the most common form of dementia and the sixth leading cause of death. In recent years, the concept of precision medicine, an approach for disease prevention and treatment that is personalized to an individual's specific pattern of genetic variability, environment and lifestyle factors, has emerged. While for some diseases, in particular select cancers and a few monogenetic disorders such as cystic fibrosis, significant advances in precision medicine have been made over the past years, for most other diseases precision medicine is still in its infancy. To advance the application of precision medicine to a wider spectrum of disorders, governments around the world are starting to launch Precision Medicine Initiatives, major efforts to generate the extensive scientific knowledge needed to integrate the model of precision medicine into everyday clinical practice. In this article we summarize the state of precision medicine in AD, review major obstacles in its development, and discuss its benefits in this highly prevalent, clinically and pathologically complex disease.

  13. FROM PERSONALIZED TO PRECISION MEDICINE

    Directory of Open Access Journals (Sweden)

    K. V. Raskina

    2017-01-01

    Full Text Available The need to maintain a high quality of life against a backdrop of its inevitably increasing duration is one of the main problems of modern health care. The concept of "the right drug to the right patient at the right time", initially termed "personalized" medicine, is now unanimously endorsed by the international scientific community as "precision medicine". Precision medicine takes all individual characteristics into account: genetic diversity, environment, lifestyle, and even bacterial microflora; it also involves the use of the latest technological developments to ensure that each patient receives the care best suited to his or her condition. In the United States, Canada and France, national precision medicine programs have already been presented and implemented. The aim of this review is to describe the ongoing integration of precision medicine methods into routine medical practice and the life of modern society. The description of the new paradigm's prospects is complemented by figures demonstrating the successes already achieved in the application of precision methods, for example, in the targeted therapy of cancer. All in all, the existence of real-life examples confirming this transition, together with the wide and constantly evolving range of available technical and diagnostic capabilities, makes the all-round transition to precision medicine almost inevitable.

  14. Emerging role of Geographical Information System (GIS), Life Cycle Assessment (LCA) and spatial LCA (GIS-LCA) in sustainable bioenergy planning.

    Science.gov (United States)

    Hiloidhari, Moonmoon; Baruah, D C; Singh, Anoop; Kataki, Sampriti; Medhi, Kristina; Kumari, Shilpi; Ramachandra, T V; Jenkins, B M; Thakur, Indu Shekhar

    2017-10-01

    Sustainability of a bioenergy project depends on precise assessment of the biomass resource, planning of cost-effective logistics and evaluation of possible environmental implications. In this context, this paper reviews the role and applications of geo-spatial tools such as Geographical Information System (GIS) for precise agro-residue resource assessment, biomass logistics and power plant design. Further, the application of Life Cycle Assessment (LCA) in understanding the potential impact of agro-residue bioenergy generation on different ecosystem services is also reviewed, and the limitations associated with LCA variability and uncertainty are discussed. The usefulness of integrating GIS into LCA (i.e. spatial LCA) to overcome the limitations of conventional LCA and to produce a holistic evaluation of the environmental benefits and concerns of bioenergy is also reviewed. Application of GIS, LCA and spatial LCA can help alleviate the challenges faced by ambitious bioenergy projects by addressing both economic and environmental goals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Longitudinal study of spatial working memory development in young children.

    Science.gov (United States)

    Tsujii, Takeo; Yamamoto, Eriko; Masuda, Sayako; Watanabe, Shigeru

    2009-05-27

    This study longitudinally compared activity in the frontal cortex during a spatial working memory task between 5-year-old and 7-year-old children using near-infrared spectroscopy. Eight children participated in this study twice, once at 5 years and once at 7 years of age. Behavioral analysis showed that older children performed the working memory task more precisely and more rapidly than younger children. Near-infrared spectroscopy analysis showed that right hemisphere dominance was observed in older children, whereas no hemispheric difference was apparent in younger children. Children with strengthened lateralization showed improved performance from 5 to 7 years. We therefore offer the first demonstration of the developmental changes in frontal cortical activation during spatial working memory tasks during the preschool period.

  16. Sleep deprivation impairs spatial retrieval but not spatial learning in the non-human primate grey mouse lemur.

    Directory of Open Access Journals (Sweden)

    Anisur Rahman

    Full Text Available A large body of studies in rodents and humans suggests that sleep facilitates different phases of the learning and memory process, while sleep deprivation (SD) impairs these processes. Here we tested the hypothesis that SD could alter spatial learning and memory processing in a non-human primate, the grey mouse lemur (Microcebus murinus), which is an interesting model of aging and Alzheimer's disease (AD). Two sets of experiments were performed. In the first set of experiments, we investigated the effects of SD on spatial learning and memory retrieval after one day of training in a circular platform task. Eleven male mouse lemurs aged 2 to 3 years were tested in three different conditions: without SD as a baseline reference, 8 h of SD before the training, and 8 h of SD before the testing. The SD was confirmed by electroencephalographic recordings. Results showed no effect of SD on learning when SD was applied before the training. When SD was applied before the testing, it induced an increase in the number of errors and in the latency to reach the target. In a second set of experiments, we tested the effect of 8 h of SD on spatial memory retrieval after 3 days of training. Twenty male mouse lemurs aged 2 to 3 years were tested in this set of experiments. In this condition, SD did not affect memory retrieval. This is the first study to document the disruptive effects of SD on spatial memory retrieval in this primate, which may serve as a new validated challenge to investigate the effects of new compounds during physiological and pathological aging.

  17. Spatial Economics Model Predicting Transport Volume

    Directory of Open Access Journals (Sweden)

    Lu Bo

    2016-10-01

    Full Text Available It is extremely important to predict logistics requirements in a scientific and rational way. However, in recent years the improvement in prediction methods has not been significant, and traditional statistical prediction methods suffer from low precision and poor interpretability: they can neither guarantee the generalization ability of the prediction model theoretically nor explain the models effectively. Therefore, combining theories from spatial economics, industrial economics, and neo-classical economics, and taking the city of Zhuanghe as the research object, this study identifies the leading industries that generate large volumes of cargo and further predicts the static logistics generation of Zhuanghe and its hinterland. By integrating the various factors that affect regional logistics requirements, the study establishes a logistics requirements potential model based on spatial economic principles, expanding logistics requirements prediction from purely statistical principles to the new area of spatial and regional economics.

  18. How Low Can You Go: Spatial Frequency Sensitivity in Pure Alexia

    DEFF Research Database (Denmark)

    Starrfelt, Randi; Nielsen, S.; Habekost, T.

    Objective: Pure alexia is a seemingly selective deficit in reading, following focal lesions to the posterior left hemisphere. The hallmark feature of pure alexia is a word length effect in single word reading, where reaction times may increase by hundreds of milliseconds per additional letter...... in a word. Other language functions, including writing, are intact. It has been suggested that pure alexia is caused by a general deficit in visual processing, one that affects reading disproportionally compared to other visual stimuli. The most concrete hypothesis to date suggests that pure alexia...... is caused by a lack of sensitivity to particular spatial frequencies (e.g., Fiset et al., 2006), and that this results in the characteristic word length effect, as well as effects of letter confusability on reading times. Participants and Methods: We have tested this hypothesis in a patient with pure alexia...

  19. Dynamic illumination of spatially restricted or large brain volumes via a single tapered optical fiber.

    Science.gov (United States)

    Pisanello, Ferruccio; Mandelbaum, Gil; Pisanello, Marco; Oldenburg, Ian A; Sileo, Leonardo; Markowitz, Jeffrey E; Peterson, Ralph E; Della Patria, Andrea; Haynes, Trevor M; Emara, Mohamed S; Spagnolo, Barbara; Datta, Sandeep Robert; De Vittorio, Massimo; Sabatini, Bernardo L

    2017-08-01

    Optogenetics promises precise spatiotemporal control of neural processes using light. However, the spatial extent of illumination within the brain is difficult to control and cannot be adjusted using standard fiber optics. We demonstrate that optical fibers with tapered tips can be used to illuminate either spatially restricted or large brain volumes. Remotely adjusting the light input angle to the fiber varies the light-emitting portion of the taper over several millimeters without movement of the implant. We use this mode to activate dorsal versus ventral striatum of individual mice and reveal different effects of each manipulation on motor behavior. Conversely, injecting light over the full numerical aperture of the fiber results in light emission from the entire taper surface, achieving broader and more efficient optogenetic activation of neurons, compared to standard flat-faced fiber stimulation. Thus, tapered fibers permit focal or broad illumination that can be precisely and dynamically matched to experimental needs.

  20. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    Science.gov (United States)

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, this likelihood function under the null hypothesis offers an alternative, indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for the assessment of significance. Both methods were applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
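The construction above can be illustrated with a deliberately simplified sketch: 1-D contiguous windows instead of geographic circles, made-up counts, SciPy's `hypergeom` supplying the null-model probability, and a Monte Carlo loop supplying the significance test. This is not the authors' implementation, only the general idea:

```python
import numpy as np
from scipy.stats import hypergeom

rng = np.random.default_rng(0)

def scan_statistic(cases, pops):
    """Score every contiguous window by the hypergeometric log-probability of its
    case count under the null; the least probable window is the candidate cluster."""
    C, N = cases.sum(), pops.sum()
    best, best_win = np.inf, None
    for i in range(len(cases)):
        for j in range(i, len(cases)):
            c, n = cases[i:j + 1].sum(), pops[i:j + 1].sum()
            if n == N:              # skip the window covering the whole region
                continue
            logp = hypergeom.logpmf(c, N, C, n)
            if logp < best:
                best, best_win = logp, (i, j)
    return best, best_win

def monte_carlo_pvalue(cases, pops, n_sim=199):
    """Monte Carlo test: redistribute all cases at random over the units
    (proportionally to population) and rank the observed statistic."""
    obs, win = scan_statistic(cases, pops)
    hits = 1
    for _ in range(n_sim):
        sim = rng.multivariate_hypergeometric(pops, cases.sum())
        if scan_statistic(np.asarray(sim), pops)[0] <= obs:
            hits += 1
    return win, hits / (n_sim + 1)

cases = np.array([2, 1, 2, 1, 20, 22, 1, 2, 1, 2])   # synthetic counts with a cluster
pops = np.array([100] * 10)
print(monte_carlo_pvalue(cases, pops, n_sim=99))     # flags units 4-5 with a small p-value
```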

  1. Calculation of the spatial resolution in two-photon absorption spectroscopy applied to plasma diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Lechuga, M. [Departamento de Física Teórica, Atómica y Óptica, Universidad de Valladolid, 47011-Valladolid (Spain); Laser Processing Group, Instituto de Óptica “Daza de Valdés,” CSIC, 28006-Madrid (Spain); Fuentes, L. M. [Departamento de Física Aplicada, Universidad de Valladolid, 47011-Valladolid (Spain); Grützmacher, K.; Pérez, C., E-mail: concha@opt.uva.es; Rosa, M. I. de la [Departamento de Física Teórica, Atómica y Óptica, Universidad de Valladolid, 47011-Valladolid (Spain)

    2014-10-07

    We report a detailed characterization of the spatial resolution provided by two-photon absorption spectroscopy suited for plasma diagnosis via the 1S-2S transition of atomic hydrogen for optogalvanic detection and laser-induced fluorescence (LIF). A precise knowledge of the spatial resolution is crucial for a correct interpretation of measurements if the plasma parameters to be analysed undergo strong spatial variations. The present study is based on a novel approach which provides a reliable and realistic determination of the spatial resolution. Measured irradiance distributions of the laser beam waists in the overlap volume, provided by a high-resolution UV camera, are employed to solve coupled rate equations accounting for two-photon excitation, fluorescence decay and ionization. The resulting three-dimensional yield distributions reveal in detail the spatial resolution for optogalvanic and LIF detection and the related saturation due to depletion. Two-photon absorption profiles broader than the Fourier-transform-limited laser bandwidth are also incorporated in the calculations. The approach allows an accurate analysis of the spatial resolution present in recent and future measurements.
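The coupled rate equations mentioned in the abstract can be sketched in a dimensionless toy form (illustrative placeholder coefficients, not the authors' model or values): a ground state pumped by two-photon excitation, an excited state that fluoresces back down, and an ionization channel that depletes it.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rates(t, y, w2, a, wi):
    """Toy rate equations: n1 = ground, n2 = excited, ni = ionized population.
    w2 = two-photon excitation rate, a = fluorescence decay rate, wi = ionization
    rate of the excited state (all dimensionless placeholder values)."""
    n1, n2, ni = y
    return [-w2 * n1 + a * n2,
            w2 * n1 - (a + wi) * n2,
            wi * n2]

sol = solve_ivp(rates, (0.0, 5.0), [1.0, 0.0, 0.0], args=(0.5, 1.0, 0.3))
n1, n2, ni = sol.y[:, -1]
print(f"ground {n1:.3f}, excited {n2:.3f}, ionized {ni:.3f}")
```

Integrating such equations point by point over a measured beam-waist irradiance map is what yields the kind of three-dimensional yield distributions, and the saturation due to depletion, described in the abstract.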

  2. Calculation of the spatial resolution in two-photon absorption spectroscopy applied to plasma diagnosis

    International Nuclear Information System (INIS)

    Garcia-Lechuga, M.; Fuentes, L. M.; Grützmacher, K.; Pérez, C.; Rosa, M. I. de la

    2014-01-01

    We report a detailed characterization of the spatial resolution provided by two-photon absorption spectroscopy suited for plasma diagnosis via the 1S-2S transition of atomic hydrogen for optogalvanic detection and laser-induced fluorescence (LIF). A precise knowledge of the spatial resolution is crucial for a correct interpretation of measurements if the plasma parameters to be analysed undergo strong spatial variations. The present study is based on a novel approach which provides a reliable and realistic determination of the spatial resolution. Measured irradiance distributions of the laser beam waists in the overlap volume, provided by a high-resolution UV camera, are employed to solve coupled rate equations accounting for two-photon excitation, fluorescence decay and ionization. The resulting three-dimensional yield distributions reveal in detail the spatial resolution for optogalvanic and LIF detection and the related saturation due to depletion. Two-photon absorption profiles broader than the Fourier-transform-limited laser bandwidth are also incorporated in the calculations. The approach allows an accurate analysis of the spatial resolution present in recent and future measurements.

  3. High precision Cross-correlated imaging in Few-mode fibers

    DEFF Research Database (Denmark)

    Muliar, Olena; Usuga Castaneda, Mario A.; Kristensen, Torben

    2017-01-01

    ...) in a few-mode fiber (FMF) are used as multiple spatial communication channels, comes in this context as a viable approach to enable the optimization of high-capacity links. From this perspective, it becomes highly necessary to possess a diagnostic tool for the precise modal characterization of FMFs. Among existing approaches for modal content analysis, several methods such as S2 and C2 in the time and frequency domain are available. In this contribution we present an improved time-domain cross-correlated (C2) imaging technique for the experimental evaluation of modal properties in HOM fibers over a broad range..., enabling us to distinguish differential time delays between HOMs on the picosecond timescale. Broad wavelength scanning in combination with spectral shaping allows us to estimate the modal behavior of the FMF without prior knowledge of the fiber parameters. We performed our demonstration at wavelengths from...

  4. Temporal Changes in the Spatial Variability of Soil Nutrients

    Energy Technology Data Exchange (ETDEWEB)

    Hoskinson, Reed Louis; Hess, John Richard; Alessi, Randolph Samuel

    1999-07-01

    This paper reports the temporal changes in the spatial variability of soil nutrient concentrations across a field during the growing season, over a four-year period. This study is part of the Site-Specific Technologies for Agriculture (SST4Ag) precision farming research project at the INEEL. Uniform fertilization did not produce a uniform increase in fertility. During the growing season, several of the nutrients and micronutrients showed increases in concentration although no additional fertilization had occurred. Potato plant uptake did not explain all of these changes. Some soil micronutrient concentrations increased above levels considered detrimental to potatoes, but the plants did not show the effects in reduced yield. All the nutrients measured changed between the last sampling in the fall and the first sampling the next spring prior to fertilization. The soil microbial community may play a major role in the temporal changes in the spatial variability of soil nutrient concentrations. These temporal changes suggest potential impact when determining fertilizer recommendations, and when evaluating the results of spatially varying fertilizer application.

  5. Development of sensor guided precision sprayers

    NARCIS (Netherlands)

    Nieuwenhuizen, A.T.; Zande, van de J.C.

    2013-01-01

    Sensor guided precision sprayers were developed to automate the spray process with a focus on emission reduction and identical or increased efficacy, with the precision agriculture concept in mind. Within the project “Innovations2” sensor guided precision sprayers were introduced to leek,

  6. Motor synergies and the equilibrium-point hypothesis.

    Science.gov (United States)

    Latash, Mark L

    2010-07-01

    The article offers a way to unite three recent developments in the field of motor control and coordination: (1) The notion of synergies is introduced based on the principle of motor abundance; (2) The uncontrolled manifold hypothesis is described as offering a computational framework to identify and quantify synergies; and (3) The equilibrium-point hypothesis is described for a single muscle, single joint, and multijoint systems. Merging these concepts into a single coherent scheme requires focusing on control variables rather than performance variables. The principle of minimal final action is formulated as the guiding principle within the referent configuration hypothesis. Motor actions are associated with setting two types of variables by a controller, those that ultimately define average performance patterns and those that define associated synergies. Predictions of the suggested scheme are reviewed, such as the phenomenon of anticipatory synergy adjustments, quick actions without changes in synergies, atypical synergies, and changes in synergies with practice. A few models are briefly reviewed.

  7. Spatial frequency discrimination: visual long-term memory or criterion setting?

    Science.gov (United States)

    Lages, M; Treisman, M

    1998-02-01

    A long-term sensory memory is believed to account for spatial frequency discrimination when reference and test stimuli are separated by long intervals. We test an alternative proposal: that discrimination is determined by the range of test stimuli, through their entrainment of criterion-setting processes. Experiments 1 and 2 show that the 50% point of the psychometric function is largely determined by the midpoint of the stimulus range, not by the reference stimulus. Experiment 3 shows that discrimination of spatial frequencies is similarly affected by orthogonal contextual stimuli and parallel contextual stimuli and that these effects can be explained by criterion-setting processes. These findings support the hypothesis that discrimination over long intervals is explained by the operation of criterion-setting processes rather than by long-term sensory retention of a neural representation of the stimulus.
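The criterion-setting account can be caricatured in a few lines of simulation (a toy model with invented numbers, not the authors' formal criterion-setting theory): if the decision criterion drifts toward the mean of recently presented stimuli, the 50% point of the psychometric function ends up at the midpoint of the test range rather than at the reference.

```python
import numpy as np

rng = np.random.default_rng(2)

def fifty_percent_point(test_stimuli, reference=3.0, noise=0.3, n_trials=4000):
    """Toy observer: judges 'test > reference' by comparing the noisy test value
    to an internal criterion that is slowly entrained by the presented stimuli."""
    criterion = reference
    presented, responses = [], []
    for _ in range(n_trials):
        s = rng.choice(test_stimuli)
        responses.append(s + rng.normal(0.0, noise) > criterion)
        criterion += 0.05 * (s - criterion)   # entrainment toward the range mean
        presented.append(s)
    presented, responses = np.array(presented), np.array(responses)
    props = np.array([responses[presented == s].mean() for s in test_stimuli])
    # the 50% point: the stimulus whose 'higher' proportion is closest to 0.5
    return test_stimuli[int(np.argmin(np.abs(props - 0.5)))]

# A test range centred on 4.0 (above the 3.0 reference) pulls the 50% point
# toward the range midpoint, not the reference:
print(fifty_percent_point(np.linspace(3.5, 4.5, 5)))
```

Under this toy model the printed 50% point lands near the 4.0 range midpoint, which is the qualitative pattern Experiments 1 and 2 report.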

  8. Flight control and landing precision in the nocturnal bee Megalopta is robust to large changes in light intensity

    Directory of Open Access Journals (Sweden)

    Emily eBaird

    2015-10-01

    Full Text Available Like its diurnal relatives, Megalopta genalis uses visual information to control flight. Unlike its diurnal relatives, however, it does this at extremely low light intensities. Although Megalopta has developed optical specialisations to increase visual sensitivity, theoretical studies suggest that this enhanced sensitivity does not enable them to capture enough light to use visual information to reliably control flight in the rainforest at night. It has been proposed that Megalopta gains extra sensitivity by summing visual information over time. While enhancing the reliability of vision, this strategy would decrease the accuracy with which image motion, a crucial cue for flight control, can be detected. Here, we test this temporal summation hypothesis by investigating how Megalopta's flight control and landing precision are affected by light intensity, and compare our findings with the results of similar experiments performed on the diurnal bumblebee Bombus terrestris, to explore the extent to which Megalopta's adaptations to dim light affect their precision. We find that, unlike in Bombus, light intensity does not affect flight and landing precision in Megalopta. Overall, we find little evidence that Megalopta uses a temporal summation strategy in dim light, while we find strong support for the use of this strategy in Bombus.

  9. Precision automation of cell type classification and sub-cellular fluorescence quantification from laser scanning confocal images

    Directory of Open Access Journals (Sweden)

    Hardy Craig Hall

    2016-02-01

    Full Text Available While novel whole-plant phenotyping technologies have been successfully implemented into functional genomics and breeding programs, the potential of automated phenotyping with cellular resolution is largely unexploited. Laser scanning confocal microscopy has the potential to close this gap by providing spatially highly resolved images containing anatomic as well as chemical information on a subcellular basis. However, in the absence of automated methods, the assessment of the spatial patterns and abundance of fluorescent markers with subcellular resolution is still largely qualitative and time-consuming. Recent advances in image acquisition and analysis, coupled with improvements in microprocessor performance, have brought such automated methods within reach, so that information from thousands of cells per image for hundreds of images may be derived in an experimentally convenient time-frame. Here, we present a MATLAB-based analytical pipeline to (1) segment radial plant organs into individual cells, (2) classify cells into cell type categories based upon random forest classification, (3) divide each cell into sub-regions, and (4) quantify fluorescence intensity to a subcellular degree of precision for a separate fluorescence channel. In this research advance, we demonstrate the precision of this analytical process for the relatively complex tissues of Arabidopsis hypocotyls at various stages of development. High speed and robustness make our approach suitable for phenotyping of large collections of stem-like material and other tissue types.
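The classification and quantification steps of such a pipeline can be sketched with scikit-learn instead of MATLAB. Everything below is synthetic and hypothetical: the per-cell features, the two cell-type classes, and the fluorescence values are all invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Synthetic per-cell features [area, elongation, distance-to-axis] for two
# hypothetical cell types (0 and 1) that differ in geometry.
n = 300
labels = rng.integers(0, 2, n)
features = np.column_stack([
    rng.normal(200 + 150 * labels, 30),    # area: larger for type 1
    rng.normal(1.2 + 0.8 * labels, 0.2),   # elongation: higher for type 1
    rng.normal(50, 15, n),                 # uninformative feature
])

# Random-forest classification of cells into cell type categories.
clf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=0)
clf.fit(features, labels)
print(f"out-of-bag accuracy: {clf.oob_score_:.2f}")

# Quantify a separate fluorescence channel per predicted cell type.
fluorescence = rng.normal(10 + 5 * labels, 1.0)
pred = clf.predict(features)
for k in (0, 1):
    print(f"type {k}: mean fluorescence {fluorescence[pred == k].mean():.2f}")
```

The out-of-bag score gives a cheap internal accuracy estimate, which is convenient when hand-labelled cells are scarce.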

  10. An experimental test of the habitat-amount hypothesis for saproxylic beetles in a forested region.

    Science.gov (United States)

    Seibold, Sebastian; Bässler, Claus; Brandl, Roland; Fahrig, Lenore; Förster, Bernhard; Heurich, Marco; Hothorn, Torsten; Scheipl, Fabian; Thorn, Simon; Müller, Jörg

    2017-06-01

    The habitat-amount hypothesis challenges traditional concepts that explain species richness within habitats, such as the habitat-patch hypothesis, where species number is a function of patch size and patch isolation. It posits that effects of patch size and patch isolation are driven by effects of sample area, and thus that the number of species at a site is basically a function of the total habitat amount surrounding this site. We tested the habitat-amount hypothesis for saproxylic beetles and their habitat of dead wood by using an experiment comprising 190 plots with manipulated patch sizes situated in a forested region with a high variation in habitat amount (i.e., density of dead trees in the surrounding landscape). Although dead wood is a spatio-temporally dynamic habitat, saproxylic insects have life cycles shorter than the time needed for habitat turnover and they closely track their resource. Patch size was manipulated by adding various amounts of downed dead wood to the plots (~800 m³ in total); dead trees in the surrounding landscape (~240 km²) were identified using airborne laser scanning (light detection and ranging). Over 3 yr, 477 saproxylic species (101,416 individuals) were recorded. Considering 20-1,000 m radii around the patches, local landscapes were identified as having a radius of 40-120 m. Both patch size and habitat amount in the local landscapes independently affected species numbers without a significant interaction effect, hence refuting the island effect. Species accumulation curves relative to cumulative patch size were not consistent with either the habitat-patch hypothesis or the habitat-amount hypothesis: several small dead-wood patches held more species than a single large patch with an amount of dead wood equal to the sum of that of the small patches. Our results indicate that conservation of saproxylic beetles in forested regions should primarily focus on increasing the overall amount of dead wood without considering its

  11. A Comparative Study of Precise Point Positioning (PPP) Accuracy Using Online Services

    Directory of Open Access Journals (Sweden)

    Malinowski Marcin

    2016-12-01

    Full Text Available Precise Point Positioning (PPP) is a technique used to determine the position of a receiver antenna without communication with a reference station. It may be an alternative to differential measurements, where maintaining a connection with a single RTK station or a regional network of reference stations (RTN) is necessary. This situation is especially common in areas with a poorly developed infrastructure of ground stations. Much of the research conducted so far on the PPP technique has concerned the processing of entire-day observation sessions. This paper, however, presents the results of a comparative analysis of the accuracy of absolute position determination from observations lasting between 1 and 7 hours, using four permanent services that perform calculations with the PPP technique: Automatic Precise Positioning Service (APPS), Canadian Spatial Reference System Precise Point Positioning (CSRS-PPP), GNSS Analysis and Positioning Software (GAPS) and magicPPP - Precise Point Positioning Solution (magicGNSS). On the basis of the acquired measurement results, it can be concluded that sessions of at least two hours allow an absolute position to be obtained with an accuracy of 2-4 cm. An evaluation was also conducted of the impact of the simultaneous positioning of the three points of a test network on the determined horizontal distances and relative height differences between the measured triangle vertices. Distances and relative height differences between the points of the triangular test network measured with a Leica TDRA6000 laser station were adopted as references. The analyses show that measurement sessions of at least two hours can be used to determine the horizontal distance or the height difference with an accuracy of 1-2 cm. Rapid products employed in calculations conducted with the PPP technique reached an accuracy of coordinate determination close to that of elaborations which employ

  12. Tests of the Giant Impact Hypothesis

    Science.gov (United States)

    Jones, J. H.

    1998-01-01

    The giant impact hypothesis has gained popularity as a means of explaining a volatile-depleted Moon that still has a chemical affinity to the Earth. As Taylor's Axiom decrees, the best models of lunar origin are testable, but this is difficult with the giant impact model. The energy associated with the impact would be sufficient to totally melt and partially vaporize the Earth, and this means that there should be no geological vestige of earlier times. Accordingly, it is important to devise tests that may be used to evaluate the giant impact hypothesis. Three such tests are discussed here. None of these is supportive of the giant impact model, but neither do they disprove it.

  13. Intercomparison of XH2O Data from the GOSAT TANSO-FTS (TIR and SWIR) and Ground-Based FTS Measurements: Impact of the Spatial Variability of XH2O on the Intercomparison

    Directory of Open Access Journals (Sweden)

    Hirofumi Ohyama

    2017-01-01

    Full Text Available Spatial and temporal variability of atmospheric water vapor (H2O) is extremely high, and therefore it is difficult to accurately evaluate the measurement precision of H2O data by a simple comparison between the data derived from two different instruments. We determined the measurement precisions of column-averaged dry-air mole fractions of H2O (XH2O) retrieved independently from spectral radiances in the thermal infrared (TIR) and the short-wavelength infrared (SWIR) regions measured using a Thermal And Near-infrared Sensor for carbon Observation-Fourier Transform Spectrometer (TANSO-FTS) onboard the Greenhouse gases Observing SATellite (GOSAT), by an intercomparison between the two TANSO-FTS XH2O data products and the ground-based FTS XH2O data. Furthermore, the spatial variability of XH2O was also estimated in the intercomparison process. Mutually coincident XH2O data above land for the period from April 2009 to May 2014 were intercompared using different spatial coincidence criteria. We found that the precisions of the TANSO-FTS TIR and TANSO-FTS SWIR XH2O were 7.3%–7.7% and 3.5%–4.5%, respectively, and that the spatial variability of XH2O was 6.7% within a radius of 50 km and 18.5% within a radius of 200 km. These results demonstrate that, in order to accurately evaluate the measurement precision of XH2O, it is necessary to set more rigorous spatial coincidence criteria or to take into account the spatial variability of XH2O as derived in the present study.
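The reason a simple paired comparison overstates the measurement error is that the variance of the paired differences folds in the spatial-mismatch term as well. A one-line error budget (a sketch using the percentage figures quoted in the abstract, assuming the three error sources are independent) makes this explicit:

```python
import math

def paired_difference_sd(prec_a, prec_b, spatial_sd):
    """Expected standard deviation of paired differences between two instruments:
    the two measurement precisions and the spatial-variability term add in quadrature."""
    return math.sqrt(prec_a**2 + prec_b**2 + spatial_sd**2)

# TIR-like 7.5%, SWIR-like 4.0% precision, 6.7% spatial variability within 50 km:
print(f"{paired_difference_sd(7.5, 4.0, 6.7):.1f}%")  # noticeably larger than either precision alone
```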

  14. Spatially explicit modeling in ecology: A review

    Science.gov (United States)

    DeAngelis, Donald L.; Yurek, Simeon

    2017-01-01

    The use of spatially explicit models (SEMs) in ecology has grown enormously in the past two decades. One major advancement has been that fine-scale details of landscapes, and of spatially dependent biological processes, such as dispersal and invasion, can now be simulated with great precision, due to improvements in computer technology. Many areas of modeling have shifted toward a focus on capturing these fine-scale details, to improve mechanistic understanding of ecosystems. However, spatially implicit models (SIMs) have played a dominant role in ecology, and arguments have been made that SIMs, which account for the effects of space without specifying spatial positions, have an advantage of being simpler and more broadly applicable, perhaps contributing more to understanding. We address this debate by comparing SEMs and SIMs in examples from the past few decades of modeling research. We argue that, although SIMs have been the dominant approach in the incorporation of space in theoretical ecology, SEMs have unique advantages for addressing pragmatic questions concerning species populations or communities in specific places, because local conditions, such as spatial heterogeneities, organism behaviors, and other contingencies, produce dynamics and patterns that usually cannot be incorporated into simpler SIMs. SEMs are also able to describe mechanisms at the local scale that can create amplifying positive feedbacks at that scale, creating emergent patterns at larger scales, and therefore are important to basic ecological theory. We review the use of SEMs at the level of populations, interacting populations, food webs, and ecosystems and argue that SEMs are not only essential in pragmatic issues, but must play a role in the understanding of causal relationships on landscapes.

  15. The Spatial Politics of Spatial Representation

    DEFF Research Database (Denmark)

    Olesen, Kristian; Richardson, Tim

    2011-01-01

    This paper explores the interplay between the spatial politics of new governance landscapes and innovations in the use of spatial representations in planning. The central premise is that planning experiments with new relational approaches become enmeshed in spatial politics. The case of strategic spatial planning in Denmark reveals how fuzzy spatial representations and relational spatial concepts are being used to depoliticise strategic spatial planning processes and to camouflage spatial politics. The paper concludes that, while relational geography might play an important role in building...

  16. Testing the null hypothesis: the forgotten legacy of Karl Popper?

    Science.gov (United States)

    Wilkinson, Mick

    2013-01-01

    Testing of the null hypothesis is a fundamental aspect of the scientific method and has its basis in the falsification theory of Karl Popper. Null hypothesis testing makes use of deductive reasoning to ensure that the truth of conclusions is irrefutable. In contrast, attempting to demonstrate new facts on the basis of testing the experimental or research hypothesis makes use of inductive reasoning and is prone to the problem of the Uniformity of Nature assumption described by David Hume in the eighteenth century. Despite this issue and the well-documented solution provided by Popper's falsification theory, the majority of publications are still written such that they suggest the research hypothesis is being tested. This is contrary to accepted scientific convention and possibly highlights a poor understanding of the application of conventional significance-based data analysis approaches. Our work should remain driven by conjecture and attempted falsification such that it is always the null hypothesis that is tested. The write-up of our studies should make it clear that we are indeed testing the null hypothesis and conforming to the established and accepted philosophical conventions of the scientific method.
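The convention defended here, always testing the null hypothesis, can be made concrete with a small permutation test, which evaluates H0 directly by resampling. The two samples below are fabricated purely for illustration.

```python
import random

random.seed(1)

# Two small hypothetical samples (units arbitrary). Per the paper's argument,
# the question posed to the data is a test of the null hypothesis that both
# samples are draws from one and the same distribution.
control = [52.1, 50.3, 53.8, 51.9, 52.7, 50.8, 53.1, 52.4]
treated = [49.2, 48.8, 50.1, 47.9, 49.5, 48.3, 50.4, 49.0]

def perm_test(a, b, n_iter=10000):
    """Two-sided permutation test of H0: no difference in group means."""
    mean = lambda s: sum(s) / len(s)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    hits = 0
    for _ in range(n_iter):
        random.shuffle(pooled)
        # Count relabellings at least as extreme as the observed split
        if abs(mean(pooled[:len(a)]) - mean(pooled[len(a):])) >= observed:
            hits += 1
    return hits / n_iter

p = perm_test(control, treated)
print("p-value under H0:", p)
```

A small p leads to rejection of the null hypothesis, which is the only conclusion the deductive framing licenses; the research hypothesis itself is never "confirmed", only left standing.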

  17. Clonality, genetic diversity and support for the diversifying selection hypothesis in natural populations of a flower-living yeast.

    Science.gov (United States)

    Herrera, C M; Pozo, M I; Bazaga, P

    2011-11-01

    Vast amounts of effort have been devoted to investigate patterns of genetic diversity and structuring in plants and animals, but similar information is scarce for organisms of other kingdoms. The study of the genetic structure of natural populations of wild yeasts can provide insights into the ecological and genetic correlates of clonality, and into the generality of recent hypotheses postulating that microbial populations lack the potential for genetic divergence and allopatric speciation. Ninety-one isolates of the flower-living yeast Metschnikowia gruessii from southeastern Spain were DNA fingerprinted using amplified fragment length polymorphism (AFLP) markers. Genetic diversity and structuring were investigated with band-based methods and model- and nonmodel-based clustering. Linkage disequilibrium tests were used to assess reproduction mode. Microsite-dependent, diversifying selection was tested by comparing genetic characteristics of isolates from bumble bee vectors and different floral microsites. AFLP polymorphism (91%) and genotypic diversity were very high. Genetic diversity was spatially structured, as shown by AMOVA (Φst = 0.155) and clustering. The null hypothesis of random mating was rejected, with clonality appearing to be the prevailing reproductive mode in the populations studied. Genetic diversity of isolates declined from bumble bee mouthparts to floral microsites, and frequency of five AFLP markers varied significantly across floral microsites, thus supporting the hypothesis of diversifying selection on clonal lineages. Wild populations of clonal fungal microbes can exhibit levels of genetic diversity and spatial structuring that are not singularly different from those shown by sexually reproducing plants or animals. Microsite-dependent, divergent selection can maintain high local and regional genetic diversity in microbial populations despite extensive clonality. © 2011 Blackwell Publishing Ltd.

  18. Mismatch or cumulative stress : Toward an integrated hypothesis of programming effects

    NARCIS (Netherlands)

    Nederhof, Esther; Schmidt, Mathias V.

    2012-01-01

    This paper integrates the cumulative stress hypothesis with the mismatch hypothesis, taking into account individual differences in sensitivity to programming. According to the cumulative stress hypothesis, individuals are more likely to suffer from disease as adversity accumulates. According to the

  19. What is precision medicine?

    Science.gov (United States)

    König, Inke R; Fuchs, Oliver; Hansen, Gesine; von Mutius, Erika; Kopp, Matthias V

    2017-10-01

    The term "precision medicine" has become very popular over recent years, fuelled by scientific as well as political perspectives. Despite its popularity, its exact meaning, and how it is different from other popular terms such as "stratified medicine", "targeted therapy" or "deep phenotyping", remain unclear. Commonly applied definitions focus on the stratification of patients, sometimes referred to as a novel taxonomy, and this is derived using large-scale data including clinical, lifestyle, genetic and further biomarker information, thus going beyond the classical "signs-and-symptoms" approach. While these aspects are relevant, this description leaves open a number of questions. For example, when does precision medicine begin? In which way does the stratification of patients translate into better healthcare? And can precision medicine be viewed as the end-point of a novel stratification of patients, as implied, or is it rather a greater whole? To clarify this, the aim of this paper is to provide a more comprehensive definition that focuses on precision medicine as a process. It will be shown that this proposed framework incorporates the derivation of novel taxonomies and their role in healthcare as part of the cycle, but also covers related terms. Copyright ©ERS 2017.

  20. The effect of unemployment, aggregate wages, and spatial contiguity on local wages: An investigation with German district level data

    OpenAIRE

    Thiess Buettner

    1999-01-01

    Despite spatial rigidity of collectively negotiated wages the local unemployment rate is found to have a significant negative impact on wages. This impact is shown to be consistent with both the wage-curve hypothesis and modern Phillips-curve modelling. Spatial contiguity effects are found in wages and unemployment and their neglect leads to an underestimation of the effect of local unemployment. Yet, the impact of local unemployment on wages turns out to be quite low as compared to studies f...

  1. Environmental policy without costs? A review of the Porter hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Braennlund, Runar; Lundgren, Tommy. e-mail: runar.brannlund@econ.umu.se

    2009-03-15

    This paper reviews the theoretical and empirical literature connected to the so-called Porter hypothesis, that is, the literature connected to the discussion about the relation between environmental policy and competitiveness. According to conventional wisdom, environmental policy aiming to improve the environment through, for example, emission reductions does imply costs, since scarce resources must be diverted from somewhere else. However, this conventional wisdom has recently been challenged and questioned through what has been denoted the 'Porter hypothesis'. Those in the forefront of the Porter hypothesis challenge the conventional wisdom basically on the grounds that resources are used inefficiently in the absence of the right kind of environmental regulations, and that the conventional neo-classical view is too static to take inefficiencies into account. The conclusions that can be made from this review are (1) that the theoretical literature can identify the circumstances and mechanisms that must exist for a Porter effect to occur, (2) that these circumstances are rather non-general, hence rejecting the Porter hypothesis in general, and (3) that the empirical literature gives no general support for the Porter hypothesis. Furthermore, a closer look at the 'Swedish case' reveals no support for the Porter hypothesis, in spite of the fact that Swedish environmental policy over the last 15-20 years seems to be in line with the prerequisites stated by the Porter hypothesis concerning environmental policy.

  2. Precision Medicine in Cancer Treatment

    Science.gov (United States)

    Precision medicine helps doctors select cancer treatments that are most likely to help patients based on a genetic understanding of their disease. Learn about the promise of precision medicine and the role it plays in cancer treatment.

  3. Precision electron polarimetry

    International Nuclear Information System (INIS)

    Chudakov, E.

    2013-01-01

    A new generation of precise Parity-Violating experiments will require a sub-percent accuracy of electron beam polarimetry. Compton polarimetry can provide such accuracy at high energies, but at a few hundred MeV the small analyzing power limits the sensitivity. Møller polarimetry provides a high analyzing power independent of the beam energy, but is limited by the properties of the polarized targets commonly used. Options for precision polarimetry at 300 MeV will be discussed, in particular a proposal to use ultra-cold atomic hydrogen traps to provide a 100%-polarized electron target for Møller polarimetry.
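The energy independence mentioned above follows from the tree-level Møller analyzing power depending only on the centre-of-mass scattering angle. As a supporting sketch, the standard textbook formula (not quoted in the abstract) can be evaluated directly, showing the maximal magnitude A_zz = -7/9 at 90 degrees:

```python
import math

def moller_azz(theta_cm_deg):
    """Tree-level longitudinal analyzing power of Moller scattering.

    Depends only on the centre-of-mass angle, not on the beam energy;
    the magnitude is maximal at theta_cm = 90 degrees, where it is -7/9.
    """
    c = math.cos(math.radians(theta_cm_deg))
    s2 = 1.0 - c * c
    return -s2 * (7.0 + c * c) / (3.0 + c * c) ** 2

print(moller_azz(90))  # maximal magnitude, -7/9
print(moller_azz(45))  # smaller magnitude away from 90 degrees
```

Because this kinematic factor is fixed, the precision of a Møller measurement is dominated by how well the target polarization is known, which is exactly the limitation the proposed atomic hydrogen target addresses.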

  4. Precision Medicine in Gastrointestinal Pathology.

    Science.gov (United States)

    Wang, David H; Park, Jason Y

    2016-05-01

    Context: Precision medicine is the promise of individualized therapy and management of patients based on their personal biology. There are now multiple global initiatives to perform whole-genome sequencing on millions of individuals. In the United States, an early program was the Million Veteran Program, and a more recent proposal in 2015 by the president of the United States is the Precision Medicine Initiative. To implement precision medicine in routine oncology care, genetic variants present in tumors need to be matched with effective clinical therapeutics. When we focus on the current state of precision medicine for gastrointestinal malignancies, it becomes apparent that there is a mixed history of success and failure. Objective: To present the current state of precision medicine using gastrointestinal oncology as a model. We will present currently available targeted therapeutics, promising new findings in clinical genomic oncology, remaining quality issues in genomic testing, and emerging oncology clinical trial designs. Data Sources: Review of the literature including clinical genomic studies on gastrointestinal malignancies, clinical oncology trials on therapeutics targeted to molecular alterations, and emerging clinical oncology study designs. Conclusions: Translating our ability to sequence thousands of genes into meaningful improvements in patient survival will be the challenge for the next decade.

  5. Modulation of microsaccades by spatial frequency during object categorization.

    Science.gov (United States)

    Craddock, Matt; Oppermann, Frank; Müller, Matthias M; Martinovic, Jasna

    2017-01-01

    The organization of visual processing into a coarse-to-fine information processing based on the spatial frequency properties of the input forms an important facet of the object recognition process. During visual object categorization tasks, microsaccades occur frequently. One potential functional role of these eye movements is to resolve high spatial frequency information. To assess this hypothesis, we examined the rate, amplitude and speed of microsaccades in an object categorization task in which participants viewed object and non-object images and classified them as showing either natural objects, man-made objects or non-objects. Images were presented unfiltered (broadband; BB) or filtered to contain only low (LSF) or high spatial frequency (HSF) information. This allowed us to examine whether microsaccades were modulated independently by the presence of a high-level feature - the presence of an object - and by low-level stimulus characteristics - spatial frequency. We found a bimodal distribution of saccades based on their amplitude, with a split between smaller and larger microsaccades at 0.4° of visual angle. The rate of larger saccades (⩾0.4°) was higher for objects than non-objects, and higher for objects with high spatial frequency content (HSF and BB objects) than for LSF objects. No effects were observed for smaller microsaccades (<0.4°). This is consistent with a role for larger microsaccades in resolving HSF information for object identification, and previous evidence that more microsaccades are directed towards informative image regions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Spatial But Not Oculomotor Information Biases Perceptual Memory: Evidence From Face Perception and Cognitive Modeling.

    Science.gov (United States)

    Wantz, Andrea L; Lobmaier, Janek S; Mast, Fred W; Senn, Walter

    2017-08-01

    Recent research put forward the hypothesis that eye movements are integrated in memory representations and are reactivated when later recalled. However, "looking back to nothing" during recall might be a consequence of spatial memory retrieval. Here, we aimed at distinguishing between the effect of spatial and oculomotor information on perceptual memory. Participants' task was to judge whether a morph looked rather like the first or second previously presented face. Crucially, faces and morphs were presented in a way that the morph reactivated oculomotor and/or spatial information associated with one of the previously encoded faces. Perceptual face memory was largely influenced by these manipulations. We considered a simple computational model with an excellent match (4.3% error) that expresses these biases as a linear combination of recency, saccade, and location. Surprisingly, saccades did not play a role. The results suggest that spatial and temporal rather than oculomotor information biases perceptual face memory. Copyright © 2016 Cognitive Science Society, Inc.
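The "simple computational model" is described only as a linear combination of recency, saccade, and location, so a hedged sketch is possible: fit such a model to bias scores by ordinary least squares. The predictor coding, weights, noise level, and trial count below are all invented; the true weight on the saccade predictor is set to zero to mirror the reported finding that saccades did not play a role.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200  # hypothetical number of trials

# Per-trial predictors coded +/-1: which face is more recent, whether the
# morph reactivates that face's saccade, and whether it reuses its location.
X = rng.choice([-1.0, 1.0], size=(n, 3))

# Simulated response bias toward one face; the zero saccade weight is an
# illustrative stand-in for the paper's null saccade effect.
true_w = np.array([0.30, 0.00, 0.18])
y = X @ true_w + rng.normal(0, 0.05, n)

# Ordinary least squares recovers the three weights of the linear model
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["recency", "saccade", "location"], w.round(3))))
```

With orthogonal, balanced predictors like these, each recovered weight estimates one factor's independent contribution to the bias, which is how a near-zero saccade weight can be read off directly.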

  7. Three-dimensional precise orientation of bilateral auricular trial prosthesis using a facebow for a young adult with Crouzon syndrome.

    Science.gov (United States)

    Rathee, Manu; Tamrakar, Amit Kumar; Kundu, Renu; Yunus, Nadeem

    2014-08-05

    Facial deformity can be debilitating, especially in the psychological and cosmetic aspects. Although surgical correction or replacement of deformed or missing parts is the ideal treatment, prosthetic replacement serves the purpose in case of surgical limitations. Prosthetic rehabilitation of a missing auricle is an acceptable option as it provides better control over the tortuous anatomical shape and shade of the missing portion. Improper spatial orientation of the prosthetic ear on the face can damage the results of even the most aesthetic prosthesis. This case report describes a simple and innovative method for precise spatial orientation of auricular trial prosthesis using a facebow and custom-made adjustable mechanical retention design using stainless steel wire. 2014 BMJ Publishing Group Ltd.

  8. [Progress in precision medicine: a scientific perspective].

    Science.gov (United States)

    Wang, B; Li, L M

    2017-01-10

    Precision medicine is a new strategy for disease prevention and treatment by taking into account differences in genetics, environment and lifestyles among individuals and making precise diseases classification and diagnosis, which can provide patients with personalized, targeted prevention and treatment. Large-scale population cohort studies are fundamental for precision medicine research, and could produce best evidence for precision medicine practices. Current criticisms on precision medicine mainly focus on the very small proportion of benefited patients, the neglect of social determinants for health, and the possible waste of limited medical resources. In spite of this, precision medicine is still a most hopeful research area, and would become a health care practice model in the future.

  9. Modeling and control of precision actuators

    CERN Document Server

    Kiong, Tan Kok

    2013-01-01

    Introduction; Growing Interest in Precise Actuators; Types of Precise Actuators; Applications of Precise Actuators; Nonlinear Dynamics and Modeling; Hysteresis; Creep; Friction; Force Ripples; Identification and Compensation of Preisach Hysteresis in Piezoelectric Actuators; SVD-Based Identification and Compensation of Preisach Hysteresis; High-Bandwidth Identification and Compensation of Hysteretic Dynamics in Piezoelectric Actuators; Concluding Remarks; Identification and Compensation of Frict

  10. Array-Enhanced Coherence Resonance: Nontrivial Effects of Heterogeneity and Spatial Independence of Noise

    International Nuclear Information System (INIS)

    Zhou, Changsong; Kurths, Juergen; Hu, Bambi

    2001-01-01

    We demonstrate the effect of coherence resonance in a heterogeneous array of coupled FitzHugh-Nagumo neurons. It is shown that coupling of such elements leads to a significantly stronger coherence compared to that of a single element. We report nontrivial effects of parameter heterogeneity and spatial independence of noise on array-enhanced coherence resonance; especially, we find that (i) the coherence increases as spatial correlation of the noise decreases, and (ii) inhomogeneity in the parameters of the array enhances the coherence. Our results have the implication that generic heterogeneity and background noise can play a constructive role in enhancing the time precision of firing in neural systems.

  11. Temporal, spatial and ecological dynamics of speciation among amphi-Beringian small mammals

    Science.gov (United States)

    Hope, Andrew G.; Takebayashi, Naoki; Galbreath, Kurt E.; Talbot, Sandra L.; Cook, Joseph A.

    2013-01-01

    Quaternary climate cycles played an important role in promoting diversification across the Northern Hemisphere, although details of the mechanisms driving evolutionary change are still poorly resolved. In a comparative phylogeographical framework, we investigate temporal, spatial and ecological components of evolution within a suite of Holarctic small mammals. We test a hypothesis of simultaneous divergence among multiple taxon pairs, investigating time to coalescence and demographic change for each taxon in response to a combination of climate and geography.

  12. Temporal precision and the capacity of auditory-verbal short-term memory.

    Science.gov (United States)

    Gilbert, Rebecca A; Hitch, Graham J; Hartley, Tom

    2017-12-01

    The capacity of serially ordered auditory-verbal short-term memory (AVSTM) is sensitive to the timing of the material to be stored, and both temporal processing and AVSTM capacity are implicated in the development of language. We developed a novel "rehearsal-probe" task to investigate the relationship between temporal precision and the capacity to remember serial order. Participants listened to a sub-span sequence of spoken digits and silently rehearsed the items and their timing during an unfilled retention interval. After an unpredictable delay, a tone prompted report of the item being rehearsed at that moment. An initial experiment showed cyclic distributions of item responses over time, with peaks preserving serial order and broad, overlapping tails. The spread of the response distributions increased with additional memory load and correlated negatively with participants' auditory digit spans. A second study replicated the negative correlation and demonstrated its specificity to AVSTM by controlling for differences in visuo-spatial STM and nonverbal IQ. The results are consistent with the idea that a common resource underpins both the temporal precision and capacity of AVSTM. The rehearsal-probe task may provide a valuable tool for investigating links between temporal processing and AVSTM capacity in the context of speech and language abilities.

  13. Implications of the Bohm-Aharonov hypothesis

    International Nuclear Information System (INIS)

    Ghirardi, G.C.; Rimini, A.; Weber, T.

    1976-01-01

    It is proved that the Bohm-Aharonov hypothesis concerning largely separated subsystems of composite quantum systems implies that it is impossible to express the dynamical evolution in terms of the density operator.

  14. Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories

    International Nuclear Information System (INIS)

    Wells, James

    2015-01-01

    The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics. This is to be done in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations will be presented to the community of Higgs boson observables that detail just how well various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at LHC, and discern viable theories of physics beyond the Standard Model that unify physics to a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson. In addition, if data becomes incompatible with the Standard Model's low-energy effective lagrangian, new physics theories will be developed that explain the anomaly and put it into a more

  15. Precision wildlife medicine: applications of the human-centred precision medicine revolution to species conservation.

    Science.gov (United States)

    Whilde, Jenny; Martindale, Mark Q; Duffy, David J

    2017-05-01

    The current species extinction crisis is being exacerbated by an increased rate of emergence of epizootic disease. Human-induced factors including habitat degradation, loss of biodiversity and wildlife population reductions resulting in reduced genetic variation are accelerating disease emergence. Novel, efficient and effective approaches are required to combat these epizootic events. Here, we present the case for the application of human precision medicine approaches to wildlife medicine in order to enhance species conservation efforts. We consider how the precision medicine revolution, coupled with the advances made in genomics, may provide a powerful and feasible approach to identifying and treating wildlife diseases in a targeted, effective and streamlined manner. A number of case studies of threatened species are presented which demonstrate the applicability of precision medicine to wildlife conservation, including sea turtles, amphibians and Tasmanian devils. These examples show how species conservation could be improved by using precision medicine techniques to determine novel treatments and management strategies for the specific medical conditions hampering efforts to restore population levels. Additionally, a precision medicine approach to wildlife health has in turn the potential to provide deeper insights into human health and the possibility of stemming and alleviating the impacts of zoonotic diseases. 
The integration of the currently emerging Precision Medicine Initiative with the concepts of EcoHealth (aiming for sustainable health of people, animals and ecosystems through transdisciplinary action research) and One Health (recognizing the intimate connection of humans, animal and ecosystem health and addressing a wide range of risks at the animal-human-ecosystem interface through a coordinated, collaborative, interdisciplinary approach) has great potential to deliver a deeper and broader interdisciplinary-based understanding of both wildlife and human

  16. Almost-Quantum Correlations Violate the No-Restriction Hypothesis.

    Science.gov (United States)

    Sainz, Ana Belén; Guryanova, Yelena; Acín, Antonio; Navascués, Miguel

    2018-05-18

    To identify which principles characterize quantum correlations, it is essential to understand in which sense this set of correlations differs from that of almost-quantum correlations. We solve this problem by invoking the so-called no-restriction hypothesis, an explicit and natural axiom in many reconstructions of quantum theory stating that the set of possible measurements is the dual of the set of states. We prove that, contrary to quantum correlations, no generalized probabilistic theory satisfying the no-restriction hypothesis is able to reproduce the set of almost-quantum correlations. Therefore, any theory whose correlations are exactly, or very close to, the almost-quantum correlations necessarily requires a rule limiting the possible measurements. Our results suggest that the no-restriction hypothesis may play a fundamental role in singling out the set of quantum correlations among other nonsignaling ones.

  17. Nanomaterials for Cancer Precision Medicine.

    Science.gov (United States)

    Wang, Yilong; Sun, Shuyang; Zhang, Zhiyuan; Shi, Donglu

    2018-04-01

    Medical science has recently advanced to the point where diagnosis and therapeutics can be carried out with high precision, even at the molecular level. A new field of "precision medicine" has consequently emerged with specific clinical implications and challenges that can be well-addressed by newly developed nanomaterials. Here, a nanoscience approach to precision medicine is provided, with a focus on cancer therapy, based on a new concept of "molecularly-defined cancers." "Next-generation sequencing" is introduced to identify the oncogene that is responsible for a class of cancers. This new approach is fundamentally different from all conventional cancer therapies that rely on diagnosis of the anatomic origins where the tumors are found. To treat cancers at molecular level, a recently developed "microRNA replacement therapy" is applied, utilizing nanocarriers, in order to regulate the driver oncogene, which is the core of cancer precision therapeutics. Furthermore, the outcome of the nanomediated oncogenic regulation has to be accurately assessed by the genetically characterized, patient-derived xenograft models. Cancer therapy in this fashion is a quintessential example of precision medicine, presenting many challenges to the materials communities with new issues in structural design, surface functionalization, gene/drug storage and delivery, cell targeting, and medical imaging. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Dissecting the Gravitational Lens B1608+656. II. Precision Measurements of the Hubble Constant, Spatial Curvature, and the Dark Energy Equation of State

    Energy Technology Data Exchange (ETDEWEB)

    Suyu, S.H.; /Argelander Inst. Astron.; Marshall, P.J.; /KIPAC, Menlo Park /UC, Santa Barbara; Auger, M.W.; /UC, Santa Barbara /UC, Davis; Hilbert, S.; /Argelander Inst. Astron. /Garching, Max Planck Inst.; Blandford, R.D.; /KIPAC, Menlo Park; Koopmans, L.V.E.; /Kapteyn Astron. Inst., Groningen; Fassnacht, C.D.; /UC, Davis; Treu, T.; /UC, Santa Barbara

    2009-12-11

    between Ω_m and Ω_Λ at w = -1 and constrains the curvature parameter to be -0.031 < Ω_k < 0.009 (95% CL), a level of precision comparable to that afforded by the current Type Ia SNe sample. Asserting a flat spatial geometry, we find that, in combination with WMAP, H_0 = 69.7 +4.9/-5.0 km s⁻¹ Mpc⁻¹ and w = -0.94 +0.17/-0.19 (68% CL), suggesting that the observations of B1608+656 constrain w as tightly as do the current Baryon Acoustic Oscillation data.

  19. Assess the Critical Period Hypothesis in Second Language Acquisition

    Science.gov (United States)

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  20. An aberrant precision account of autism.

    Directory of Open Access Journals (Sweden)

    Rebecca P Lawson

    2014-05-01

    Full Text Available Autism is a neurodevelopmental disorder characterised by problems with social-communication, restricted interests and repetitive behaviour. A recent and controversial article presented a compelling normative explanation for the perceptual symptoms of autism in terms of a failure of Bayesian inference (Pellicano and Burr, 2012). In response, we suggested that when Bayesian inference is grounded in its neural instantiation, namely predictive coding, many features of autistic perception can be attributed to aberrant precision (or beliefs about precision) within the context of hierarchical message passing in the brain (Friston et al., 2013). Here, we unpack the aberrant precision account of autism. Specifically, we consider how empirical findings, which speak directly or indirectly to neurobiological mechanisms, are consistent with the aberrant encoding of precision in autism; in particular, an imbalance of the precision ascribed to sensory evidence relative to prior beliefs.

  1. NCI and the Precision Medicine Initiative®

    Science.gov (United States)

    NCI's activities related to precision medicine focus on new and expanded precision medicine clinical trials; mechanisms to overcome drug resistance to cancer treatments; and the development of a shared digital repository of precision medicine trials data.

  2. Effects of verbal and nonverbal interference on spatial and object visual working memory.

    Science.gov (United States)

    Postle, Bradley R; D'Esposito, Mark; Corkin, Suzanne

    2005-03-01

    We tested the hypothesis that a verbal coding mechanism is necessarily engaged by object, but not spatial, visual working memory tasks. We employed a dual-task procedure that paired n-back working memory tasks with domain-specific distractor trials inserted into each interstimulus interval of the n-back tasks. In two experiments, object n-back performance demonstrated greater sensitivity to verbal distraction, whereas spatial n-back performance demonstrated greater sensitivity to motion distraction. Visual object and spatial working memory may differ fundamentally in that the mnemonic representation of featural characteristics of objects incorporates a verbal (perhaps semantic) code, whereas the mnemonic representation of the location of objects does not. Thus, the processes supporting working memory for these two types of information may differ in more ways than those dictated by the "what/where" organization of the visual system, a fact more easily reconciled with a component process than a memory systems account of working memory function.

  3. Rapid spatial equilibration of a particle in a box.

    Science.gov (United States)

    Malabarba, Artur S L; Linden, Noah; Short, Anthony J

    2015-12-01

    We study the equilibration behavior of a quantum particle in a one-dimensional box, with respect to a coarse-grained position measurement (whether it lies in a certain spatial window or not). We show that equilibration in this context indeed takes place and does so very rapidly, in a time comparable to the time for the initial wave packet to reach the edges of the box. We also show that, for this situation, the equilibration behavior is relatively insensitive to the precise choice of position measurements or initial condition.
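The coarse-grained position measurement described above can be illustrated numerically. The sketch below (a hypothetical illustration, not the authors' computation) expands a state in the box eigenstates and evaluates the probability of finding the particle inside a spatial window; units with ħ = 2m = 1 are assumed, so E_n = ((n+1)π/L)².

```python
import math, cmath

def window_probability(c, L, window, t, n_grid=400):
    """Probability that a particle in a 1-D box of width L lies inside
    window = (x0, x1) at time t, for a state with coefficients c[n] over
    the eigenstates sqrt(2/L)*sin((n+1)*pi*x/L).  Units: hbar = 2m = 1,
    so E_n = ((n+1)*pi/L)**2.  Midpoint-rule integration over n_grid cells."""
    dx = L / n_grid
    p = 0.0
    for i in range(n_grid):
        x = (i + 0.5) * dx
        if window[0] <= x <= window[1]:
            amp = sum(
                cn * math.sqrt(2 / L) * math.sin((n + 1) * math.pi * x / L)
                * cmath.exp(-1j * ((n + 1) * math.pi / L) ** 2 * t)
                for n, cn in enumerate(c)
            )
            p += abs(amp) ** 2 * dx
    return p

# Equal superposition of the two lowest eigenstates of a unit box.
c = [1 / math.sqrt(2), 1 / math.sqrt(2)]
# The probability of finding the particle in the left half oscillates in
# time; coarse-grained equilibration means it settles near the window's
# fractional size once many dephased levels contribute.
print(window_probability(c, 1.0, (0.0, 0.5), 0.0))
```

Tracking this quantity over a range of t values shows the rapid relaxation toward the coarse-grained equilibrium value that the record describes.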

  4. Small-scale spatial cognition in pigeons.

    Science.gov (United States)

    Cheng, Ken; Spetch, Marcia L; Kelly, Debbie M; Bingman, Verner P

    2006-05-01

    Roberts and Van Veldhuizen's [Roberts, W.A., Van Veldhuizen, N., 1985. Spatial memory in pigeons on the radial maze. J. Exp. Psychol.: Anim. Behav. Proc. 11, 241-260] study on pigeons in the radial maze sparked research on landmark use by pigeons in lab-based tasks as well as variants of the radial-maze task. Pigeons perform well on open-field versions of the radial maze, with feeders scattered on the laboratory floor. Pigeons can also be trained to search precisely for buried food. The search can be based on multiple landmarks, but is sometimes controlled by just one or two landmarks, with the preferred landmarks varying across individuals. Findings are similar in landmark-based searching on a computer monitor and on a lab floor, despite many differences between the two kinds of tasks. A number of general learning principles are found in landmark-based searching, such as cue competition, generalization and peak shift, and selective attention. Pigeons also learn the geometry of the environment in which they are searching. Neurophysiological studies have implicated the hippocampal formation (HF) in avian spatial cognition, with the right hippocampus hypothesized to play a more important role in the spatial recognition of goal locations. Most recently, single-cell recording from the pigeon's hippocampal formation has revealed cells with different properties from the classic 'place' cells of rats, as well as differences in the two sides of the hippocampus.

  5. Registered Replication Report: Testing Disruptive Effects of Irrelevant Speech on Visual-Spatial Working Memory

    Directory of Open Access Journals (Sweden)

    Tatiana Kvetnaya

    2018-04-01

    Full Text Available A Partial Replication of “Functional Equivalence of Verbal and Spatial Information in Serial Short-Term Memory” (Jones, Farrand, Stuart, & Morris, 1995; Experiment 4). The irrelevant speech effect (ISE)—the phenomenon that background speech impairs serial recall of visually presented material—has been widely used for examining the structure of short-term memory. In Experiment 4, Jones, Farrand, Stuart, and Morris (1995) employed the ISE to demonstrate that impairment of performance is determined by the changing-state characteristics of the material, rather than its modality of origin. The present study directly replicated the spatial condition of Experiment 4 with N = 40 German participants. In contrast to the original findings, no main effect of sound type was observed, F(2, 78) = 0.81, p = .450, ηp² = .02. The absence of an ISE in the spatial domain does not support the changing state hypothesis.

  6. Historical spatial reconstruction of a spawning-aggregation fishery.

    Science.gov (United States)

    Buckley, Sarah M; Thurstan, Ruth H; Tobin, Andrew; Pandolfi, John M

    2017-12-01

    Aggregations of individual animals that form for breeding purposes are a critical ecological process for many species, yet these aggregations are inherently vulnerable to exploitation. Studies of the decline of exploited populations that form breeding aggregations tend to focus on catch rate and thus often overlook reductions in geographic range. We tested the hypothesis that catch rate and site occupancy of exploited fish-spawning aggregations (FSAs) decline in synchrony over time. We used the Spanish mackerel (Scomberomorus commerson) spawning-aggregation fishery in the Great Barrier Reef as a case study. Data were compiled from historical newspaper archives, fisher knowledge, and contemporary fishery logbooks to reconstruct catch rates and exploitation trends from the inception of the fishery. Our fine-scale analysis of catch and effort data spanned 103 years (1911-2013) and revealed a spatial expansion of fishing effort. Effort shifted offshore at a rate of 9.4 nm/decade, and 2.9 newly targeted FSAs were reported/decade. Spatial expansion of effort masked the sequential exploitation, commercial extinction, and loss of 70% of exploited FSAs. After standardizing for improvements in technological innovations, average catch rates declined by 90.5% from 1934 to 2011 (from 119.4 to 11.41 fish/vessel/trip). Mean catch rate of Spanish mackerel and occupancy of exploited mackerel FSAs were not significantly related. Our study revealed a special kind of shifting spatial baseline in which a contraction in exploited FSAs occurred undetected. Knowledge of temporally and spatially explicit information on FSAs can be relevant for the conservation and management of FSA species. © 2017 Society for Conservation Biology.

  7. [Precision Nursing: Individual-Based Knowledge Translation].

    Science.gov (United States)

    Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung

    2016-12-01

    U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.

  8. Deterministic ion beam material adding technology for high-precision optical surfaces.

    Science.gov (United States)

    Liao, Wenlin; Dai, Yifan; Xie, Xuhui; Zhou, Lin

    2013-02-20

    Although ion beam figuring (IBF) provides a highly deterministic method for the precision figuring of optical components, several problems still need to be addressed, such as the limited correcting capability for mid-to-high spatial frequency surface errors and low machining efficiency for pit defects on surfaces. We propose a figuring method named deterministic ion beam material adding (IBA) technology to solve those problems in IBF. The current deterministic optical figuring mechanism, which is dedicated to removing local protuberances on optical surfaces, is enriched and developed by the IBA technology. Compared with IBF, this method can realize the uniform convergence of surface errors, where the particle transferring effect generated in the IBA process can effectively correct the mid-to-high spatial frequency errors. In addition, IBA can rapidly correct the pit defects on the surface and greatly improve the machining efficiency of the figuring process. The verification experiments are accomplished on our experimental installation to validate the feasibility of the IBA method. First, a fused silica sample with a rectangular pit defect is figured by using IBA. Through two iterations within only 47.5 min, this highly steep pit is effectively corrected, and the surface error is improved from the original 24.69 nm root mean square (RMS) to the final 3.68 nm RMS. Then another experiment is carried out to demonstrate the correcting capability of IBA for mid-to-high spatial frequency surface errors, and the final results indicate that the surface accuracy and surface quality can be simultaneously improved.

  9. Maximal Repetitions in Written Texts: Finite Energy Hypothesis vs. Strong Hilberg Conjecture

    Directory of Open Access Journals (Sweden)

    Łukasz Dębowski

    2015-08-01

    Full Text Available The article discusses two mutually-incompatible hypotheses about the stochastic mechanism of the generation of texts in natural language, which could be related to entropy. The first hypothesis, the finite energy hypothesis, assumes that texts are generated by a process with exponentially-decaying probabilities. This hypothesis implies a logarithmic upper bound for maximal repetition, as a function of the text length. The second hypothesis, the strong Hilberg conjecture, assumes that the topological entropy grows as a power law. This hypothesis leads to a hyperlogarithmic lower bound for maximal repetition. By a study of 35 written texts in German, English and French, it is found that the hyperlogarithmic growth of maximal repetition holds for natural language. In this way, the finite energy hypothesis is rejected, and the strong Hilberg conjecture is partly corroborated.
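The quantity at issue, maximal repetition (the length of the longest substring occurring at least twice in a text, overlaps allowed), can be computed directly. A minimal sketch, not tuned for corpora as long as those analyzed in the article:

```python
def maximal_repetition(text: str) -> int:
    """Length of the longest substring that occurs at least twice
    (possibly overlapping) in `text`."""
    def has_repeat(k: int) -> bool:
        seen = set()
        for i in range(len(text) - k + 1):
            sub = text[i:i + k]
            if sub in seen:
                return True
            seen.add(sub)
        return False
    # Binary search on the answer: if a length-k repeat exists,
    # every shorter length also has a repeat.
    lo, hi = 0, len(text) - 1
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if has_repeat(mid):
            lo = mid
        else:
            hi = mid - 1
    return lo

print(maximal_repetition("abracadabra"))  # "abra" occurs twice -> 4
```

Under the finite energy hypothesis this quantity should grow roughly logarithmically with text length; under the strong Hilberg conjecture, hyperlogarithmically, which is the behaviour the study reports for natural language.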

  10. Early blindness alters the spatial organization of verbal working memory.

    Science.gov (United States)

    Bottini, Roberto; Mattioni, Stefania; Collignon, Olivier

    2016-10-01

    Several studies suggest that serial order in working memory (WM) is grounded in space. For a list of ordered items held in WM, items at the beginning of the list are associated with the left side of space and items at the end of the list with the right side. This suggests that maintaining items in verbal WM is performed in strong analogy to writing these items down on a physical whiteboard for later consultation (The Mental Whiteboard Hypothesis). What drives this spatial mapping of ordered series in WM remains poorly understood. In the present study we tested whether visual experience is instrumental in establishing the link between serial order in WM and spatial processing. We tested early blind (EB), late blind (LB) and sighted individuals in an auditory WM task. Replicating previous studies, left-key responses were faster for early items in the list whereas later items facilitated right-key responses in the sighted group. The same effect was observed in LB individuals. In contrast, EB participants did not show any association between space and serial position in WM. These results suggest that early visual experience plays a critical role in linking ordered items in WM and spatial representations. The analogical spatial structure of WM may depend in part on the actual experience of using spatially organized devices (e.g., notes, whiteboards) to offload WM. These practices are largely precluded for EB individuals, who instead rely on mnemonic devices that are less spatially organized (e.g., recordings, vocal notes). The way we habitually organize information in the external world may bias the way we organize information in our WM. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. New methods for precision Moeller polarimetry*

    International Nuclear Information System (INIS)

    Gaskell, D.; Meekins, D.G.; Yan, C.

    2007-01-01

    Precision electron beam polarimetry is becoming increasingly important as parity violation experiments attempt to probe the frontiers of the standard model. In the few GeV regime, Moeller polarimetry is well suited to high-precision measurements; however, it is generally limited to use at relatively low beam currents (<10 μA). We present a novel technique that will enable precision Moeller polarimetry at very large currents, up to 100 μA. (orig.)

  12. Enhanced facilitation of spatial attention in schizophrenia.

    Science.gov (United States)

    Spencer, Kevin M; Nestor, Paul G; Valdman, Olga; Niznikiewicz, Margaret A; Shenton, Martha E; McCarley, Robert W

    2011-01-01

    While attentional functions are usually found to be impaired in schizophrenia, a review of the literature on the orienting of spatial attention in schizophrenia suggested that voluntary attentional orienting in response to a valid cue might be paradoxically enhanced. We tested this hypothesis with orienting tasks involving the cued detection of a laterally presented target stimulus. Subjects were chronic schizophrenia patients (SZ) and matched healthy control subjects (HC). In Experiment 1 (15 SZ, 16 HC), cues were endogenous (arrows) and could be valid (100% predictive) or neutral with respect to the subsequent target position. In Experiment 2 (16 SZ, 16 HC), subjects performed a standard orienting task with unpredictive exogenous cues (brightening of the target boxes). In Experiment 1, SZ showed a larger attentional facilitation effect on reaction time than HC. In Experiment 2, no clear sign of enhanced attentional facilitation was found in SZ. The voluntary, facilitatory shifting of spatial attention may be relatively enhanced in individuals with schizophrenia in comparison to healthy individuals. This effect bears resemblance to other relative enhancements of information processing in schizophrenia such as saccade speed and semantic priming. (c) 2010 APA, all rights reserved.

  13. The Fractal Market Hypothesis: Applications to Financial Forecasting

    OpenAIRE

    Blackledge, Jonathan

    2010-01-01

    Most financial modelling systems rely on an underlying hypothesis known as the Efficient Market Hypothesis (EMH) including the famous Black-Scholes formula for placing an option. However, the EMH has a fundamental flaw: it is based on the assumption that economic processes are normally distributed and it has long been known that this is not the case. This fundamental assumption leads to a number of shortcomings associated with using the EMH to analyse financial data which includes failure to ...

  14. Numerical precision control and GRACE

    International Nuclear Information System (INIS)

    Fujimoto, J.; Hamaguchi, N.; Ishikawa, T.; Kaneko, T.; Morita, H.; Perret-Gallix, D.; Tokura, A.; Shimizu, Y.

    2006-01-01

    The control of the numerical precision of large-scale computations like those generated by the GRACE system for automatic Feynman diagram calculations has become an intrinsic part of those packages. Recently, Hitachi Ltd. has developed in FORTRAN a new library HMLIB for quadruple and octuple precision arithmetic where the number of lost bits is made available. This library has been tested with success on the 1-loop radiative correction to e⁺e⁻ → e⁺e⁻τ⁺τ⁻. It is shown that the approach followed by HMLIB provides an efficient way to track down the source of numerical significance losses and to deliver high-precision results yet minimizing computing time.
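The significance loss that such a library tracks can be demonstrated in miniature. The sketch below uses Python's `decimal` module as a stand-in for extended-precision arithmetic (HMLIB itself is FORTRAN and not assumed here) to show how higher precision recovers digits destroyed by cancellation:

```python
from decimal import Decimal, getcontext

# Double precision carries ~16 significant decimal digits; subtracting
# nearly equal numbers cancels most of them ("lost bits").
a = 1.0 + 1e-15
x = (a - 1.0) * 1e15          # exact answer is 1.0
print(x)                      # noticeably off: rounding noise dominates

# Redoing the computation with 50-digit decimal arithmetic (a stand-in
# for quadruple/octuple precision) recovers the lost significance.
getcontext().prec = 50
ad = Decimal(1) + Decimal(10) ** -15
y = (ad - Decimal(1)) * Decimal(10) ** 15
print(y)                      # exactly 1
```

Counting how many digits of `x` survive the subtraction is precisely the "number of lost bits" diagnostic the record describes.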

  15. High-precision diode-laser-based temperature measurement for air refractive index compensation

    International Nuclear Information System (INIS)

    Hieta, Tuomas; Merimaa, Mikko; Vainio, Markku; Seppae, Jeremias; Lassila, Antti

    2011-01-01

    We present a laser-based system to measure the refractive index of air over a long path length. In optical distance measurements, it is essential to know the refractive index of air with high accuracy. Commonly, the refractive index of air is calculated from the properties of the ambient air using either Ciddor or Edlen equations, where the dominant uncertainty component is in most cases the air temperature. The method developed in this work utilizes direct absorption spectroscopy of oxygen to measure the average temperature of air and of water vapor to measure relative humidity. The method allows measurement of temperature and humidity over the same beam path as in optical distance measurement, providing spatially well-matching data. Indoor and outdoor measurements demonstrate the effectiveness of the method. In particular, we demonstrate an effective compensation of the refractive index of air in an interferometric length measurement at a time-variant and spatially nonhomogeneous temperature over a long time period. Further, we were able to demonstrate 7 mK RMS noise over a 67 m path length using a 120 s sample time. To our knowledge, this is the best temperature precision reported for a spectroscopic temperature measurement.

  16. [Working memory, phonological awareness and spelling hypothesis].

    Science.gov (United States)

    Gindri, Gigiane; Keske-Soares, Márcia; Mota, Helena Bolli

    2007-01-01

    Working memory, phonological awareness and spelling hypothesis. To verify the relationship between working memory, phonological awareness and spelling hypothesis in pre-school children and first graders. Participants of this study were 90 students from state schools who presented typical linguistic development. Forty students were preschoolers, with an average age of six, and 50 students were first graders, with an average age of seven. Participants were submitted to an evaluation of working memory abilities based on the Working Memory Model (Baddeley, 2000), involving the phonological loop. The phonological loop was evaluated using the Auditory Sequential Test, subtest 5 of the Illinois Test of Psycholinguistic Abilities (ITPA), Brazilian version (Bogossian & Santos, 1977), and the Meaningless Words Memory Test (Kessler, 1997). Phonological awareness abilities were investigated using the Phonological Awareness: Instrument of Sequential Assessment (CONFIAS - Moojen et al., 2003), involving syllabic and phonemic awareness tasks. Writing was characterized according to Ferreiro & Teberosky (1999). Preschoolers were able to repeat sequences of 4.80 digits and 4.30 syllables on average. Regarding phonological awareness, their performance was 19.68 at the syllabic level and 8.58 at the phonemic level. Most of the preschoolers demonstrated a pre-syllabic writing hypothesis. First graders repeated, on average, sequences of 5.06 digits and 4.56 syllables. These children presented phonological awareness scores of 31.12 at the syllabic level and 16.18 at the phonemic level, and demonstrated an alphabetic writing hypothesis. Working memory performance, phonological awareness and spelling level are inter-related, as well as being related to chronological age, development and schooling.

  17. Thorium spectrophotometric analysis with high precision

    International Nuclear Information System (INIS)

    Palmieri, H.E.L.

    1983-06-01

    An accurate and precise determination of thorium is proposed. Precision of about 0.1% is required for the determination of macroquantities of thorium processed. After an extensive literature search concerning this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin S as indicator. In order to obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically by plotting added titrant versus absorbance; here the end-point was determined by a non-linear least-squares fit, using the Fletcher and Powell minimization method and a computer program. (author)
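The graphical end-point location can be sketched as the intersection of two least-squares lines fitted to the branches of the titration curve. The data and split point below are hypothetical, and the straight-line fit is a simplification of the non-linear fit the abstract describes:

```python
def ols(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def end_point(volumes, absorbances, split):
    """Fit straight lines to the branches before and after `split`
    and return the titrant volume at their intersection."""
    b1, a1 = ols(volumes[:split], absorbances[:split])
    b2, a2 = ols(volumes[split:], absorbances[split:])
    return (a2 - a1) / (b1 - b2)

# Synthetic, illustrative data: absorbance falls until the end point
# at 5.0 ml of titrant, then stays flat.
v = [0, 1, 2, 3, 4, 6, 7, 8, 9, 10]
a = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.5, 0.5, 0.5, 0.5]
print(round(end_point(v, a, 5), 2))  # -> 5.0
```

With noisy real data the fit pools information from every point on each branch, which is what makes the titrimetric end point reproducible at the 0.1% level the record targets.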

  18. An insight in spatial corrosion prediction

    International Nuclear Information System (INIS)

    Mustaffa, Zahiraniza; Gelder, Pieter van; Hashim, Ahmad Mustafa

    2012-01-01

    Recent discoveries on fluid–structure interactions between external flows and circular cylinders placed close to a wall have added new insight into the hydrodynamics of unburied offshore pipelines laid on the sea bed. The hydrodynamics of waves and/or currents introduce vortex flows surrounding the pipeline. External corrosion formed in the pipelines was assumed to be partly caused by such fluid–structure interactions. The spatial consequences of such interactions were the interest of this study. This paper summarizes selected previous experimental and numerical work reported in the literature on these discoveries. Actual field data were utilized in this study for further validation. The characteristics of corrosion orientations in the pipelines were studied comprehensively using simple statistics, and the results are discussed. Results from the field data agreed well with the hypothesis drawn from the reported literature. The updated knowledge of this fluid–structure interaction is hoped to benefit the industry and to be constructively incorporated into current subsea pipeline designs. Highlights: ► We attempt to predict spatial corrosion in offshore pipelines. ► We validate the analysis using theories on fluid–structure interactions. ► Vortex flows are assumed to cause external corrosion on pipeline walls. ► More defects are expected for pipelines placed in areas governed by waves.

  19. The economic case for precision medicine.

    Science.gov (United States)

    Gavan, Sean P; Thompson, Alexander J; Payne, Katherine

    2018-01-01

    Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case of a precision medicine, can be prioritized by the extent that they reduce the uncertainty expressed by decision-makers.

  20. An Exercise for Illustrating the Logic of Hypothesis Testing

    Science.gov (United States)

    Lawton, Leigh

    2009-01-01

    Hypothesis testing is one of the more difficult concepts for students to master in a basic, undergraduate statistics course. Students often are puzzled as to why statisticians simply don't calculate the probability that a hypothesis is true. This article presents an exercise that forces students to lay out on their own a procedure for testing a…

  1. The Drift Burst Hypothesis

    OpenAIRE

    Christensen, Kim; Oomen, Roel; Renò, Roberto

    2016-01-01

    The Drift Burst Hypothesis postulates the existence of short-lived locally explosive trends in the price paths of financial assets. The recent US equity and Treasury flash crashes can be viewed as two high profile manifestations of such dynamics, but we argue that drift bursts of varying magnitude are an expected and regular occurrence in financial markets that can arise through established mechanisms such as feedback trading. At a theoretical level, we show how to build drift bursts into the...

  2. P value and the theory of hypothesis testing: an explanation for new researchers.

    Science.gov (United States)

    Biau, David Jean; Jolles, Brigitte M; Porcher, Raphaël

    2010-03-01

    In the 1920s, Ronald Fisher developed the theory behind the p value and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers important quantitative tools to confirm or refute their hypotheses. The p value is the probability to obtain an effect equal to or more extreme than the one observed presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they will reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine some critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that often are misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true rather than the probability of obtaining the difference observed, or one that is more extreme, considering the null is true. Another concern is the risk that an important proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
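The definition given above (the p value as the probability of an effect at least as extreme as the one observed, presuming the null hypothesis) can be made concrete with an exact binomial test; the coin-flip data here are purely illustrative:

```python
import math

def binom_two_sided_p(k: int, n: int, p0: float = 0.5) -> float:
    """Two-sided exact binomial p-value: the probability, under the null
    hypothesis P(heads) = p0, of an outcome at least as extreme as k heads
    in n flips (all outcomes no more likely than the observed one)."""
    pmf = [math.comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(n + 1)]
    threshold = pmf[k]
    return min(1.0, sum(q for q in pmf if q <= threshold + 1e-12))

# 15 heads in 20 flips of a coin assumed fair under the null:
p = binom_two_sided_p(15, 20)
print(round(p, 4))  # -> 0.0414, rejected at the conventional 0.05 threshold
```

Note that 0.0414 is the probability of data this extreme given a fair coin, not the probability that the coin is fair, which is exactly the misconception the record warns against.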

  3. MEASUREMENT AND PRECISION, EXPERIMENTAL VERSION.

    Science.gov (United States)

    Harvard Univ., Cambridge, MA. Harvard Project Physics.

    THIS DOCUMENT IS AN EXPERIMENTAL VERSION OF A PROGRAMED TEXT ON MEASUREMENT AND PRECISION. PART I CONTAINS 24 FRAMES DEALING WITH PRECISION AND SIGNIFICANT FIGURES ENCOUNTERED IN VARIOUS MATHEMATICAL COMPUTATIONS AND MEASUREMENTS. PART II BEGINS WITH A BRIEF SECTION ON EXPERIMENTAL DATA, COVERING SUCH POINTS AS (1) ESTABLISHING THE ZERO POINT, (2)…

  4. MOLIERE: Automatic Biomedical Hypothesis Generation System.

    Science.gov (United States)

    Sybrandt, Justin; Shtutman, Michael; Safro, Ilya

    2017-08-01

    Hypothesis generation is becoming a crucial time-saving technique which allows biomedical researchers to quickly discover implicit connections between important concepts. Typically, these systems operate on domain-specific fractions of public medical data. MOLIERE, in contrast, utilizes information from over 24.5 million documents. At the heart of our approach lies a multi-modal and multi-relational network of biomedical objects extracted from several heterogeneous datasets from the National Center for Biotechnology Information (NCBI). These objects include but are not limited to scientific papers, keywords, genes, proteins, diseases, and diagnoses. We model hypotheses using Latent Dirichlet Allocation applied on abstracts found near shortest paths discovered within this network, and demonstrate the effectiveness of MOLIERE by performing hypothesis generation on historical data. Our network, implementation, and resulting data are all publicly available for the broad scientific community.

  5. Debates—Hypothesis testing in hydrology: Introduction

    Science.gov (United States)

    Blöschl, Günter

    2017-03-01

    This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.

  6. Mid-embryo patterning and precision in Drosophila segmentation: Krüppel dual regulation of hunchback.

    Directory of Open Access Journals (Sweden)

    David M Holloway

    Full Text Available In early development, genes are expressed in spatial patterns which later define cellular identities and tissue locations. The mechanisms of such pattern formation have been studied extensively in early Drosophila (fruit fly) embryos. The gap gene hunchback (hb) is one of the earliest genes to be expressed in anterior-posterior (AP) body segmentation. As a transcriptional regulator for a number of downstream genes, the spatial precision of hb expression can have significant effects in the development of the body plan. To investigate the factors contributing to hb precision, we used fine spatial and temporal resolution data to develop a quantitative model for the regulation of hb expression in the mid-embryo. In particular, modelling hb pattern refinement in mid nuclear cleavage cycle 14 (NC14) reveals some of the regulatory contributions of simultaneously-expressed gap genes. Matching the model to recent data from wild-type (WT) embryos and mutants of the gap gene Krüppel (Kr) indicates that a mid-embryo Hb concentration peak important in thoracic development (at parasegment 4, PS4) is regulated in a dual manner by Kr, with low Kr concentration activating hb and high Kr concentration repressing hb. The processes of gene expression (transcription, translation, transport) are intrinsically random. We used stochastic simulations to characterize the noise generated in hb expression. We find that Kr regulation can limit the positional variability of the Hb mid-embryo border. This has been recently corroborated in experimental comparisons of WT and Kr- mutant embryos. Further, Kr regulation can decrease uncertainty in mid-embryo hb expression (i.e. contribute to a smooth Hb boundary) and decrease between-copy transcriptional variability within nuclei. Since many tissue boundaries are first established by interactions between neighbouring gene expression domains, these properties of Hb-Kr dynamics to diminish the effects of intrinsic expression noise may
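The intrinsic randomness of gene expression mentioned in this record is typically simulated with Gillespie's stochastic simulation algorithm. A minimal birth-death sketch, with hypothetical rate constants and far simpler than the hb regulation model in the article:

```python
import random

def gillespie_birth_death(k_on, k_off, t_end, seed=1):
    """Gillespie simulation of a birth-death process: production at
    constant rate k_on, degradation at rate k_off per molecule.
    Returns the molecule count at time t_end."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        total = k_on + k_off * n           # total propensity
        t += rng.expovariate(total)        # time to next reaction
        if t > t_end:
            return n
        if rng.random() < k_on / total:    # choose which reaction fires
            n += 1                         # production event
        else:
            n -= 1                         # degradation event

# Steady-state mean is k_on/k_off = 50; individual runs fluctuate
# around it, which is the "intrinsic noise" in expression levels.
counts = [gillespie_birth_death(10.0, 0.2, 100.0, seed=s) for s in range(200)]
print(sum(counts) / len(counts))
```

The spread of `counts` across runs is the between-copy variability analog; in the article's setting, regulation by Kr is what narrows the corresponding spread in hb boundary position.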

  7. Antiaging therapy: a prospective hypothesis

    Directory of Open Access Journals (Sweden)

    Shahidi Bonjar MR

    2015-01-01

    Full Text Available Mohammad Rashid Shahidi Bonjar,1 Leyla Shahidi Bonjar2 1School of Dentistry, Kerman University of Medical Sciences, Kerman, Iran; 2Department of Pharmacology, College of Pharmacy, Kerman University of Medical Sciences, Kerman, Iran Abstract: This hypothesis proposes a new prospective approach to slow the aging process in older humans. The hypothesis could lead to developing new treatments for age-related illnesses and help humans to live longer. This hypothesis has no previous documentation in scientific media and has no protocol. Scientists have presented evidence that systemic aging is influenced by peculiar molecules in the blood. Researchers at Albert Einstein College of Medicine, New York, and Harvard University in Cambridge discovered elevated titers of aging-related molecules (ARMs) in blood, which trigger a cascade of the aging process in mice; they also indicated that the process can be slowed or even reversed. By inhibiting the production of ARMs, they could reduce age-related cognitive and physical declines. The present hypothesis offers a new approach to translate these findings into medical treatment: extracorporeal adjustment of ARMs would lead to slower rates of aging. A prospective “antiaging blood filtration column” (AABFC) is a nanotechnological device that would fulfill the central role in this approach. An AABFC would set a near-youth homeostatic titer of ARMs in the blood. In this regard, the AABFC immobilizes ARMs from the blood while blood passes through the column. The AABFC harbors antibodies against ARMs. ARM antibodies would be conjugated irreversibly to ARMs on contact surfaces of the reaction platforms inside the AABFC until near-youth homeostasis is attained. The treatment is performed with the aid of a blood-circulating pump. Similar to a renal dialysis machine, blood would circulate from the body to the AABFC and from there back to the body in a closed circuit until ARMs were sufficiently depleted from the blood. The

  8. A hypothesis of a redistribution of North Atlantic swordfish based on changing ocean conditions

    Science.gov (United States)

    Schirripa, Michael J.; Abascal, F.; Andrushchenko, Irene; Diaz, Guillermo; Mejuto, Jaime; Ortiz, Maricio; Santos, M. N.; Walter, John

    2017-06-01

    Conflicting trends in indices of abundance for North Atlantic swordfish starting in the mid-to-late 1990s, in the form of fleet-specific catch-per-unit-effort (CPUE), suggest the possibility of a spatial shift in abundance to follow areas of preferred temperature. The observed changes in the direction of the CPUEs correspond with changes in trends in the summer Atlantic Multidecadal Oscillation (AMO), a long-term mode of variability of North Atlantic sea surface temperature. To test the hypothesis of a relation between the CPUE and the AMO, the CPUEs were made spatially explicit by re-estimating them using an 'areas-as-fleets' approach. These new CPUEs were then used to create alternative stock histories. The residuals of the fit were then regressed against the summer AMO. Significant, and opposite, relations were found in the regressions between eastern and western Atlantic areas. When the AMO was in a warm phase, the CPUEs in the western (eastern) areas were higher (lower) than predicted by the assessment model fit. Given the observed temperature tolerance limits of swordfish, it is possible that their preferred habitat, their prey species, or both have shifted spatial distributions, resulting in conflicting CPUE indices. Because the available CPUE time series overlaps with only one change in the sign of the AMO (~1995), it is not clear whether this is a directional or cyclical trend. Given the relatively localized nature of many of the fishing fleets, and the difficulty of separating fleet effects from changes in oceanography, we feel that it is critical to create CPUE indices by combining data across similar fleets that fish in similar areas. This approach allowed us to evaluate area-specific catch rates, which provided the power to detect basin-wide responses to changing oceanography, a critical step for providing robust management advice in a changing climate.
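
    The residual-regression step described above can be sketched with ordinary least squares: fit the assessment model's CPUE residuals against the summer AMO index and inspect the slope and correlation. The series below are synthetic stand-ins with a built-in positive dependence, not the actual assessment data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins: annual summer AMO anomalies and western-area CPUE
# residuals, constructed to mimic the reported pattern (warm AMO phase ->
# western CPUE higher than the assessment model predicts).
amo = rng.normal(0.0, 0.2, size=40)                    # AMO index anomaly by year
resid_west = 0.8 * amo + rng.normal(0.0, 0.1, 40)      # model residuals by year

# OLS fit of residuals on the AMO via least squares
X = np.column_stack([np.ones_like(amo), amo])
beta, *_ = np.linalg.lstsq(X, resid_west, rcond=None)
intercept, slope = beta

# Pearson correlation as a quick strength-of-relation summary
r = np.corrcoef(amo, resid_west)[0, 1]
```

For an eastern-area series the construction would flip the sign of the dependence, yielding the "significant, and opposite, relations" the record describes.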

  9. Precision Experiments at LEP

    CERN Document Server

    de Boer, Wim

    2015-01-01

    The Large Electron Positron Collider (LEP) established the Standard Model (SM) of particle physics with unprecedented precision, including all its radiative corrections. These led to predictions for the masses of the top quark and Higgs boson, which were beautifully confirmed later on. After these precision measurements, the Nobel Prize in Physics was awarded in 1999 jointly to 't Hooft and Veltman "for elucidating the quantum structure of electroweak interactions in physics". Another hallmark of the LEP results was the precise measurement of the gauge coupling constants, which excluded unification of the forces within the SM, but allowed unification within the supersymmetric extension of the SM. This increased the interest in Supersymmetry (SUSY) and Grand Unified Theories, especially since the SM has no candidate for the elusive dark matter, while Supersymmetry provides an excellent candidate for dark matter. In addition, Supersymmetry removes the quadratic divergencies of the SM and predicts the Hig...

  10. Precision Health Economics and Outcomes Research to Support Precision Medicine: Big Data Meets Patient Heterogeneity on the Road to Value

    Directory of Open Access Journals (Sweden)

    Yixi Chen

    2016-11-01

    Full Text Available The “big data” era represents an exciting opportunity to utilize powerful new sources of information to reduce clinical and health economic uncertainty on an individual patient level. In turn, health economic outcomes research (HEOR) practices will need to evolve to accommodate individual patient–level HEOR analyses. We propose the concept of “precision HEOR”, which utilizes a combination of costs and outcomes derived from big data to inform healthcare decision-making that is tailored to highly specific patient clusters or individuals. To explore this concept, we discuss the current and future roles of HEOR in health sector decision-making, big data and predictive analytics, and several key HEOR contexts in which big data and predictive analytics might transform traditional HEOR into precision HEOR. The guidance document addresses issues related to the transition from traditional to precision HEOR practices, the evaluation of patient similarity analysis and its appropriateness for precision HEOR analysis, and future challenges to precision HEOR adoption. Precision HEOR should make precision medicine more realizable by aiding and adapting healthcare resource allocation. The combined hopes for precision medicine and precision HEOR are that individual patients receive the best possible medical care while overall healthcare costs remain manageable or become more cost-efficient.

  11. Uniform functional structure across spatial scales in an intertidal benthic assemblage.

    Science.gov (United States)

    Barnes, R S K; Hamylton, Sarah

    2015-05-01

    To investigate the causes of the remarkable similarity of emergent assemblage properties that has been demonstrated across disparate intertidal seagrass sites and assemblages, this study examined whether their emergent functional-group metrics are scale-related, by testing the null hypothesis that functional diversity and the suite of dominant functional groups in seagrass-associated macrofauna are robust structural features of such assemblages and do not vary spatially across nested scales within a 0.4 ha area. This was carried out via a lattice of 64 spatially referenced stations. Although densities of individual components were patchily dispersed across the locality, rank orders of importance of the 14 functional groups present, their overall functional diversity and evenness, and the proportions of the total individuals contained within each showed, in contrast, statistically significant spatial uniformity, even at the smallest areal scales examined. Analysis of the functional groups in their geospatial context also revealed weaker than expected levels of spatial autocorrelation, and then only at the smaller scales and amongst the most dominant groups, and only a small number of negative correlations occurred between the proportional importances of the individual groups. In effect, such patterning was a surface veneer overlying remarkable stability of assemblage functional composition across all spatial scales. Although assemblage species composition is known to be homogeneous in some soft-sediment marine systems over equivalent scales, this combination of patchy individual components yet basically constant functional-group structure seems as yet unreported. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Does Portuguese economy support crude oil conservation hypothesis?

    International Nuclear Information System (INIS)

    Bashiri Behmiri, Niaz; Pires Manso, José R.

    2012-01-01

    This paper examines cointegration relationships and the Granger causality nexus in a trivariate framework among oil consumption, economic growth and the international oil price in Portugal. For this purpose, we employ two Granger causality approaches: the Johansen cointegration test with a vector error correction model (VECM), and the Toda–Yamamoto approach. The cointegration test proves the existence of a long-run equilibrium relationship among these variables, and the VECM and Toda–Yamamoto Granger causality tests indicate that there is bidirectional causality between crude oil consumption and economic growth (the feedback hypothesis). Therefore, the Portuguese economy does not support the crude oil conservation hypothesis. Consequently, policymakers should consider that implementing oil conservation and environmental policies may negatively affect Portuguese economic growth. - Highlights: ► We examine Granger causality among oil consumption, GDP and oil price in Portugal. ► VECM and Toda–Yamamoto tests found bidirectional causality between oil and GDP. ► The Portuguese economy does not support the crude oil conservation hypothesis.
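
    The Granger-causality machinery used in such studies can be illustrated with a minimal lag-regression F-test (a simplified stand-in for the full VECM and Toda–Yamamoto procedures): compare a restricted autoregression of one series on its own lags with an unrestricted one that adds lags of the other series. The two simulated series below are constructed so that "oil" drives "gdp" by design; they are not Portuguese data.

```python
import numpy as np

def granger_f(y, x, p):
    """F-statistic for 'x Granger-causes y' with p lags.
    Restricted model: y_t on lags of y. Unrestricted: add lags of x."""
    n = len(y)
    rows = n - p
    Y = y[p:]
    lag_y = np.column_stack([y[p - k : n - k] for k in range(1, p + 1)])
    lag_x = np.column_stack([x[p - k : n - k] for k in range(1, p + 1)])
    ones = np.ones((rows, 1))
    Xr = np.hstack([ones, lag_y])             # restricted regressors
    Xu = np.hstack([ones, lag_y, lag_x])      # unrestricted regressors
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df_u = rows - Xu.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df_u)

rng = np.random.default_rng(1)
oil = rng.normal(size=202)
gdp = np.zeros(202)
for t in range(2, 202):
    # gdp depends on lagged oil by construction, so oil -> gdp should test large
    gdp[t] = 0.3 * gdp[t - 1] + 0.5 * oil[t - 1] + 0.1 * rng.normal()

f_oil_to_gdp = granger_f(gdp, oil, p=2)   # expected: large F (causality present)
f_gdp_to_oil = granger_f(oil, gdp, p=2)   # expected: small F (no causality)
```

A bidirectional ("feedback") finding, as reported for Portugal, corresponds to both F-statistics being significant.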

  13. A comparator-hypothesis account of biased contingency detection.

    Science.gov (United States)

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.
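
    The Rescorla-Wagner model used as the comparison benchmark above can be sketched in a few lines. The simulation below shows how a dense outcome (high marginal probability) lends a non-contingent cue positive associative strength early in training; the learning rate, probabilities, and trial count are illustrative choices, not fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.1                     # learning rate (illustrative)
p_cue, p_outcome = 0.5, 0.9     # zero contingency: outcome independent of the cue

v_cue, v_ctx = 0.0, 0.0         # associative strengths; context is always present
trace = []
for _ in range(100):
    cue_present = rng.random() < p_cue
    outcome = 1.0 if rng.random() < p_outcome else 0.0
    prediction = v_ctx + (v_cue if cue_present else 0.0)
    error = outcome - prediction        # shared Rescorla-Wagner prediction error
    v_ctx += alpha * error              # context updated on every trial
    if cue_present:
        v_cue += alpha * error          # cue updated only when present
    trace.append(v_cue)

# Despite zero cue-outcome contingency, the cue acquires positive strength
# pre-asymptotically because the frequent outcome is underpredicted early on.
mean_v_cue = float(np.mean(trace))
```

With prolonged training the context alone comes to predict the outcome and the cue's strength decays back toward zero, which is why the bias is strongest early in learning.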

  14. Precision Medicine, Cardiovascular Disease and Hunting Elephants.

    Science.gov (United States)

    Joyner, Michael J

    2016-01-01

    Precision medicine postulates improved prediction, prevention, diagnosis and treatment of disease based on patient specific factors especially DNA sequence (i.e., gene) variants. Ideas related to precision medicine stem from the much anticipated "genetic revolution in medicine" arising seamlessly from the human genome project (HGP). In this essay I deconstruct the concept of precision medicine and raise questions about the validity of the paradigm in general and its application to cardiovascular disease. Thus far precision medicine has underperformed based on the vision promulgated by enthusiasts. While niche successes for precision medicine are likely, the promises of broad based transformation should be viewed with skepticism. Open discussion and debate related to precision medicine are urgently needed to avoid misapplication of resources, hype, iatrogenic interventions, and distraction from established approaches with ongoing utility. Failure to engage in such debate will lead to negative unintended consequences from a revolution that might never come. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Counselor Hypothesis Testing Strategies: The Role of Initial Impressions and Self-Schema.

    Science.gov (United States)

    Strohmer, Douglas C.; Chiodo, Anthony L.

    1984-01-01

    Presents two experiments concerning confirmatory bias in the way counselors collect data to test their hypotheses. Counselors were asked either to develop their own clinical hypothesis or were given a hypothesis to test. Confirmatory bias in hypothesis testing was not supported in either experiment. (JAC)

  16. A Molecular–Structure Hypothesis

    Directory of Open Access Journals (Sweden)

    Jan C. A. Boeyens

    2010-11-01

    Full Text Available The self-similar symmetry that occurs between atomic nuclei, biological growth structures, the solar system, globular clusters and spiral galaxies suggests that a similar pattern should characterize atomic and molecular structures. This possibility is explored in terms of the current molecular-structure hypothesis and its extension into four-dimensional space-time. It is concluded that a quantum molecule only has structure in four dimensions and that classical (Newtonian) structure, which occurs in three dimensions, cannot be simulated by quantum-chemical computation.

  17. Comparison of the precision of three commonly used GPS models

    Directory of Open Access Journals (Sweden)

    E Chavoshi

    2016-04-01

    Full Text Available Introduction: Advances in various fields of science have changed the methods used to determine geographical location. Precision farming involves new technology that gives farmers the opportunity to monitor and evaluate factors such as nutrients, soil moisture available to plants, and soil physical and chemical characteristics at spatial resolutions from less than a centimeter to several meters. GPS receivers are used in precision farming operations with the following specified accuracies: 1) monitoring of crops and soil sampling (less than one meter); 2) application of fertilizer, pesticide and seed (less than half a meter); 3) transplantation and row cultivation (less than 4 cm) (Perez et al., 2011). In one application of GPS in agriculture, a route-guidance system for precision-farming tractors was designed to reduce path error, informing the driver of deviations from the specified path in the range of 50 to 300 mm via an improved display (Perez et al., 2011). In another study, an automatic guidance system based on RTK-GPS technology was used for precision tillage operations between and within rows, passing within 50 mm of drip irrigation pipe without damage to the pipe or the crop (Abidine et al., 2004). In a further study comparing the accuracy and precision of receivers, five different models of Trimble GPS devices were used to map 15 stations; the results indicated that the minimum error belonged to the Geo XT model, with an accuracy of 91 cm, and the maximum error to the Pharos model, with an accuracy of 5.62 m (Kindra et al., 2006). Due to the increasing use of GPS receivers in agriculture, as well as the lack of trust in the real accuracy and precision of receivers, this study aimed to compare the positioning accuracy and precision of three commonly used GPS receiver models, to specify receivers with the lowest error for precision
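
    Accuracy and precision as used in such comparisons can be computed from repeated fixes at a surveyed point: accuracy is the mean Euclidean distance of the fixes from the true position, while precision is their spread about their own centroid. The coordinates below are made-up illustrative values, not logs from any of the receivers tested.

```python
import math

# Synthetic easting/northing fixes (metres) logged around a surveyed station
# at (0, 0); a receiver can be precise (tight cluster) yet inaccurate (offset).
fixes = [(0.6, 0.4), (0.9, -0.2), (0.4, 0.7), (1.1, 0.1), (0.7, 0.3)]
true_pt = (0.0, 0.0)

def euclid(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Accuracy: mean distance of each fix from the known true position
accuracy = sum(euclid(f, true_pt) for f in fixes) / len(fixes)

# Precision: mean distance of each fix from the centroid of the fixes
cx = sum(f[0] for f in fixes) / len(fixes)
cy = sum(f[1] for f in fixes) / len(fixes)
precision = sum(euclid(f, (cx, cy)) for f in fixes) / len(fixes)
```

Here the cluster of fixes is tight but offset from the true point, so precision comes out smaller than accuracy, the classic signature of a systematic bias.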

  18. Principles of precision medicine in stroke.

    Science.gov (United States)

    Hinman, Jason D; Rost, Natalia S; Leung, Thomas W; Montaner, Joan; Muir, Keith W; Brown, Scott; Arenillas, Juan F; Feldmann, Edward; Liebeskind, David S

    2017-01-01

    The era of precision medicine has arrived and conveys tremendous potential, particularly for stroke neurology. The diagnosis of stroke, its underlying aetiology, theranostic strategies, recurrence risk and path to recovery are populated by a series of highly individualised questions. Moreover, the phenotypic complexity of a clinical diagnosis of stroke makes a simple genetic risk assessment only partially informative on an individual basis. The guiding principles of precision medicine in stroke underscore the need to identify, value, organise and analyse the multitude of variables obtained from each individual to generate a precise approach to optimise cerebrovascular health. Existing data may be leveraged with novel technologies, informatics and practical clinical paradigms to apply these principles in stroke and realise the promise of precision medicine. Importantly, precision medicine in stroke will only be realised once efforts to collect, value and synthesise the wealth of data collected in clinical trials and routine care starts. Stroke theranostics, the ultimate vision of synchronising tailored therapeutic strategies based on specific diagnostic data, demand cerebrovascular expertise on big data approaches to clinically relevant paradigms. This review considers such challenges and delineates the principles on a roadmap for rational application of precision medicine to stroke and cerebrovascular health. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  19. Precision medicine needs pioneering clinical bioinformaticians.

    Science.gov (United States)

    Gómez-López, Gonzalo; Dopazo, Joaquín; Cigudosa, Juan C; Valencia, Alfonso; Al-Shahrour, Fátima

    2017-10-25

    Success in precision medicine depends on accessing high-quality genetic and molecular data from large, well-annotated patient cohorts that couple biological samples to comprehensive clinical data, which in conjunction can lead to effective therapies. From such a scenario emerges the need for a new professional profile, an expert bioinformatician with training in clinical areas who can make sense of multi-omics data to improve therapeutic interventions in patients, and the design of optimized basket trials. In this review, we first describe the main policies and international initiatives that focus on precision medicine. Secondly, we review the currently ongoing clinical trials in precision medicine, introducing the concept of 'precision bioinformatics', and we describe current pioneering bioinformatics efforts aimed at implementing tools and computational infrastructures for precision medicine in health institutions around the world. Thirdly, we discuss the challenges related to the clinical training of bioinformaticians, and the urgent need for computational specialists capable of assimilating medical terminologies and protocols to address real clinical questions. We also propose some skills required to carry out common tasks in clinical bioinformatics and some tips for emergent groups. Finally, we explore the future perspectives and the challenges faced by precision medicine bioinformatics. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Precision validation of MIPAS-Envisat products

    Directory of Open Access Journals (Sweden)

    C. Piccolo

    2007-01-01

    Full Text Available This paper discusses the variation and validation of the precision, or estimated random error, associated with the ESA Level 2 products from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS). This quantity represents the propagation of the radiometric noise from the spectra through the retrieval process into the Level 2 profile values. The noise itself varies with time, steadily rising between ice decontamination events, but the Level 2 precision has a greater variation due to atmospheric temperature, which controls the total radiance received. Hence, for all species, the precision varies latitudinally/seasonally with temperature, with a small superimposed temporal structure determined by the degree of ice contamination on the detectors. The precision validation involves comparing two MIPAS retrievals at the intersections of ascending/descending orbits. For 5 days per month of full-resolution MIPAS operation, the standard deviation of the matching profile pairs is computed and compared with the precision given in the MIPAS Level 2 data, except for NO2, since it has a large diurnal variation between ascending/descending intersections. Even taking into account the propagation of the pressure-temperature retrieval errors into the VMR retrieval, the standard deviation of the matching pairs is usually a factor of 1–2 larger than the precision. This is thought to be due to effects such as horizontal inhomogeneity of the atmosphere and instability of the retrieval.
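
    The consistency check described here can be sketched numerically: if two independent retrievals each carry random error sigma, their difference has standard deviation sqrt(2)*sigma, so the ratio std(differences)/(sqrt(2)*stated precision) should be near 1 when the precision is well estimated, and rises above 1 when extra variability (such as horizontal inhomogeneity) contributes. The numbers below are simulated, not MIPAS data.

```python
import math
import random

rng = random.Random(7)
sigma = 0.5        # stated single-profile precision (arbitrary units)
n_pairs = 5000

def ratio(extra_sd):
    """std of ascending/descending profile differences, divided by
    sqrt(2) times the stated precision."""
    diffs = []
    for _ in range(n_pairs):
        truth = rng.gauss(0.0, 1.0)         # true value at the orbit crossover
        geo = rng.gauss(0.0, extra_sd)      # unmodelled real atmospheric difference
        asc = truth + rng.gauss(0.0, sigma)
        desc = truth + geo + rng.gauss(0.0, sigma)
        diffs.append(asc - desc)
    mean = sum(diffs) / n_pairs
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n_pairs - 1))
    return sd / (math.sqrt(2.0) * sigma)

r_ideal = ratio(0.0)   # radiometric noise only: ratio near 1
r_inhom = ratio(1.0)   # with horizontal inhomogeneity: ratio exceeds 1
```

The "factor 1–2 larger" finding in the record corresponds to the second case: the pair differences carry real atmospheric variability on top of the estimated random error.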