Airborne Crowd Density Estimation
Meynberg, O.; Kuschk, G.
2013-10-01
This paper proposes a new method for estimating human crowd densities from aerial imagery. Applications benefiting from an accurate crowd monitoring system are mainly found in the security sector. Crowd density estimation is normally done with in-situ camera systems mounted at elevated locations, although this is not appropriate for very large crowds with thousands of people. Using airborne camera systems in these scenarios is a new research topic. Our method uses a preliminary filtering of the whole image space by suitable and fast interest point detection, resulting in a number of image regions that possibly contain human crowds. These candidates are validated by transforming the corresponding image patches into a low-dimensional and discriminative feature space and classifying the results using a support vector machine (SVM). The feature space is spanned by texture features computed by applying a Gabor filter bank with varying scale and orientation to the image patches. For evaluation, we use 5 different image datasets acquired by the 3K+ aerial camera system of the German Aerospace Center during real mass events such as concerts or football games. To evaluate the robustness and generality of our method, these datasets are taken from different flight heights between 800 m and 1500 m above ground (keeping a fixed focal length) and under varying daylight and shadow conditions. The results of our crowd density estimation are evaluated against a reference data set obtained by manually labeling tens of thousands of individual persons in the corresponding datasets, and show that our method is able to estimate human crowd densities in challenging realistic scenarios.
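The Gabor-filter texture features described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: it builds a small bank of real-valued Gabor kernels over a few scales and orientations, convolves an image patch with each, and stacks simple response statistics into a low-dimensional feature vector that a classifier such as an SVM could consume. Kernel size, wavelengths, and the sigma-to-wavelength ratio are all illustrative choices.

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a Gabor kernel: a Gaussian-windowed cosine grating."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength)
    return envelope * carrier

def gabor_features(patch, wavelengths=(4, 8), orientations=4):
    """Mean/std of filter responses -> low-dimensional texture vector."""
    feats = []
    for lam in wavelengths:
        for k in range(orientations):
            theta = k * np.pi / orientations
            kern = gabor_kernel(15, lam, theta, sigma=0.5 * lam)
            # circular convolution via FFT (adequate for this sketch)
            resp = np.fft.irfft2(np.fft.rfft2(patch) *
                                 np.fft.rfft2(kern, patch.shape), patch.shape)
            feats.extend([np.abs(resp).mean(), resp.std()])
    return np.asarray(feats)

patch = np.random.default_rng(0).random((32, 32))
fv = gabor_features(patch)
print(fv.shape)  # 2 wavelengths x 4 orientations x 2 statistics = (16,)
```

In a full pipeline, vectors like `fv`, computed for each candidate region, would be fed to an SVM for crowd/non-crowd classification.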
Design and Environment, 1972
1972-01-01
Three-part report pinpointing problems and uncovering solutions for the dual concepts of density (ratio of people to space) and crowding (psychological response to density). Section one, "A Primer on Crowding," reviews new psychological and social findings; section two, "Density in the Suburbs," shows conflict between status quo and increased…
Crowd Art: Density and Flow Based Crowd Motion Design
Jordao, Kevin; Charalambous, Panayiotis; Christie, Marc; Pettré, Julien; Cani, Marie-Paule
2015-01-01
Artists and designers of animations and games need solutions to easily populate large virtual environments with crowds that satisfy desired visual features. This paper presents a method to intuitively populate virtual environments by specifying two key features: localized density, being the amount of agents per unit of surface, and localized flow, being the direction in which agents move through a unit of surface. The technique we propose is also time-independent,…
Real-Time Density-Based Crowd Simulation
van Toll, W.G.; Cook IV, A.F.; Geraerts, R.J.
2012-01-01
Virtual characters in games and simulations often need to plan visually convincing paths through a crowded environment. This paper describes how crowd density information can be used to guide a large number of characters through a crowded environment. Crowd density information helps characters avoid
Realistic Crowd Simulation with Density-Based Path Planning
van Toll, W.G.; Cook IV, A.F.; Geraerts, R.J.
2012-01-01
Virtual characters in games and simulations often need to plan visually convincing paths through a crowded environment. This paper describes how crowd density information can be used to guide a large number of characters through a crowded environment. Crowd density information helps characters avoid
Crowd Analysis by Using Optical Flow and Density Based Clustering
DEFF Research Database (Denmark)
Santoro, Francesco; Pedro, Sergio; Tan, Zheng-Hua;
2010-01-01
In this paper, we present a system to detect and track crowds in a video sequence captured by a camera. In a first step, we compute optical flows by means of pyramidal Lucas-Kanade feature tracking. Afterwards, a density based clustering is used to group similar vectors. In the last step, a crowd tracker is applied in every frame, allowing us to detect and track the crowds. Our system gives the output as a graphic overlay, i.e. it adds arrows and colors to the original frame sequence, in order to identify crowds and their movements. For the evaluation, we check when our system detects certain events on the crowds, such as merging, splitting and collision.
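The clustering stage of a pipeline like this can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the optical-flow step (pyramidal Lucas-Kanade, e.g. OpenCV's `calcOpticalFlowPyrLK`) is replaced by synthetic flow vectors, and a minimal DBSCAN-style density-based clustering groups vectors that are close in both position and motion. The motion weight and the `eps`/`min_pts` values are arbitrary choices for the sketch.

```python
import numpy as np

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN-style clustering; returns a label per point (-1 = noise)."""
    n = len(points)
    labels = np.full(n, -1)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue                      # already assigned, or not a core point
        labels[i] = cluster
        stack = list(neighbors[i])
        while stack:                      # grow the cluster from core points
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_pts:
                    stack.extend(neighbors[j])
        cluster += 1
    return labels

rng = np.random.default_rng(0)
# two synthetic "crowds": positions around (0,0) and (10,10), opposite motion
pos = np.vstack([rng.normal((0, 0), 0.3, (15, 2)),
                 rng.normal((10, 10), 0.3, (15, 2))])
vel = np.vstack([np.tile((1.0, 0.0), (15, 1)),
                 np.tile((-1.0, 0.0), (15, 1))])
feats = np.hstack([pos, 3.0 * vel])       # weight motion against position
labels = dbscan(feats, eps=2.0, min_pts=3)
print(len(set(labels.tolist()) - {-1}))   # two coherent crowd clusters
```

In a real system the per-cluster labels would drive the graphic overlay (arrows and colors) described in the abstract.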
L0 Regularized Stationary-time Estimation for Crowd Analysis.
Yi, Shuai; Wang, Xiaogang; Lu, Cewu; Jia, Jiaya; Li, Hongsheng
2016-04-29
In this paper, we tackle the problem of stationary crowd analysis, which is as important as modeling mobile groups in crowd scenes and finds many important applications in crowd surveillance. Our key contribution is a robust algorithm for estimating how long a foreground pixel remains stationary. This is much more challenging than background subtraction alone, because failure at a single frame due to local movement of objects, lighting variation, or occlusion could lead to large errors in stationary-time estimation. To achieve robust and accurate estimation, sparse constraints along the spatial and temporal dimensions are jointly added through mixed partials (second-order gradients) to shape a 3D stationary-time map, formulated as an L0 optimization problem. Beyond background subtraction, the method distinguishes among different foreground objects that are close or overlapping in the spatio-temporal space by using a locally shared foreground codebook. The proposed techniques are further demonstrated through three applications. 1) Based on the results of stationary-time estimation, twelve descriptors are proposed to detect four types of stationary crowd activities. 2) The averaged stationary-time map is estimated to analyze crowd scene structures. 3) The result of stationary-time estimation is also used to study the influence of stationary crowd groups on traffic patterns.
Estimating the crowding level with a neuro-fuzzy classifier
Boninsegna, Massimo; Coianiz, Tarcisio; Trentin, Edmondo
1997-07-01
This paper introduces a neuro-fuzzy system for the estimation of the crowding level in a scene. Monitoring the number of people present in a given indoor environment is a requirement in a variety of surveillance applications. In the present work, crowding has to be estimated from the image processing of visual scenes collected via a TV camera. A suitable preprocessing of the images, along with an ad hoc feature extraction process, is discussed. Estimation of the crowding level in the feature space is described in terms of a fuzzy decision rule, which relies on the membership of input patterns to a set of partially overlapping crowding classes, including doubt classifications and outliers. A society of neural networks, either multilayer perceptrons or hyper radial basis functions, is trained to model individual class-membership functions. Integration of the neural nets within the fuzzy decision rule results in an overall neuro-fuzzy classifier. Important topics concerning the generalization ability, the robustness, the adaptivity and the performance evaluation of the system are explored. Experiments with real-world data were accomplished, comparing the present approach with statistical pattern recognition techniques, namely linear discriminant analysis and nearest neighbor. Experimental results validate the neuro-fuzzy approach to a large extent. The system is currently working successfully as a part of a monitoring system in the Dinegro underground station in Genoa, Italy.
Wang, Jinghong; Lo, Siuming; Wang, Qingsong; Sun, Jinhua; Mu, Honglin
2013-08-01
Crowd density is a key factor that influences the moving characteristics of a large group of people during a large-scale evacuation. In this article, the macro features of crowd flow and subsequent rescue strategies were considered, and a series of characteristic crowd densities that affect large-scale people movement, as well as the maximum bearing density when the crowd is extremely congested, were analyzed. On the basis of characteristic crowd densities, queuing theory was applied to simulate crowd movement. Accordingly, the moving characteristics of the crowd, and the effects on rescue strategies of typical crowd density, which is viewed as representing the crowd's arrival intensity in front of the evacuation passageways, were studied. Furthermore, a "risk axle of crowd density" is proposed to determine the efficiency of rescue strategies in a large-scale evacuation, i.e., whether the rescue strategies are able to effectively maintain or improve evacuation efficiency. Finally, through some rational hypotheses for the value of evacuation risk, a three-dimensional distribution of the evacuation risk is established to illustrate the risk axle of crowd density. This work aims to make some macro, but original, analysis of the risk of large-scale crowd evacuation from the perspective of the efficiency of rescue strategies. © 2012 Society for Risk Analysis.
[The condition of crowding and spacing. Measuring or estimation?].
Reukers, H A; Kuijpers-Jagtman, A M; van 't Hof, M A
1994-10-01
Two methods for the assessment of the amount of crowding or spacing (measuring and assessment by eye) are compared. Both methods are comparable and reproducible. Assessment by eye has the practical advantage that it takes considerably less time.
State Variability in Children's Medicaid Crowd-Out Estimates
U.S. Department of Health & Human Services — Health insurance crowd-out occurs when individuals enrolled in a public health insurance plan would have enrolled in a private plan but for the public option. The...
Contingent kernel density estimation.
Directory of Open Access Journals (Sweden)
Scott Fortmann-Roe
Kernel density estimation is a widely used method for estimating a distribution based on a sample of points drawn from that distribution. Generally, in practice some form of error contaminates the sample of observed points. Such error can be the result of imprecise measurements or observation bias. Often this error is negligible and may be disregarded in analysis. In cases where the error is non-negligible, estimation methods should be adjusted to reduce resulting bias. Several modifications of kernel density estimation have been developed to address specific forms of errors. One form of error that has not yet been addressed is the case where observations are nominally placed at the centers of areas from which the points are assumed to have been drawn, where these areas are of varying sizes. In this scenario, the bias arises because the size of the error can vary among points and some subset of points can be known to have smaller error than another subset or the form of the error may change among points. This paper proposes a "contingent kernel density estimation" technique to address this form of error. This new technique adjusts the standard kernel on a point-by-point basis in an adaptive response to changing structure and magnitude of error. In this paper, equations for our contingent kernel technique are derived, the technique is validated using numerical simulations, and an example using the geographic locations of social networking users is worked to demonstrate the utility of the method.
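The point-by-point adjustment can be illustrated with a simple variant of Gaussian KDE in which each observation carries its own bandwidth. The paper's contingent-kernel construction is more specific, so treat this as a hedged sketch of the idea only: points snapped to larger areas get wider kernels, and the mixture remains a proper density.

```python
import numpy as np

def adaptive_kde(x_grid, samples, bandwidths):
    """Gaussian KDE where each sample carries its own bandwidth h_i.

    In the contingent-kernel spirit, h_i can encode the size of the
    area each observation was snapped to (larger area -> wider kernel).
    """
    samples = np.asarray(samples, float)[:, None]   # shape (n, 1)
    h = np.asarray(bandwidths, float)[:, None]      # shape (n, 1)
    z = (x_grid[None, :] - samples) / h             # standardized offsets
    k = np.exp(-0.5 * z**2) / (h * np.sqrt(2.0 * np.pi))
    return k.mean(axis=0)                           # average of n kernels

grid = np.linspace(-6.0, 16.0, 2201)
samples = [0.0, 0.5, 6.0]
h = [0.5, 0.5, 2.0]        # the last point is known only to a coarse area
dens = adaptive_kde(grid, samples, h)
mass = dens.sum() * (grid[1] - grid[0])   # Riemann-sum check of total mass
print(round(mass, 2))  # → 1.0 (a proper density)
```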
Mechanisms for perception of numerosity or texture-density are governed by crowding-like effects.
Anobile, Giovanni; Turi, Marco; Cicchini, Guido Marco; Burr, David C
2015-01-01
We have recently provided evidence that the perception of number and texture density is mediated by two independent mechanisms: numerosity mechanisms at relatively low numbers, obeying Weber's law, and texture-density mechanisms at higher numerosities, following a square root law. In this study we investigated whether the switch between the two mechanisms depends on the capacity to segregate individual dots, and therefore follows similar laws to those governing visual crowding. We measured numerosity discrimination for a wide range of numerosities at three eccentricities. We found that the point where the numerosity regime (Weber's law) gave way to the density regime (square root law) depended on eccentricity. In central vision, the regime changed at 2.3 dots/°2, while at 15° eccentricity, it changed at 0.5 dots/°2, three times less dense. As a consequence, thresholds for low numerosities increased with eccentricity, while at higher numerosities thresholds remained constant. We further showed that like crowding, the regime change was independent of dot size, depending on distance between dot centers, not distance between dot edges or ink coverage. Performance was not affected by stimulus contrast or blur, indicating that the transition does not depend on low-level stimulus properties. Our results reinforce the notion that numerosity and texture are mediated by two distinct processes, depending on whether the individual elements are perceptually segregable. Which mechanism is engaged follows laws that determine crowding.
Optimal cytoplasmatic density and flux balance model under macromolecular crowding effects.
Vazquez, Alexei
2010-05-21
Macromolecules occupy between 34% and 44% of the cell cytoplasm, about half the maximum packing density of spheres in three dimensions. Yet, there is no clear understanding of what is special about this value. To address this fundamental question we investigate the effect of macromolecular crowding on cell metabolism. We develop a cell scale flux balance model capturing the main features of cell metabolism at different nutrient uptakes and macromolecular densities. Using this model we show there are two metabolic regimes at low and high nutrient uptakes. The latter regime is characterized by an optimal cytoplasmatic density where the increase of reaction rates by confinement and the decrease by diffusion slow-down balance. More importantly, the predicted optimal density is in the range of the experimentally determined density of Escherichia coli.
Beyond Crowd Judgments: Data-driven Estimation of Market Value in Association Football
DEFF Research Database (Denmark)
Müller, Oliver; Simons, Alexander; Weinmann, Markus
2017-01-01
Association football is a popular sport, but it is also a big business. From a managerial perspective, the most important decisions that team managers make concern player transfers, so issues related to player valuation, especially the determination of transfer fees and market values, are of major concern. Market values can be understood as estimates of transfer fees, that is, prices that could be paid for a player on the football market, so they play an important role in transfer negotiations. These values have traditionally been estimated by football experts, but crowdsourcing has emerged as an increasingly popular approach to estimating market value. While researchers have found high correlations between crowdsourced market values and actual transfer fees, the process behind crowd judgments is not transparent, crowd estimates are not replicable, and they are updated infrequently because they require…
Emergent Structural Mechanisms for High-Density Collective Motion Inspired by Human Crowds
Bottinelli, Arianna; Sumpter, David T. J.; Silverberg, Jesse L.
2016-11-01
Collective motion of large human crowds often depends on their density. In extreme cases like heavy metal concerts and black Friday sales events, motion is dominated by physical interactions instead of conventional social norms. Here, we study an active matter model inspired by situations when large groups of people gather at a point of common interest. Our analysis takes an approach developed for jammed granular media and identifies Goldstone modes, soft spots, and stochastic resonance as structurally driven mechanisms for potentially dangerous emergent collective motion.
Comparison of density estimators. [Estimation of probability density functions
Energy Technology Data Exchange (ETDEWEB)
Kao, S.; Monahan, J.F.
1977-09-01
Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed and some simulation results are reported. The object is to compare the performance of the various methods in small samples and their sensitivity to changes in their parameters, and to attempt to discover at what point a sample is so small that density estimation is no longer worthwhile.
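One of the compared methods, the nearest neighbor estimator, is easy to sketch. This toy version (an illustration, not taken from the report) uses the classic 1-D form: the density at x is estimated as k divided by the volume of the smallest interval around x containing the k nearest samples, i.e. f̂(x) = k / (2 n R_k(x)) with R_k the distance to the k-th nearest sample. The evenly spaced "sample" makes the check deterministic.

```python
import numpy as np

def knn_density(x, samples, k=10):
    """1-D k-nearest-neighbour density estimate: f(x) = k / (n * 2 * R_k),
    where R_k is the distance from x to its k-th nearest sample."""
    samples = np.asarray(samples, float)
    r = np.sort(np.abs(samples - x))[k - 1]   # k-th nearest distance
    return k / (len(samples) * 2.0 * r)

# evenly spaced points on [0, 1]: empirical version of the uniform density
data = (np.arange(2000) + 0.5) / 2000
est = knn_density(0.5, data, k=50)
print(round(est, 2))  # → 1.02, close to the true uniform density 1.0
```

The characteristic small-sample bias of the estimator (here 50/49 instead of exactly 1) is the kind of behavior the report's comparisons examine.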
Directory of Open Access Journals (Sweden)
André Gergs
Population size is often regulated by negative feedback between population density and individual fitness. At high population densities, animals run into double trouble: they might concurrently suffer from overexploitation of resources and also from negative interference among individuals regardless of resource availability, referred to as crowding. Animals are able to adapt to resource shortages by exhibiting a repertoire of life history and physiological plasticities. In addition to resource-related plasticity, crowding might lead to reduced fitness, with consequences for individual life history. We explored how different mechanisms behind resource-related plasticity and crowding-related fitness act independently or together, using the water flea Daphnia magna as a case study. For testing hypotheses related to mechanisms of plasticity and crowding stress across different biological levels, we used an individual-based population model that is based on dynamic energy budget theory. Each of the hypotheses, represented by a sub-model, is based on specific assumptions on how the uptake and allocation of energy are altered under conditions of resource shortage or crowding. For cross-level testing of different hypotheses, we explored how well the sub-models fit individual level data and also how well they predict population dynamics under different conditions of resource availability. Only operating resource-related and crowding-related hypotheses together enabled accurate model predictions of D. magna population dynamics and size structure. Whereas this study showed that various mechanisms might play a role in the negative feedback between population density and individual life history, it also indicated that different density levels might instigate the onset of the different mechanisms. This study provides an example of how the integration of dynamic energy budget theory and individual-based modelling can facilitate the exploration of mechanisms
Parallel Multiscale Autoregressive Density Estimation
Reed, Scott; Oord, Aäron van den; Kalchbrenner, Nal; Colmenarejo, Sergio Gómez; Wang, Ziyu; Belov, Dan; de Freitas, Nando
2017-01-01
PixelCNN achieves state-of-the-art results in density estimation for natural images. Although training is fast, inference is costly, requiring one network evaluation per pixel; O(N) for N pixels. This can be sped up by caching activations, but still involves generating each pixel sequentially. In this work, we propose a parallelized PixelCNN that allows more efficient inference by modeling certain pixel groups as conditionally independent. Our new PixelCNN model achieves competitive density e...
Assembling GHERG: Could "academic crowd-sourcing" address gaps in global health estimates?
Rudan, Igor; Campbell, Harry; Marušić, Ana; Sridhar, Devi; Nair, Harish; Adeloye, Davies; Theodoratou, Evropi; Chan, Kit Yee
2015-06-01
In recent months, the World Health Organization (WHO), independent academic researchers, the Lancet and PLoS Medicine journals worked together to improve reporting of population health estimates. The new guidelines for accurate and transparent health estimates reporting (likely to be named GATHER), which are eagerly awaited, represent a helpful move that should benefit the field of global health metrics. Building on this progress and drawing from a tradition of Child Health Epidemiology Reference Group (CHERG)'s successful work model, we would like to propose a new initiative - "Global Health Epidemiology Reference Group" (GHERG). We see GHERG as an informal and entirely voluntary international collaboration of academic groups who are willing to contribute to improving disease burden estimates and respect the principles of the new guidelines - a form of "academic crowd-sourcing". The main focus of GHERG will be to identify the "gap areas" where not much information is available and/or where there is a lot of uncertainty present about the accuracy of the existing estimates. This approach should serve to complement the existing WHO and IHME estimates and to represent added value to both efforts.
Tracking individuals in surveillance video of a high-density crowd
Hu, N.; Bouma, H.; Worring, M.
2012-01-01
Video cameras are widely used for monitoring public areas, such as train stations, airports and shopping centers. When crowds are dense, automatically tracking individuals becomes a challenging task. We propose a new tracker which employs a particle filter tracking framework, where the state
Crowd-Sourced Calibration: The GEDI Strategy for Empirical Biomass Estimation Using Spaceborne Lidar
Dubayah, R.
2015-12-01
The central task in estimating forest biomass from spaceborne sensors is the development of calibration equations that relate observed forest structure to biomass at a variety of spatial scales. Empirical methods generally rely on statistical estimation or machine learning techniques where field-based estimates of biomass at the plot level are associated with post-launch observations of variables such as canopy height and cover. For global-scale mapping the process is complex and leads to a number of questions: How many calibrations are required to capture non-stationarity in the relationships? Where does one calibration begin and another end? Should calibrations be conditioned by biome? Vegetation type? Land-use? Post-launch calibrations lead to further complications, such as the requirement to have sufficient field plot data underneath potentially sparse satellite observations, spatial and temporal mismatches in scale between field plots and pixels, and geolocation uncertainty, both in the plots and the satellite data. The Global Ecosystem Dynamics Investigation (GEDI) is under development by NASA to estimate forest biomass. GEDI will deploy a multi-beam lidar on the International Space Station and provide billions of observations of forest structure per year. Because GEDI uses relatively small footprints, about 25 m diameter, post-launch calibration is exceptionally problematic for the reasons listed earlier. Instead, GEDI will use a kind of "crowd-sourced" calibration strategy where existing lidar observations and the corresponding plot biomass will be assembled from data contributed by the science community. Through a process of continuous updating, calibrations will be refined as more data is ingested. This talk will focus on the GEDI pre-launch calibration strategy and present initial progress on its development, and how it forms the basis for meeting mission biomass requirements.
Feliciani, Claudio; Nishinari, Katsuhiro
2016-06-01
In this article we present an improved version of the Cellular Automata floor field model making use of a sub-mesh system to increase the maximum density allowed during simulation and reproduce phenomena observed in dense crowds. In order to calibrate the model's parameters and to validate it we used data obtained from an empirical observation of bidirectional pedestrian flow. A good agreement was found between numerical simulation and experimental data and, in particular, the double outflow peak observed during the formation of deadlocks could be reproduced in numerical simulations, thus allowing the analysis of deadlock formation and dissolution. Finally, we used the developed high density model to compute the flow-ratio dependent fundamental diagram of bidirectional flow, demonstrating the instability of balanced flow and predicting the bidirectional flow behavior at very high densities. The model we presented here can be used to prevent dense crowd accidents in the future and to investigate the dynamics of the accidents which already occurred in the past. Additionally, fields such as granular and active matter physics may benefit from the developed framework to study different collective phenomena.
Varying kernel density estimation on ℝ+
Mnatsakanov, Robert; Sarkisian, Khachatur
2015-01-01
In this article a new nonparametric density estimator based on the sequence of asymmetric kernels is proposed. This method is natural when estimating an unknown density function of a positive random variable. The rates of Mean Squared Error, Mean Integrated Squared Error, and the L1-consistency are investigated. Simulation studies are conducted to compare a new estimator and its modified version with traditional kernel density construction. PMID:26740729
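A well-known family of asymmetric kernels for densities on [0, ∞) (not necessarily the one used in this article) is the gamma kernel: at each evaluation point x the kernel is a gamma density with shape x/b + 1 and scale b, evaluated at the samples, so no probability mass is ever placed below zero. A hedged sketch on deterministic Exp(1) quantiles:

```python
import numpy as np
from math import lgamma

def gamma_pdf(t, shape, scale):
    """Gamma density, computed in log-space for numerical stability."""
    t = np.asarray(t, float)
    logp = ((shape - 1.0) * np.log(t) - t / scale
            - shape * np.log(scale) - lgamma(shape))
    return np.exp(logp)

def gamma_kernel_kde(x, samples, b=0.1):
    """Asymmetric-kernel estimator on [0, inf): at each point x, average
    a gamma kernel (shape x/b + 1, scale b) evaluated at the samples."""
    return float(np.mean(gamma_pdf(samples, x / b + 1.0, b)))

# deterministic Exp(1) "sample" built from evenly spaced quantiles
data = -np.log((np.arange(4000) + 0.5) / 4000)
est = gamma_kernel_kde(1.0, data, b=0.05)
print(round(est, 3))   # close to the true Exp(1) density at 1, e^-1 ≈ 0.368
```

The small downward bias visible here (of order b) is exactly the kind of boundary-respecting trade-off such estimators make relative to symmetric kernels.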
Density estimation from local structure
CSIR Research Space (South Africa)
Van der Walt, Christiaan M
2009-11-01
…Mixture Model (GMM) density function of the data, and the log-likelihood scores are compared to the scores of a GMM trained with the expectation maximization (EM) algorithm on 5 real-world classification datasets (from the UCI collection). They show…
Density Estimation Trees in High Energy Physics
Anderlini, Lucio
2015-01-01
Density Estimation Trees can play an important role in exploratory data analysis for multidimensional, multi-modal data models of large samples. I briefly discuss the algorithm, a self-optimization technique based on kernel density estimation, and some applications in High Energy Physics.
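A density estimation tree can be sketched as a recursive partition whose leaves store piecewise-constant densities. Real implementations choose splits by optimizing an error criterion with cross-validated pruning; this simplified toy (not the paper's algorithm) splits at the median instead, so every leaf carries equal probability mass:

```python
import numpy as np
from statistics import NormalDist

def build(samples, lo, hi, depth, n_total):
    """Recursive median split; each leaf stores a constant density:
    (fraction of all samples in the leaf) / (leaf width)."""
    inside = samples[(samples >= lo) & (samples < hi)]
    if depth == 0 or len(inside) < 40:
        return ("leaf", lo, hi, len(inside) / n_total / (hi - lo))
    mid = float(np.median(inside))
    return ("node", mid,
            build(inside, lo, mid, depth - 1, n_total),
            build(inside, mid, hi, depth - 1, n_total))

def evaluate(tree, x):
    """Descend to the leaf containing x and return its density."""
    while tree[0] == "node":
        tree = tree[2] if x < tree[1] else tree[3]
    return tree[3]

# deterministic standard-normal "sample" built from evenly spaced quantiles
n = 4096
data = np.array([NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)])
tree = build(data, -4.0, 4.0, depth=6, n_total=n)
print(round(evaluate(tree, 0.0), 2))  # close to the true N(0,1) density 0.399
```

Because the partition adapts to the data, leaves are narrow where the density is high and wide in the tails, which is the property that makes such trees useful for multi-modal exploratory analysis.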
Parametric Return Density Estimation for Reinforcement Learning
Morimura, Tetsuro; Kashima, Hisashi; Hachiya, Hirotaka; Tanaka, Toshiyuki
2012-01-01
Most conventional Reinforcement Learning (RL) algorithms aim to optimize decision-making rules in terms of the expected returns. However, especially for risk management purposes, other risk-sensitive criteria such as the value-at-risk or the expected shortfall are sometimes preferred in real applications. Here, we describe a parametric method for estimating the density of the returns, which allows us to handle various criteria in a unified manner. We first extend the Bellman equation for the conditional expected return to cover a conditional probability density of the returns. Then we derive an extension of the TD-learning algorithm for estimating the return densities in an unknown environment. As test instances, several parametric density estimation algorithms are presented for the Gaussian, Laplace, and skewed Laplace distributions. We show that these algorithms lead to risk-sensitive as well as robust RL paradigms through numerical experiments.
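The core move, extending TD-learning from expected returns to return densities, can be sketched for the Gaussian case. This is a hedged toy (a two-state chain with noisy rewards, not the paper's algorithm): each state keeps a Gaussian model of its return via a running mean and second moment, and the TD target for a state mixes the observed reward with a draw from the successor state's current return model.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma, alpha = 0.9, 0.02

# Gaussian return model per state: running mean and second moment
m = np.zeros(2)          # E[G | state]
s = np.ones(2)           # E[G^2 | state]

for _ in range(40000):
    # two-state chain: state 0 -> state 1 -> terminal, with noisy rewards
    r0 = 1.0 + 0.5 * rng.standard_normal()
    r1 = 2.0 + 0.5 * rng.standard_normal()
    # distributional TD target for state 0: reward plus a draw from the
    # current Gaussian return model of the successor state
    sd1 = np.sqrt(max(s[1] - m[1] ** 2, 1e-8))
    t0 = r0 + gamma * (m[1] + sd1 * rng.standard_normal())
    t1 = r1                                  # state 1 is followed by terminal
    target = np.array([t0, t1])
    m += alpha * (target - m)                # TD update of the mean...
    s += alpha * (target ** 2 - s)           # ...and of the second moment

var = s - m ** 2
print(np.round(m, 1))    # means near [2.8, 2.0]; state 0 also has larger variance
```

The learned variances then support risk-sensitive criteria (value-at-risk, expected shortfall) that the expected return alone cannot express; the paper also covers Laplace and skewed Laplace parameterizations.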
A lack of crowding? Body size does not decrease with density for two behavior-manipulating parasites
Weinersmith, KL; Warinner, Chloe B.; Tan, Virgina; Harris, David J.; Mora, Adrienne B.; Kuris, Armand M.; Lafferty, Kevin D.; Hechinger, Ryan F.
2014-01-01
For trophically transmitted parasites that manipulate the phenotype of their hosts, whether the parasites do or do not experience resource competition depends on such factors as the size of the parasites relative to their hosts, the intensity of infection, the extent to which parasites share the cost of defending against the host’s immune system or manipulating their host, and the extent to which parasites share transmission goals. Despite theoretical expectations for situations in which either no, or positive, or negative density-dependence should be observed, most studies document only negative density-dependence for trophically transmitted parasites. However, this trend may be an artifact of most studies having focused on systems in which parasites are large relative to their hosts. Yet, systems are common where parasites are small relative to their hosts, and these trophically transmitted parasites may be less likely to experience resource limitation. We looked for signs of density-dependence in Euhaplorchis californiensis (EUHA) and Renicola buchanani (RENB), two manipulative trematode parasites infecting wild-caught California killifish (Fundulus parvipinnis). These parasites are small relative to killifish (suggesting resources are not limiting), and are associated with changes in killifish behavior that are dependent on parasite-intensity and that increase predation rates by the parasites’ shared final host (indicating the possibility for cost sharing). We did not observe negative density-dependence in either species, indicating that resources are not limiting. In fact, observed patterns indicate possible mild positive density-dependence for EUHA. Although experimental confirmation is required, our findings suggest that some behavior-manipulating parasites suffer no reduction in size, and may even benefit when "crowded" by conspecifics.
Estimating stellar mean density through seismic inversions
Reese, D R; Goupil, M J; Thompson, M J; Deheuvels, S
2012-01-01
Determining the mass of stars is crucial both to improving stellar evolution theory and to characterising exoplanetary systems. Asteroseismology offers a promising way to estimate stellar mean density. When combined with accurate radii determinations, such as is expected from GAIA, this yields accurate stellar masses. The main difficulty is finding the best way to extract the mean density from a set of observed frequencies. We seek to establish a new method for estimating stellar mean density, which combines the simplicity of a scaling law while providing the accuracy of an inversion technique. We provide a framework in which to construct and evaluate kernel-based linear inversions which yield directly the mean density of a star. We then describe three different inversion techniques (SOLA and two scaling laws) and apply them to the sun, several test cases and three stars. The SOLA approach and the scaling law based on the surface correcting technique described by Kjeldsen et al. (2008) yield comparable result...
Bayesian mixture models for spectral density estimation
Cadonna, Annalisa
2017-01-01
We introduce a novel Bayesian modeling approach to spectral density estimation for multiple time series. Considering first the case of non-stationary time series, the log-periodogram of each series is modeled as a mixture of Gaussian distributions with frequency-dependent weights and mean functions. The implied model for the log-spectral density is a mixture of linear mean functions with frequency-dependent weights. The mixture weights are built through successive differences of a logit-normal di...
Particle Size Estimation Based on Edge Density
Institute of Scientific and Technical Information of China (English)
WANG Wei-xing
2005-01-01
Given image sequences of closely packed particles, the underlying aim is to estimate diameters without explicit segmentation. In a way, this is similar to the task of counting objects without directly counting them. Such calculations may, for example, be useful for fast estimation of particle size in different application areas. The topic is that of estimating the average size (= average diameter) of packed particles from formulas involving edge density, where the edges are obtained from moment-based thresholding. An average shape factor is involved in the calculations, obtained for some frames from crude partial segmentation. Measurement results from about 80 frames have been analyzed.
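The edge-density idea above can be made concrete with a small sketch. For densely packed, roughly circular particles, the total edge (perimeter) length per unit image area scales inversely with the particle diameter, so diameter ≈ shape factor / edge density. Everything below (the function names, the shape-factor value of 4.0, and the toy frame) is an illustrative assumption, not the paper's implementation:

```python
def edge_density(binary_edges):
    """Fraction of pixels marked as edges in a binary edge map
    (a list of rows of 0/1 values)."""
    total = sum(len(row) for row in binary_edges)
    edges = sum(sum(row) for row in binary_edges)
    return edges / total

def mean_diameter(binary_edges, shape_factor=4.0):
    """Estimate average particle diameter (in pixels) from edge density.

    shape_factor is an illustrative constant; the paper derives a shape
    factor per frame from a crude partial segmentation."""
    rho = edge_density(binary_edges)
    return shape_factor / rho if rho > 0 else float("inf")

# Toy frame: a single ring-like 'particle' outline in a 6x6 image.
frame = [
    [0, 1, 1, 1, 1, 0],
    [1, 0, 0, 0, 0, 1],
    [1, 0, 0, 0, 0, 1],
    [1, 0, 0, 0, 0, 1],
    [1, 0, 0, 0, 0, 1],
    [0, 1, 1, 1, 1, 0],
]
```

On real frames the edge map would come from the moment-based thresholding the abstract mentions; here the binary map is supplied directly.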
Anisotropic Density Estimation in Global Illumination
DEFF Research Database (Denmark)
Schjøth, Lars
2009-01-01
Density estimation employed in multi-pass global illumination algorithms gives rise to a trade-off problem between bias and noise. The problem is most evident as blurring of strong illumination features. This thesis addresses the problem, presenting four methods that reduce both noise and bias in estimates. Good results are obtained by the use of anisotropic filtering. Two methods handle the most common case: filtering illumination reflected from object surfaces. One method extends filtering to the temporal domain and one performs filtering on illumination from participating media...
Bird population density estimated from acoustic signals
Dawson, D.K.; Efford, M.G.
2009-01-01
Many animal species are detected primarily by sound. Although songs, calls and other sounds are often used for population assessment, as in bird point counts and hydrophone surveys of cetaceans, there are few rigorous methods for estimating population density from acoustic data. 2. The problem has several parts: distinguishing individuals, adjusting for individuals that are missed, and adjusting for the area sampled. Spatially explicit capture-recapture (SECR) is a statistical methodology that addresses jointly the second and third parts of the problem. We have extended SECR to use uncalibrated information from acoustic signals on the distance to each source. 3. We applied this extension of SECR to data from an acoustic survey of ovenbird Seiurus aurocapilla density in an eastern US deciduous forest with multiple four-microphone arrays. We modelled average power from spectrograms of ovenbird songs measured within a window of 0.7 s duration and frequencies between 4200 and 5200 Hz. 4. The resulting estimates of the density of singing males (0.19 ha⁻¹, SE 0.03 ha⁻¹) were consistent with estimates of the adult male population density from mist-netting (0.36 ha⁻¹, SE 0.12 ha⁻¹). The fitted model predicts sound attenuation of 0.11 dB m⁻¹ (SE 0.01 dB m⁻¹) in excess of losses from spherical spreading. 5. Synthesis and applications. Our method for estimating animal population density from acoustic signals fills a gap in the census methods available for visually cryptic but vocal taxa, including many species of bird and cetacean. The necessary equipment is simple and readily available; as few as two microphones may provide adequate estimates, given spatial replication. The method requires that individuals detected at the same place are acoustically distinguishable and all individuals vocalize during the recording interval, or that the per capita rate of vocalization is known. We believe these requirements can be met, with suitable field methods, for a significant
Institute of Scientific and Technical Information of China (English)
赵星博; 王双唯; 宫姗; 王东芳; 梁士利
2016-01-01
Human body temperature and exhaled air strongly influence the temperature and humidity in enclosed spaces. To improve the effectiveness of crowd density monitoring, we use temperature and humidity sensors to collect temperature and humidity information in an enclosed space. The collected information is fused to calibrate crowd density levels for the space, and the actual monitoring data are compared with the calibration data. Through curve fitting, we build a linear relationship between the number of people and the norm of the temperature-humidity vector, which effectively estimates the current crowd density in the space. The monitoring data are displayed on a LabVIEW supervisory interface; by setting crowd density level thresholds, the system provides both flashing-light and audible alarms. Tests show that the system design is reasonable and practical, with good stability and measurement accuracy, and that it is suitable for crowd density monitoring in confined spaces.
Variable kernel density estimation in high-dimensional feature spaces
CSIR Research Space (South Africa)
Van der Walt, Christiaan M
2017-02-01
Full Text Available Estimating the joint probability density function of a dataset is a central task in many machine learning applications. In this work we address the fundamental problem of kernel bandwidth estimation for variable kernel density estimation in high...
Wallace, Julian M; Tjan, Bosco S
2011-05-25
Crowding occurs when stimuli in the peripheral field become harder to identify when flanked by other items. This phenomenon has been demonstrated extensively with simple patterns (e.g., Gabors and letters). Here, we characterize crowding for everyday objects. We presented three-item arrays of objects and letters, arranged radially and tangentially in the lower visual field. Observers identified the central target, and we measured contrast energy thresholds as a function of target-to-flanker spacing. Object crowding was similar to letter crowding in spatial extent but was much weaker. The average elevation in threshold contrast energy was on the order of 1 log unit for objects as compared to 2 log units for letters and silhouette objects. Furthermore, we examined whether the exterior and interior features of an object are differentially affected by crowding. We used a circular aperture to present or exclude the object interior. Critical spacings for these aperture and "donut" objects were similar to those of intact objects. Taken together, these findings suggest that crowding between letters and objects is essentially due to the same mechanism, which affects equally the interior and exterior features of an object. However, for objects defined with varying shades of gray, it is much easier to overcome crowding by increasing contrast.
Multivariate density estimation theory, practice, and visualization
Scott, David W
2015-01-01
David W. Scott, PhD, is Noah Harding Professor in the Department of Statistics at Rice University. The author of over 100 published articles, papers, and book chapters, Dr. Scott is also a Fellow of the American Statistical Association (ASA) and of the Institute of Mathematical Statistics. He is a recipient of the ASA Founder's Award and the Army Wilks Award. His research interests include computational statistics, data visualization, and density estimation. Dr. Scott is also Coeditor of Wiley Interdisciplinary Reviews: Computational Statistics and a previous Editor of the Journal of Computational and
Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation.
Yamane, Ikko; Sasaki, Hiroaki; Sugiyama, Masashi
2016-07-01
Log-density gradient estimation is a fundamental statistical problem with various practical applications such as clustering and measuring non-Gaussianity. A naive two-step approach of first estimating the density and then taking its log gradient is unreliable because an accurate density estimate does not necessarily lead to an accurate log-density gradient estimate. To cope with this problem, a method to directly estimate the log-density gradient without density estimation has been explored and demonstrated to work much better than the two-step method. The objective of this letter is to improve the performance of this direct method in multidimensional cases. Our idea is to regard the problem of log-density gradient estimation in each dimension as a task and apply regularized multitask learning to the direct log-density gradient estimator. We experimentally demonstrate the usefulness of the proposed multitask method in log-density gradient estimation and mode-seeking clustering.
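To make the problem concrete, here is a sketch of the naive two-step baseline the letter argues against: fit a Gaussian kernel density estimate, then take its log gradient as p'(x)/p(x). This illustrates the unreliable baseline, not the authors' direct estimator; the function names and bandwidth are assumptions:

```python
import math

def kde(x, data, h):
    """Gaussian kernel density estimate at x (1-D)."""
    c = len(data) * h * math.sqrt(2.0 * math.pi)
    return sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data) / c

def kde_grad(x, data, h):
    """Derivative of the Gaussian KDE at x."""
    c = len(data) * h * math.sqrt(2.0 * math.pi)
    return sum(-(x - d) / h ** 2 * math.exp(-0.5 * ((x - d) / h) ** 2)
               for d in data) / c

def log_density_gradient(x, data, h):
    # two-step estimate: grad log p(x) = p'(x) / p(x)
    return kde_grad(x, data, h) / kde(x, data, h)
```

The instability the abstract describes comes from the division: small errors in the density estimate in low-density regions are amplified in the ratio, which is why direct estimation of the log gradient works better.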
A field study on real-time self-reported emotions in crowds
Li, J.; Erkin, Z.; De Ridder, H.; Vermeeren, A.P.O.S.
2013-01-01
Crowd experience is inevitable in daily life. Crowd managers need tools to accurately estimate the psychological aspects of crowds, an important one being crowd emotion. In this study, we explore the feasibility of obtaining a real-time, dynamic map of crowd emotions through self-reporting by crowd
Density Estimations in Laboratory Debris Flow Experiments
Queiroz de Oliveira, Gustavo; Kulisch, Helmut; Malcherek, Andreas; Fischer, Jan-Thomas; Pudasaini, Shiva P.
2016-04-01
Bulk density and its variation is an important physical quantity for estimating the solid-liquid fractions in two-phase debris flows. Here we present mass and flow depth measurements for experiments performed in a large-scale laboratory setup. Once the mixture is released and moves down the inclined channel, the measurements allow us to determine the bulk density evolution throughout the debris flow. Flow depths are determined by ultrasonic pulse reflection, and the mass is measured with a total normal force sensor. The data were obtained at 50 Hz. The initial two-phase material was composed of 350 kg of debris with a water content of 40%. A very fine pebble with a mean particle diameter of 3 mm, a particle density of 2760 kg/m³ and a bulk density of 1400 kg/m³ in dry condition was chosen as the solid material. Measurements reveal that the debris bulk density remains high from the head to the middle of the debris body, whereas it drops substantially at the tail. This indicates lower water content at the tail compared to the head and the middle portion of the debris body, meaning that the solid and fluid fractions vary strongly and non-linearly along the flow path, from the head to the tail of the debris mass. Importantly, this spatial-temporal density variation plays a crucial role in determining the impact forces associated with the dynamics of the flow. Our setup allows for investigating different two-phase material compositions, including large fluid fractions, at high resolution. The considered experimental setup may enable us to transfer the observed phenomena to natural large-scale events. Furthermore, the measurement data allow evaluation of the results of numerical two-phase mass flow simulations. These experiments are part of the avaflow.org project, which intends to develop a GIS-based open source computational tool to describe a wide spectrum of rapid geophysical mass flows, including avalanches and real two-phase debris flows down complex natural
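A hedged sketch of how bulk density can be recovered from the two sensor streams described above, assuming the mass is obtained from the total normal force on the inclined channel (mass = F_N / (g·cosθ)); the function name, argument list, and this mass-recovery formula are illustrative assumptions, not the authors' data pipeline:

```python
import math

def bulk_density(normal_force_N, flow_depth_m, sensor_area_m2,
                 incline_deg, g=9.81):
    """Bulk density (kg/m³) from a total normal force reading and an
    ultrasonic flow-depth reading on an inclined channel.

    Assumed relation: mass = F_N / (g * cos(theta)),
                      density = mass / (sensor_area * flow_depth)."""
    theta = math.radians(incline_deg)
    mass = normal_force_N / (g * math.cos(theta))
    return mass / (sensor_area_m2 * flow_depth_m)

# Illustrative reading: 70 kg of mixture over 1 m² at 5 cm depth,
# on a horizontal channel section.
rho = bulk_density(normal_force_N=9.81 * 70.0, flow_depth_m=0.05,
                   sensor_area_m2=1.0, incline_deg=0.0)
```

Applied at 50 Hz, such a formula yields the density evolution from head to tail that the abstract describes.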
Routing in Dense Human Crowds Using Smartphone Movement Data and Optical Aerial Imagery
Directory of Open Access Journals (Sweden)
Florian Hillen
2015-06-01
Full Text Available In this paper, we propose a navigation approach for smartphones that enables visitors of major events to avoid crowded areas or narrow streets and to navigate out of dense crowds quickly. Two types of sensor data are integrated. Real-time optical images acquired and transmitted by an airborne camera system are used to compute an estimation of a crowd density map. For this purpose, a patch-based approach with a Gabor filter bank for texture classification, in combination with an interest point detector and a smoothing function, is applied. Furthermore, the crowd density is estimated based on the location and movement speed of in situ smartphone measurements. This information allows for the enhancement of the overall crowd density layer. The composed density information is input to a least-cost routing workflow. Two possible use cases are presented, namely (i) an emergency application and (ii) a basic routing application. A prototypical implementation of the system is conducted as proof of concept. Our approach is capable of increasing the security level for major events. Visitors are able to avoid dense crowds by routing around them, while security and rescue forces are able to find the fastest way into the crowd.
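The Gabor filter bank used here for texture classification (as in the airborne crowd density paper this work builds on) can be sketched as follows; the kernel size, the σ = λ/2 rule, and the particular scales and orientations are illustrative assumptions, not the authors' parameters:

```python
import math

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a Gabor kernel: a Gaussian envelope multiplied by a
    cosine carrier oriented at angle theta (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates so the carrier runs along theta
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + yr * yr) / (2.0 * sigma * sigma))
            row.append(envelope * math.cos(2.0 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel

def filter_bank(wavelengths, orientations, size=11):
    """One kernel per (scale, orientation) pair."""
    return [gabor_kernel(size, lam, th, sigma=lam / 2.0)
            for lam in wavelengths for th in orientations]

bank = filter_bank([4, 8], [0.0, math.pi / 4, math.pi / 2, 3 * math.pi / 4])
```

Texture features for each image patch would then come from convolving the patch with every kernel in the bank and summarizing the responses (e.g., mean energy) before classification.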
Kernel density estimation using graphical processing unit
Sunarko, Su'ud, Zaki
2015-09-01
Kernel density estimation for particles distributed over a 2-dimensional space is calculated using a single graphical processing unit (GTX 660 Ti GPU) and the CUDA-C language. Parallel calculations are done for particles having a bivariate normal distribution by assigning the calculations for equally-spaced node points to each scalar processor in the GPU. The numbers of particles, blocks and threads are varied to identify a favorable configuration. Comparisons are obtained by performing the same calculation using 1, 2 and 4 processors on a 3.0 GHz CPU using MPICH 2.0 routines. Speedups attained with the GPU are in the range of 88 to 349 times compared to the multiprocessor CPU. Blocks of 128 threads are found to be the optimum configuration for this case.
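A serial reference version of the computation described above (each GPU scalar processor evaluates the kernel density estimate at one node point) might look like this in plain Python; the function name and the choice of an isotropic bivariate Gaussian kernel with bandwidth h are assumptions:

```python
import math

def gaussian2d_kernel_density(nodes, particles, h):
    """Evaluate a 2-D Gaussian kernel density estimate at each node
    point. The paper assigns one node per GPU scalar processor; here
    the outer loop simply runs serially."""
    norm = 1.0 / (len(particles) * 2.0 * math.pi * h * h)
    out = []
    for (nx, ny) in nodes:
        s = 0.0
        for (px, py) in particles:
            d2 = (nx - px) ** 2 + (ny - py) ** 2
            s += math.exp(-0.5 * d2 / (h * h))
        out.append(norm * s)
    return out
```

The outer loop over node points is embarrassingly parallel, which is exactly what makes the per-scalar-processor assignment in the abstract effective.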
Baum, Neil
2016-01-01
The Internet has contributed new words and slang to our daily vernacular. A few terms, such as tweeting, texting, sexting, blogging, and googling, have become common in most vocabularies and in many languages, and are now included in the dictionary. A new buzzword making the rounds in industry is crowd sourcing, which involves outsourcing an activity, task, or problem by sending it to people or groups outside a business or a practice. Crowd sourcing allows doctors and practices to tap the wisdom of many instead of relying only on the few members of their close-knit group. This article defines "crowd sourcing," offers examples, and explains how to get started with this approach that can increase your ability to finish a task or solve problems that you don't have the time or expertise to accomplish.
Levi, Dennis M; Song, Shuang; Pelli, Denis G
2007-10-26
We measure acuity, crowding, and reading in amblyopic observers to answer four questions. (1) Is reading with the amblyopic eye impaired because of larger required letter size (i.e., worse acuity) or larger required spacing (i.e., worse crowding)? The size or spacing required to read at top speed is called "critical". For each eye of seven amblyopic observers and the preferred eyes of two normal observers, we measure reading rate as a function of the center-to-center spacing of the letters in central and peripheral vision. From these results, we estimate the critical spacing for reading. We also measured traditional acuity for an isolated letter and the critical spacing for identifying a letter among other letters, which is the classic measure of crowding. For both normals and amblyopes, in both central and peripheral vision, we find that the critical spacing for reading equals the critical spacing for crowding. The identical critical spacings, and very different critical sizes, show that crowding, not acuity, limits reading. (2) Does amblyopia affect peripheral reading? No. We find that amblyopes read normally with their amblyopic eye except that abnormal crowding in the fovea prevents them from reading fine print. (3) Is the normal periphery a good model for the amblyopic fovea? No. Reading centrally, the amblyopic eye has an abnormally large critical spacing but reads all larger spacings at normal rates. This is unlike the normal periphery, in which both critical spacing and maximum reading rate are severely impaired relative to the normal fovea. (4) Can the uncrowded-span theory of reading rate explain amblyopic reading? Yes. The case of amblyopia shows that crowding limits reading solely by determining the uncrowded span: the number of characters that are not crowded. Characters are uncrowded if and only if their spacing is more than critical. The text spacing may be uniform, but the observer's critical spacing increases with distance from fixation, so the
Mammographic density estimation with automated volumetric breast density measurement.
Ko, Su Yeon; Kim, Eun-Kyung; Kim, Min Jung; Moon, Hee Jung
2014-01-01
To compare automated volumetric breast density measurement (VBDM) with radiologists' evaluations based on the Breast Imaging Reporting and Data System (BI-RADS), and to identify the factors associated with technical failure of VBDM. In this study, 1129 women aged 19-82 years who underwent mammography from December 2011 to January 2012 were included. Breast density evaluations by radiologists based on BI-RADS and by VBDM (Volpara Version 1.5.1) were compared. The agreement in interpreting breast density between radiologists and VBDM was determined based on four density grades (D1, D2, D3, and D4) and a binary classification of fatty (D1-2) vs. dense (D3-4) breast using kappa statistics. The association between technical failure of VBDM and patient age, total breast volume, fibroglandular tissue volume, history of partial mastectomy, the frequency of mass > 3 cm, and breast density was analyzed. The agreement between breast density evaluations by radiologists and VBDM was fair (k value = 0.26) when the four density grades (D1/D2/D3/D4) were used and moderate (k value = 0.47) for the binary classification (D1-2/D3-4). Twenty-seven women (2.4%) showed failure of VBDM. Small total breast volume, history of partial mastectomy, and high breast density were significantly associated with technical failure of VBDM (p = 0.001 to 0.015). There is fair or moderate agreement in breast density evaluation between radiologists and VBDM. Technical failure of VBDM may be related to small total breast volume, a history of partial mastectomy, and high breast density.
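The kappa statistics reported above measure chance-corrected agreement between the two density gradings. A minimal sketch of Cohen's kappa for two raters over the same cases (the function name and toy grade lists are assumptions):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa between two raters over the same cases,
    e.g. radiologist BI-RADS grades vs. VBDM grades."""
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    # observed agreement
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # chance agreement from each rater's marginal frequencies
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1.0 - p_e)
```

On conventional interpretation bands, the study's k = 0.26 falls in the "fair" range and k = 0.47 in the "moderate" range, matching the abstract's wording.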
Space and time in masking and crowding.
Lev, Maria; Polat, Uri
2015-01-01
Masking and crowding are major phenomena associated with contextual modulations, but the relationship between them remains unclear. We have recently shown that crowding is apparent in the fovea when the time available for processing is limited, pointing to the strong relationship between crowding in the spatial and temporal domains. Models of crowding emphasize the size (acuity) of the target and the spacing between the target and flankers as the main determinants that predict crowding. Our model, which is based on lateral interactions, posits that masking and crowding are related in the spatial and temporal domains at the fovea and periphery and that both can be explained by the increasing size of the human perceptive field (PF) with increasing eccentricity. We explored the relations between masking and crowding using letter identification and contrast detection by correlating the crowding effect with the estimated size of the PF and with masking under different spatiotemporal conditions. We found that there is a large variability in PF size and crowding effects across observers. Nevertheless, masking and crowding were both correlated with the estimated size of the PF in the fovea and periphery under a specific range of spatiotemporal parameters. Our results suggest that under certain conditions, crowding and masking share common neural mechanisms that underlie the spatiotemporal properties of these phenomena in both the fovea and periphery. These results could explain the transfer of training gains from spatiotemporal Gabor masking to letter acuity, reading, and reduced crowding.
Gender differences in crowd perception.
Bai, Yang; Leib, Allison Y; Puri, Amrita M; Whitney, David; Peng, Kaiping
2015-01-01
In this study, we investigated whether the first impression of a crowd of faces (crowd perception) is influenced by social background and cognitive processing. Specifically, we explored whether males and females, two groups that are distinct biologically and socially, differ in their ability to extract ensemble characteristics from crowds of faces composed of different identities. Participants were presented with crowds of similar faces and were instructed to scroll through a morphed continuum of faces until they found a face that was representative of the average identity of each crowd. Consistent with previous research, females were more precise in single face perception. Furthermore, the results showed that females were generally more accurate in estimating the average identity of a crowd. However, the correlation between single face discrimination and crowd averaging differed between males and females. Specifically, male subjects' ensemble integration slightly compensated for their poor single face perception; their performance on the crowd perception task was not as poor as would be expected from their single face discrimination ability. Overall, the results suggest that group perception is not an isolated or uniform cognitive mechanism, but rather one that interacts with biological and social processes.
Concrete density estimation by rebound hammer method
Ismail, Mohamad Pauzi bin; Jefri, Muhamad Hafizie Bin; Abdullah, Mahadzir Bin; Masenwat, Noor Azreen bin; Sani, Suhairy bin; Mohd, Shukri; Isa, Nasharuddin bin; Mahmud, Mohamad Haniza bin
2016-01-01
Concrete is the most common and cheapest material for radiation shielding. Compressive strength is the main parameter checked when determining concrete quality; however, for shielding purposes, density is the parameter that needs to be considered. X-rays and gamma radiation are effectively absorbed by materials with a high atomic number and high density, such as concrete. High strength normally implies higher density in concrete, but this is not always true. This paper explains and discusses the correlation between rebound hammer testing and density for concrete containing hematite aggregates. A comparison is also made with normal concrete, i.e. concrete containing crushed granite.
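Calibrating density against rebound number is, at its core, a one-variable least-squares fit. A generic sketch (the function name and toy data are assumptions; the paper's actual calibration values are not reproduced here):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b, e.g. measured density
    (kg/m³) against rebound hammer number."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return a, b
```

Separate fits for hematite-aggregate and crushed-granite concrete would reveal whether one rebound-to-density correlation transfers between mixes, which is the comparison the abstract describes.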
Crowd Theory and the Management of Crowds
DEFF Research Database (Denmark)
Borch, Christian
2013-01-01
Sociologists of policing and collective protest have made a plea for eradicating, from police literature and from training programmes that aim to provide guidelines for crowd management, any references to classical crowd theory in which crowds are depicted as irrational entities. Instead, these scholars suggest, rational conceptions of crowds should inform contemporary crowd management. This article questions this plea on two grounds. First, it demonstrates that there is no unidirectional connection between sociological crowd theory (whatever its content) and practical strategies for governing crowds. The tactical polyvalence of crowd theory is illustrated by showing how the irrational conception of crowds has given rise to very different strategies for the management of crowds (urban reform programmes in the Progressive Era and Hitler's mobilization strategies, respectively). Second, the article argues...
Large Scale Density Estimation of Blue and Fin Whales (LSD)
2015-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. ...sensors, or both. The goal of this research is to develop and implement a new method for estimating blue and fin whale density that is effective over... develop and implement a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse
Directory of Open Access Journals (Sweden)
Olivia C Bolt
Full Text Available BACKGROUND: People with social anxiety disorder are afraid of being scrutinized by others and often feel that they are the excessive focus of other people's attention. This study investigated whether, when compared to low socially anxious individuals, high socially anxious individuals overestimate the proportion of people in a crowd who are observing them. It was hypothesized that any potential overestimation would be modulated by self-focused attention. METHOD: Forty-eight high and 48 low socially anxious participants performed a "faces in a crowd" computer task during which they briefly saw matrices of faces, which varied in terms of the proportion of people who were looking at them. Participants estimated the proportion of people who were looking at them. The task was performed once with mirrors present (to induce an enhanced self-focused state) and once without mirrors present (neutral state). RESULTS: Participants' subjective estimates and the objective proportion of faces looking towards them were strongly correlated in both the high and low socially anxious groups. However, high socially anxious participants estimated that more people were looking at them than low socially anxious participants. In the first phase of the experiment, but not in the later phases, this effect was magnified in the mirror condition. DISCUSSION: This study provides preliminary evidence of a social anxiety related perceptual difference that may be amplified by self-focused attention. Clinical implications are discussed.
Real-time Monitoring for the Regional Crowds Status (区域人群状态的实时感知监控)
Institute of Scientific and Technical Information of China (English)
宋宏权; 刘学军; 闾国年; 张兴国
2012-01-01
...only measure results in units of pixels, so further conversion is required to obtain real-world values; with the method we propose, processing crowd images in GIS yields real-world values directly. (2) The accuracy of the pixel-based low-density crowd counting results can reach 90%, and the classification accuracy of the support vector machine classifier for high-density crowd levels is above 95%, which fully meets the needs of crowd monitoring. (3) By analyzing the crowd movement vector field in GIS, we can obtain the crowd movement pattern and the main movement direction, as well as the speed of the crowd in different directions; all of these crowd characteristics can be expressed in GIS. (4) The system we developed for crowd monitoring can be applied to crowd management and emergency warning, providing a decision-making basis for emergency prevention and crowd diversion.
Crowding effects in vehicular traffic.
Combinido, Jay Samuel L; Lim, May T
2012-01-01
While the impact of crowding on the diffusive transport of molecules within a cell is widely studied in biology, it has thus far been neglected in traffic systems where bulk behavior is the main concern. Here, we study the effects of crowding due to car density and driving fluctuations on the transport of vehicles. Using a microscopic model for traffic, we found that crowding can push car movement from a superballistic down to a subdiffusive state. The transition is also associated with a change in the shape of the probability distribution of positions from a negatively-skewed normal to an exponential distribution. Moreover, crowding broadens the distribution of cars' trap times and cluster sizes. At steady state, the subdiffusive state persists only when there is a large variability in car speeds. We further relate our work to prior findings from random walk models of transport in cellular systems.
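The superballistic-to-subdiffusive transition described above is conventionally diagnosed from the mean squared displacement, MSD(t) ∝ t^α, with α > 1 superdiffusive (α = 2 ballistic) and α < 1 subdiffusive. A sketch of estimating α from 1-D trajectories (the function names are assumptions, not the authors' analysis code):

```python
import math

def msd_exponent(trajectories, lag1, lag2):
    """Anomalous-diffusion exponent alpha from MSD(t) ~ t**alpha,
    estimated between two lag times over a set of 1-D trajectories."""
    def msd(lag):
        sq = [(traj[i + lag] - traj[i]) ** 2
              for traj in trajectories
              for i in range(len(traj) - lag)]
        return sum(sq) / len(sq)
    return math.log(msd(lag2) / msd(lag1)) / math.log(lag2 / lag1)

# A free-flowing car at constant speed is ballistic: alpha = 2.
ballistic = [[float(i) for i in range(9)]]
alpha = msd_exponent(ballistic, 1, 2)
```

Fitting α over many lag pairs (or by regression of log MSD on log t) on simulated car trajectories is how one would observe the drop from α near 2 toward the subdiffusive regime as density increases.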
Ant-inspired density estimation via random walks.
Musco, Cameron; Su, Hsin-Hao; Lynch, Nancy A
2017-09-19
Many ant species use distributed population density estimation in applications ranging from quorum sensing, to task allocation, to appraisal of enemy colony strength. It has been shown that ants estimate local population density by tracking encounter rates: The higher the density, the more often the ants bump into each other. We study distributed density estimation from a theoretical perspective. We prove that a group of anonymous agents randomly walking on a grid are able to estimate their density within a small multiplicative error in few steps by measuring their rates of encounter with other agents. Despite dependencies inherent in the fact that nearby agents may collide repeatedly (and, worse, cannot recognize when this happens), our bound nearly matches what would be required to estimate density by independently sampling grid locations. From a biological perspective, our work helps shed light on how ants and other social insects can obtain relatively accurate density estimates via encounter rates. From a technical perspective, our analysis provides tools for understanding complex dependencies in the collision probabilities of multiple random walks. We bound the strength of these dependencies using local mixing properties of the underlying graph. Our results extend beyond the grid to more general graphs, and we discuss applications to size estimation for social networks, density estimation for robot swarms, and random walk-based sampling for sensor networks.
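The encounter-rate idea is easy to simulate. The sketch below is not the paper's analysis (grid size, agent count, and step count are arbitrary): agents random-walk on a torus and each estimates density from how often it shares a cell with others.

```python
import random

def estimate_density(grid, n_agents, steps, seed=0):
    """Agents random-walk on a grid with wraparound (torus); each agent
    estimates density purely from its own rate of encounters, i.e. how
    often it shares a cell with other agents."""
    rng = random.Random(seed)
    pos = [(rng.randrange(grid), rng.randrange(grid)) for _ in range(n_agents)]
    encounters = [0] * n_agents
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        for i in range(n_agents):
            dx, dy = rng.choice(moves)
            pos[i] = ((pos[i][0] + dx) % grid, (pos[i][1] + dy) % grid)
        occupancy = {}
        for i, p in enumerate(pos):
            occupancy.setdefault(p, []).append(i)
        for agents in occupancy.values():
            for i in agents:
                encounters[i] += len(agents) - 1   # co-located others this step
    # encounters per step estimate the density of *other* agents per cell
    estimates = [e / steps for e in encounters]
    true_density = (n_agents - 1) / (grid * grid)
    return estimates, true_density

estimates, true_density = estimate_density(grid=20, n_agents=80, steps=2000)
mean_est = sum(estimates) / len(estimates)
```

The repeated-collision dependencies the paper bounds show up here as extra variance in the per-agent estimates, even though the average converges to the true density.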
Kernel bandwidth estimation for non-parametric density estimation: a comparative study
CSIR Research Space (South Africa)
Van der Walt, CM
2013-12-01
Full Text Available We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...
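Two of the conventional bandwidth estimators such a comparison would cover are the rules of thumb of Silverman and Scott. A minimal sketch (the Silverman rule is simplified to use the standard deviation only; the full rule uses min(sd, IQR/1.34)):

```python
import math
import random

def sample_sd(x):
    """Sample standard deviation."""
    n = len(x)
    mu = sum(x) / n
    return math.sqrt(sum((v - mu) ** 2 for v in x) / (n - 1))

def silverman_bw(x):
    """Silverman's rule of thumb, simplified to the sd-only form."""
    return 0.9 * sample_sd(x) * len(x) ** (-1 / 5)

def scott_bw(x):
    """Scott's normal-reference rule."""
    return 1.06 * sample_sd(x) * len(x) ** (-1 / 5)

def kde(x, data, h):
    """Gaussian kernel density estimate at a single point x."""
    c = 1.0 / (len(data) * h * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data)

rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(500)]
h_silverman = silverman_bw(data)
h_scott = scott_bw(data)
density_at_0 = kde(0.0, data, h_silverman)   # true N(0,1) density at 0 is ~0.399
```

Both rules are derived under a normality assumption, which is exactly why their behaviour on real pattern-recognition data needs the kind of empirical study the abstract describes.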
Breast density estimation from high spectral and spatial resolution MRI.
Li, Hui; Weiss, William A; Medved, Milica; Abe, Hiroyuki; Newstead, Gillian M; Karczmar, Gregory S; Giger, Maryellen L
2016-10-01
A three-dimensional breast density estimation method is presented for high spectral and spatial resolution (HiSS) MR imaging. Twenty-two patients were recruited (under an Institutional Review Board-approved Health Insurance Portability and Accountability Act-compliant protocol) for high-risk breast cancer screening. Each patient received standard-of-care clinical digital x-ray mammograms and MR scans, as well as HiSS scans. The algorithm for breast density estimation includes breast mask generation, breast skin removal, and breast percentage density calculation. The inter- and intra-user variabilities of the HiSS-based density estimation were determined using correlation analysis and limits of agreement. Correlation analysis was also performed between the HiSS-based density estimation and radiologists' breast imaging-reporting and data system (BI-RADS) density ratings. A correlation coefficient of 0.91 ([Formula: see text]) was obtained between left and right breast density estimations. An interclass correlation coefficient of 0.99 ([Formula: see text]) indicated high reliability for the inter-user variability of the HiSS-based breast density estimations. A moderate correlation coefficient of 0.55 ([Formula: see text]) was observed between HiSS-based breast density estimations and radiologists' BI-RADS. In summary, an objective density estimation method using HiSS spectral data from breast MRI was developed. The high reproducibility with low inter- and low intra-user variabilities shown in this preliminary study suggests that such a HiSS-based density metric may be potentially beneficial in programs requiring breast density such as in breast cancer risk assessment and monitoring effects of therapy.
Current Source Density Estimation for Single Neurons
Directory of Open Access Journals (Sweden)
Dorottya Cserpán
2014-03-01
Full Text Available Recent developments in multielectrode technology have made it possible to measure the extracellular potential generated in the neural tissue with spatial precision on the order of tens of micrometers and on a submillisecond time scale. Combining such measurements with imaging of single neurons within the studied tissue opens up new experimental possibilities for estimating the distribution of current sources along a dendritic tree. In this work we show that if we are able to relate part of the recording of extracellular potential to a specific cell of known morphology, we can estimate the spatiotemporal distribution of transmembrane currents along it. We present here an extension of the kernel CSD method (Potworowski et al., 2012) applicable in such a case. We test it on several model neurons of progressively more complicated morphologies, from ball-and-stick to realistic, up to analysis of simulated neuron activity embedded in a substantial working network (Traub et al., 2005). We discuss the caveats and possibilities of this new approach.
Highway traffic model-based density estimation
Morarescu, Irinel - Constantin; CANUDAS DE WIT, Carlos
2011-01-01
International audience; Travel time spent in traffic networks is one of the main concerns of societies in developed countries. A major requirement for providing traffic control and related services is the continuous prediction of traffic state, several minutes into the future. This paper focuses on an important ingredient necessary for traffic forecasting: real-time traffic state estimation using only a limited amount of data. Simulation results illustrate the performance of the proposed ...
Toward accurate and precise estimates of lion density.
Elliot, Nicholas B; Gopalaswamy, Arjun M
2017-08-01
Reliable estimates of animal density are fundamental to understanding ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation because wildlife authorities rely on estimates to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores, such as lions (Panthera leo). Although abundance indices for lions may produce poor inferences, they continue to be used to estimate density and inform management and policy. We used sighting data from a 3-month survey and adapted a Bayesian spatially explicit capture-recapture (SECR) model to estimate spatial lion density in the Maasai Mara National Reserve and surrounding conservancies in Kenya. Our unstructured spatial capture-recapture sampling design incorporated search effort to explicitly estimate detection probability and density on a fine spatial scale, making our approach robust in the context of varying detection probabilities. Overall posterior mean lion density was estimated to be 17.08 (posterior SD 1.310) lions >1 year old/100 km(2) , and the sex ratio was estimated at 2.2 females to 1 male. Our modeling framework and narrow posterior SD demonstrate that SECR methods can produce statistically rigorous and precise estimates of population parameters, and we argue that they should be favored over less reliable abundance indices. Furthermore, our approach is flexible enough to incorporate different data types, which enables robust population estimates over relatively short survey periods in a variety of systems. Trend analyses are essential to guide conservation decisions but are frequently based on surveys of differing reliability. We therefore call for a unified framework to assess lion numbers in key populations to improve management and
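The paper's Bayesian SECR model is beyond a short sketch; for intuition only, a far simpler, non-spatial capture-recapture estimator (Chapman's bias-corrected Lincoln-Petersen) shows how resighting rates inform abundance. All numbers below are toy values, and the survey area is hypothetical:

```python
def lincoln_petersen(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimator:
    n1 animals identified in session 1, n2 in session 2, m2 seen in both."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# toy numbers, purely illustrative: 40 lions identified on a first pass,
# 35 on a second, 20 resighted on both
abundance = lincoln_petersen(40, 35, 20)
density_per_100km2 = abundance / 15.0   # over a hypothetical 1500 km^2 area
```

SECR improves on this by modeling where animals are detected, not just how often, which is what lets it estimate density on a fine spatial scale with varying detection probability.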
Christian Education Movement, London (England).
This booklet is designed to help British teachers introduce concepts of crowds to young students. Elementary school students will better understand issues of crowd behavior such as rural to urban migration and crowding in urban areas if they realize that all crowds are composed of individual human beings. Teachers can help students become familiar…
Fusion of Hard and Soft Information in Nonparametric Density Estimation
2015-06-10
estimation exploiting, in concert, hard and soft information. Although our development, theoretical and numerical, makes no distinction based on sample...Fusion of Hard and Soft Information in Nonparametric Density Estimation∗ Johannes O. Royset Roger J-B Wets Department of Operations Research...univariate density estimation in situations when the sample ( hard information) is supplemented by “soft” information about the random phenomenon. These
Density estimates of monarch butterflies overwintering in central Mexico.
Thogmartin, Wayne E; Diffendorfer, Jay E; López-Hoffman, Laura; Oberhauser, Karen; Pleasants, John; Semmens, Brice X; Semmens, Darius; Taylor, Orley R; Wiederholt, Ruscena
2017-01-01
Given the rapid population decline and recent petition for listing of the monarch butterfly (Danaus plexippus L.) under the Endangered Species Act, an accurate estimate of the Eastern, migratory population size is needed. Because of difficulty in counting individual monarchs, the number of hectares occupied by monarchs in the overwintering area is commonly used as a proxy for population size, which is then multiplied by the density of individuals per hectare to estimate population size. There is, however, considerable variation in published estimates of overwintering density, ranging from 6.9-60.9 million ha(-1). We develop a probability distribution for overwinter density of monarch butterflies from six published density estimates. The mean density among the mixture of the six published estimates was ∼27.9 million butterflies ha(-1) (95% CI [2.4-80.7] million ha(-1)); the mixture distribution is approximately log-normal, and as such is better represented by the median (21.1 million butterflies ha(-1)). Based upon assumptions regarding the number of milkweed needed to support monarchs, the amount of milkweed (Asclepias spp.) lost (0.86 billion stems) in the northern US plus the amount of milkweed remaining (1.34 billion stems), we estimate >1.8 billion stems is needed to return monarchs to an average population size of 6 ha. Considerable uncertainty exists in this required amount of milkweed because of the considerable uncertainty occurring in overwinter density estimates. Nevertheless, the estimate is on the same order as other published estimates. The studies included in our synthesis differ substantially by year, location, method, and measures of precision. A better understanding of the factors influencing overwintering density across space and time would be valuable for increasing the precision of conservation recommendations.
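The abstract's point that a log-normal mixture is better summarized by its median than its mean can be illustrated numerically. The values below are illustrative stand-ins spanning the reported 6.9-60.9 range (the six published estimates themselves are not listed in this abstract):

```python
import math

def lognormal_summary(samples):
    """Fit a log-normal by moments of the log-values and return (mean, median).
    For a log-normal, mean = exp(mu + var/2) always exceeds median = exp(mu)."""
    logs = [math.log(s) for s in samples]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / len(logs)
    return math.exp(mu + var / 2.0), math.exp(mu)

# illustrative stand-ins spanning the reported 6.9-60.9 range (NOT the six
# published estimates); units: million butterflies per ha
densities = [6.9, 11.0, 21.1, 28.0, 44.0, 60.9]
mean_d, median_d = lognormal_summary(densities)   # right skew pulls the mean up
```

The gap between `mean_d` and `median_d` is exactly the skew effect that makes the median the more robust summary for a log-normal-like spread of published densities.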
Comparison of density estimation methods for astronomical datasets
Ferdosi, B.J.; Buddelmeijer, H.; Trager, S.C.; Wilkinson, M.H.F.; Roerdink, J.B.T.M.
2011-01-01
Context. Galaxies are strongly influenced by their environment. Quantifying the galaxy density is a difficult but critical step in studying the properties of galaxies. Aims. We aim to determine differences in density estimation methods and their applicability in astronomical problems. We study the p
Density estimators in particle hydrodynamics - DTFE versus regular SPH
Pelupessy, FI; Schaap, WE; van de Weygaert, R
2003-01-01
We present the results of a study comparing density maps reconstructed by the Delaunay Tessellation Field Estimator (DTFE) and by regular SPH kernel-based techniques. The density maps are constructed from the outcome of an SPH particle hydrodynamics simulation of a multiphase interstellar medium. Th
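A regular SPH density estimate is just a kernel-weighted mass sum over neighboring particles. A minimal 1D sketch with the standard cubic-spline kernel (the particle layout is illustrative):

```python
import math

def cubic_spline_kernel(r, h):
    """Standard 1D cubic-spline (M4) SPH smoothing kernel."""
    q = r / h
    sigma = 2.0 / (3.0 * h)                       # 1D normalization
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q * (1.0 - 0.5 * q))
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def sph_density(x_eval, positions, masses, h):
    """Regular SPH density summation: rho(x) = sum_j m_j W(|x - x_j|, h)."""
    return sum(m * cubic_spline_kernel(abs(x_eval - x), h)
               for x, m in zip(positions, masses))

# unit-mass particles at unit spacing: the density should come out near 1
positions = [float(i) for i in range(20)]
masses = [1.0] * 20
rho_mid = sph_density(10.0, positions, masses, h=1.2)
```

The DTFE, by contrast, derives the density from the volumes of Delaunay cells around each particle rather than from a fixed smoothing kernel, which is the difference the paper evaluates.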
Estimating maritime snow density from seasonal climate variables
Bormann, K. J.; Evans, J. P.; Westra, S.; McCabe, M. F.; Painter, T. H.
2013-12-01
Snow density is a complex parameter that influences thermal, optical and mechanical snow properties and processes. Depth-integrated properties of snowpacks, including snow density, remain very difficult to obtain remotely. Observations of snow density are therefore limited to in-situ point locations. In maritime snowfields such as those in Australia and in parts of the western US, snow densification rates are enhanced and inter-annual variability is high compared to continental snow regions. In-situ snow observation networks in maritime climates often cannot characterise the variability in snowpack properties at spatial and temporal resolutions required for many modelling and observations-based applications. Regionalised density-time curves are commonly used to approximate snow densities over broad areas. However, these relationships have limited spatial applicability and do not allow for interannual variability in densification rates, which are important in maritime environments. Physically-based density models are relatively complex and rely on empirical algorithms derived from limited observations, which may not represent the variability observed in maritime snow. In this study, seasonal climate factors were used to estimate late season snow densities using multiple linear regressions. Daily snow density estimates were then obtained by projecting linearly to fresh snow densities at the start of the season. When applied spatially, the daily snow density fields compare well to in-situ observations across multiple sites in Australia, and provide a new method for extrapolating existing snow density datasets in maritime snow environments. While the relatively simple algorithm for estimating snow densities has been used in this study to constrain snowmelt rates in a temperature-index model, the estimates may also be used to incorporate variability in snow depth to snow water equivalent conversion.
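The two-stage scheme described (regress late-season density on seasonal climate variables, then project linearly from a fresh-snow density) can be sketched as follows. A single predictor is used for brevity and all numbers are made up; the study used multiple predictors:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (one predictor for brevity;
    the study regressed on several seasonal climate variables)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def daily_density(day, season_len, fresh_density, late_density):
    """Project linearly from the fresh-snow density at day 0 to the
    regression-estimated late-season density at the end of the season."""
    return fresh_density + (day / season_len) * (late_density - fresh_density)

# illustrative training data (made up): mean winter temperature (degrees C)
# versus observed late-season snow density (kg/m^3)
temps = [-8.0, -5.0, -3.0, -1.0, 0.5]
late_densities = [420.0, 450.0, 470.0, 500.0, 520.0]
a, b = fit_linear(temps, late_densities)
late_est = a + b * (-2.0)   # predicted late-season density for a -2 C season
rho_mid_season = daily_density(90, 180, fresh_density=100.0, late_density=late_est)
```

Because the regression is refit from each season's climate, densification rates can differ between years, which is the flexibility the abstract argues maritime snow requires.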
Maximum likelihood estimation for semiparametric density ratio model.
Diao, Guoqing; Ning, Jing; Qin, Jing
2012-06-27
In the statistical literature, the conditional density model specification is commonly used to study regression effects. One attractive model is the semiparametric density ratio model, under which the conditional density function is the product of an unknown baseline density function and a known parametric function containing the covariate information. This model has a natural connection with generalized linear models and is closely related to biased sampling problems. Despite the attractive features and importance of this model, most existing methods are too restrictive since they are based on multi-sample data or conditional likelihood functions. The conditional likelihood approach can eliminate the unknown baseline density but cannot estimate it. We propose efficient estimation procedures based on the nonparametric likelihood. The nonparametric likelihood approach allows for general forms of covariates and estimates the regression parameters and the baseline density simultaneously. Therefore, the nonparametric likelihood approach is more versatile than the conditional likelihood approach especially when estimation of the conditional mean or other quantities of the outcome is of interest. We show that the nonparametric maximum likelihood estimators are consistent, asymptotically normal, and asymptotically efficient. Simulation studies demonstrate that the proposed methods perform well in practical settings. A real example is used for illustration.
Attentional priming releases crowding.
Kristjánsson, Arni; Heimisson, Pétur Rúnar; Róbertsson, Gunnar Freyr; Whitney, David
2013-10-01
Views of natural scenes unfold over time, and objects of interest that were present a moment ago tend to remain present. While visual crowding places a fundamental limit on object recognition in cluttered scenes, most studies of crowding have suffered from the limitation that they typically involved static scenes. The role of temporal continuity in crowding has therefore been unaddressed. We investigated intertrial effects upon crowding in visual scenes, showing that crowding is considerably diminished when objects remain constant on consecutive visual search trials. Repetition of both the target and distractors decreases the critical distance for crowding from flankers. More generally, our results show how object continuity through between-trial priming releases objects that would otherwise be unidentifiable due to crowding. Crowding, although it is a significant bottleneck on object recognition, can be mitigated by statistically likely temporal continuity of the objects. Crowding therefore depends not only on what is momentarily present, but also on what was previously attended.
Kernel density estimation of a multidimensional efficiency profile
Poluektov, Anton
2014-01-01
Kernel density estimation is a convenient way to estimate the probability density of a distribution given a sample of data points. However, it has certain drawbacks: proper description of the density using narrow kernels needs large data samples, whereas if the kernel width is large, boundaries and narrow structures tend to be smeared. Here, an approach to correct for such effects is proposed that uses an approximate density to describe narrow structures and boundaries. The approach is shown to be well suited for the description of the efficiency shape over a multidimensional phase space in a typical particle physics analysis. An example is given for the five-dimensional phase space of the $\Lambda_b^0 \to D^0 p \pi$ decay.
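One way to realize the stated idea, letting an approximate density carry the sharp boundary while the KDE supplies a smooth correction, is a ratio-of-KDEs construction. This is a sketch of the general principle, not necessarily the paper's exact algorithm, and the target density is made up:

```python
import math
import random

def kde(x, data, h):
    """Plain Gaussian kernel density estimate at x."""
    c = 1.0 / (len(data) * h * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data)

def corrected_kde(x, data, approx_pdf, approx_sample, h):
    """Approximate-density-assisted KDE: estimate the smooth ratio
    data/approx with two KDEs (which share the same boundary smearing),
    then multiply back by the approximate density, which carries the
    sharp boundary."""
    ratio = kde(x, data, h) / max(kde(x, approx_sample, h), 1e-12)
    return approx_pdf(x) * ratio

# target: triangular density falling to zero at x = 1 (a sharp boundary)
rng = random.Random(2)
approx_pdf = lambda x: 2.0 * (1.0 - x) if 0.0 <= x <= 1.0 else 0.0
draw = lambda: 1.0 - math.sqrt(1.0 - rng.random())   # inverse-CDF sampling
approx_sample = [draw() for _ in range(5000)]
data = [draw() for _ in range(5000)]                 # same law in this demo
est_boundary = corrected_kde(0.99, data, approx_pdf, approx_sample, h=0.05)
plain_boundary = kde(0.99, data, h=0.05)             # smears across the edge
```

Near the boundary the plain KDE is biased because its kernel averages across the edge, while the corrected estimate tracks the true value 2(1 - 0.99) = 0.02 much more closely.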
A morpho-density approach to estimating neural connectivity.
Directory of Open Access Journals (Sweden)
Michael P McAssey
Full Text Available Neuronal signal integration and information processing in cortical neuronal networks critically depend on the organization of synaptic connectivity. Because of the challenges involved in measuring a large number of neurons, synaptic connectivity is difficult to determine experimentally. Current computational methods for estimating connectivity typically rely on juxtaposing experimentally available neurons and applying mathematical techniques to compute estimates of neural connectivity. However, since the number of available neurons is very limited, these connectivity estimates may be subject to large uncertainties. We use a morpho-density field approach applied to a vast ensemble of model-generated neurons. A morpho-density field (MDF) describes the distribution of neural mass in the space around the neural soma. The estimated axonal and dendritic MDFs are derived from 100,000 model neurons that are generated by a stochastic phenomenological model of neurite outgrowth. These MDFs are then used to estimate the connectivity between pairs of neurons as a function of their inter-soma displacement. Compared with other density-field methods, our approach to estimating synaptic connectivity uses fewer restricting assumptions and produces connectivity estimates with a lower standard deviation. An important requirement is that the model-generated neurons reflect accurately the morphology and variation in morphology of the experimental neurons used for optimizing the model parameters. As such, the method remains subject to the uncertainties caused by the limited number of neurons in the experimental data set and by the quality of the model and the assumptions used in creating the MDFs and in estimating connectivity. In summary, MDFs are a powerful tool for visualizing the spatial distribution of axonal and dendritic densities, for estimating the number of potential synapses between neurons with low standard deviation, and for obtaining
Optimization of volumetric breast density estimation in digital mammograms.
Holland, Katharina; Gubern-Mérida, Albert; Mann, Ritse M; Karssemeijer, Nico
2017-05-07
Fibroglandular tissue volume and percent density can be estimated in unprocessed mammograms using a physics-based method, which relies on an internal reference value representing the projection of fat only. However, pixels representing fat only may not be present in dense breasts, causing an underestimation of density measurements. In this work, we investigate alternative approaches for obtaining a tissue reference value to improve density estimations, particularly in dense breasts. Two of the three investigated reference values (F1, F2) are percentiles of the pixel value distribution in the breast interior (the contact area of breast and compression paddle). F1 is determined in a small breast interior, which minimizes the risk that peripheral pixels are included in the measurement at the cost of increasing the chance that no proper reference can be found. F2 is obtained using a larger breast interior. The new approach, which is developed for very dense breasts, does not require the presence of a fatty tissue region. As reference region we select the densest region in the mammogram and assume that this represents a projection of entirely dense tissue embedded between the subcutaneous fatty tissue layers. By measuring the thickness of the fat layers a reference (F3) can be computed. To obtain accurate breast density estimates irrespective of breast composition we investigated a combination of the results of the three reference values. We collected 202 pairs of MRIs and digital mammograms from 119 women. We compared the percent dense volume estimates based on both modalities and calculated Pearson's correlation coefficients. With the references F1-F3 we found respectively a correlation of [Formula: see text], [Formula: see text] and [Formula: see text]. Best results were obtained with the combination of the density estimations ([Formula: see text]). Results show that better volumetric density estimates can be obtained with the hybrid method, in particular for dense
Quantiles, parametric-select density estimation, and bi-information parameter estimators
Parzen, E.
1982-01-01
A quantile-based approach to statistical analysis and probability modeling of data is presented which formulates statistical inference problems as functional inference problems in which the parameters to be estimated are density functions. Density estimators can be non-parametric (computed independently of model identified) or parametric-select (approximated by finite parametric models that can provide standard models whose fit can be tested). Exponential models and autoregressive models are approximating densities which can be justified as maximum entropy for respectively the entropy of a probability density and the entropy of a quantile density. Applications of these ideas are outlined to the problems of modeling: (1) univariate data; (2) bivariate data and tests for independence; and (3) two samples and likelihood ratios. It is proposed that bi-information estimation of a density function can be developed by analogy to the problem of identification of regression models.
The crowding factor method applied to parafoveal vision
Ghahghaei, Saeideh; Walker, Laura
2016-01-01
Crowding increases with eccentricity and is most readily observed in the periphery. During natural, active vision, however, central vision plays an important role. Measures of critical distance to estimate crowding are difficult in central vision, as these distances are small. Any overlap of flankers with the target may create an overlay masking confound. The crowding factor method avoids this issue by simultaneously modulating target size and flanker distance and using a ratio to compare crowded to uncrowded conditions. This method was developed and applied in the periphery (Petrov & Meleshkevich, 2011b). In this work, we apply the method to characterize crowding in parafoveal vision, where we find weaker crowding than in the periphery, yet radial/tangential asymmetries are clearly preserved. There are considerable idiosyncratic differences observed between participants. The crowding factor method provides a powerful tool for examining crowding in central and peripheral vision, which will be useful in future studies that seek to understand visual processing under natural, active viewing conditions. PMID:27690170
Green's function based density estimation
Energy Technology Data Exchange (ETDEWEB)
Kovesarki, Peter; Brock, Ian C.; Nuncio Quiroz, Adriana Elizabeth [Physikalisches Institut, Universitaet Bonn (Germany)
2012-07-01
A method was developed based on Green's function identities to estimate probability densities. This can be used for likelihood estimations and for binary classifications. It offers several advantages over neural networks, boosted decision trees and other, regression based classifiers. For example, it is less prone to overtraining, and it is much easier to combine several samples. Some capabilities are demonstrated using ATLAS data.
Density Estimation in Several Populations With Uncertain Population Membership
Ma, Yanyuan
2011-09-01
We devise methods to estimate probability density functions of several populations using observations with uncertain population membership, meaning from which population an observation comes is unknown. The probability of an observation being sampled from any given population can be calculated. We develop general estimation procedures and bandwidth selection methods for our setting. We establish large-sample properties and study finite-sample performance using simulation studies. We illustrate our methods with data from a nutrition study.
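A natural baseline for this setting is a membership-probability-weighted KDE. The sketch below illustrates the general idea only (the populations, weights, and bandwidth are made up; the paper develops formal estimators and bandwidth selection):

```python
import math
import random

def membership_weighted_kde(x, data, weights, h):
    """Gaussian KDE of one population's density when each observation only
    has a probability w of belonging to that population: each point's kernel
    contribution is scaled by w, and the total weight renormalizes."""
    c = 1.0 / (sum(weights) * h * math.sqrt(2.0 * math.pi))
    return c * sum(w * math.exp(-0.5 * ((x - d) / h) ** 2)
                   for d, w in zip(data, weights))

rng = random.Random(3)
data, weights = [], []
for _ in range(4000):
    if rng.random() < 0.5:                    # population A: N(0, 1)
        d, w = rng.gauss(0.0, 1.0), 0.9       # believed A with probability 0.9
    else:                                     # population B: N(4, 1)
        d, w = rng.gauss(4.0, 1.0), 0.1
    data.append(d)
    weights.append(w)
f_a_at_0 = membership_weighted_kde(0.0, data, weights, h=0.3)  # near A's peak
```

The estimate concentrates on population A's mode at 0 and suppresses B's mode at 4, even though no observation's membership is known with certainty.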
Estimation of volumetric breast density for breast cancer risk prediction
Pawluczyk, Olga; Yaffe, Martin J.; Boyd, Norman F.; Jong, Roberta A.
2000-04-01
Mammographic density (MD) has been shown to be a strong risk predictor for breast cancer. Compared to subjective assessment by a radiologist, computer-aided analysis of digitized mammograms provides a quantitative and more reproducible method for assessing breast density. However, the current methods of estimating breast density based on the area of bright signal in a mammogram do not reflect the true, volumetric quantity of dense tissue in the breast. A computerized method to estimate the amount of radiographically dense tissue in the overall volume of the breast has been developed to provide an automatic, user-independent tool for breast cancer risk assessment. The procedure for volumetric density estimation consists of first correcting the image for inhomogeneity, then performing a volume density calculation. First, optical sensitometry is used to convert all images to the logarithm of relative exposure (LRE), in order to simplify the image correction operations. The field non-uniformity correction, which takes into account heel effect, inverse square law, path obliquity and intrinsic field and grid non-uniformity, is obtained by imaging a spherical section PMMA phantom. The processed LRE image of the phantom is then used as a correction offset for actual mammograms. From information about the thickness and placement of the breast, as well as the parameters of a breast-like calibration step wedge placed in the mammogram, the MD of the breast is calculated. Post-processing and a simple calibration phantom enable user-independent, reliable and repeatable volumetric estimation of density in breast-equivalent phantoms. Initial results obtained on known density phantoms show the estimation to vary less than 5% in MD from the actual value. This can be compared to estimated mammographic density differences of 30% between the true and non-corrected values. Since a more simplistic breast density measurement based on the projected area has been shown to be a strong indicator
Estimating neuronal connectivity from axonal and dendritic density fields
van Pelt, Jaap; van Ooyen, Arjen
2013-01-01
Neurons innervate space by extending axonal and dendritic arborizations. When axons and dendrites come into close proximity with each other, synapses between neurons can be formed. Neurons vary greatly in their morphologies and synaptic connections with other neurons. The size and shape of the arborizations determine the way neurons innervate space. A neuron may therefore be characterized by the spatial distribution of its axonal and dendritic “mass.” A population mean “mass” density field of a particular neuron type can be obtained by averaging over the individual variations in neuron geometries. Connectivity in terms of candidate synaptic contacts between neurons can be determined directly on the basis of their arborizations but also indirectly on the basis of their density fields. To decide when a candidate synapse can be formed, we previously developed a criterion defining that axonal and dendritic line pieces should cross in 3D and have an orthogonal distance less than a threshold value. In this paper, we developed new methodology for applying this criterion to density fields. We show that estimates of the number of contacts between neuron pairs calculated from their density fields are fully consistent with the number of contacts calculated from the actual arborizations. However, the connection probability and the expected number of contacts per connection cannot be calculated directly from density fields, because density fields no longer carry the correlative structure in the spatial distribution of synaptic contacts. Alternatively, these two connectivity measures can be estimated from the expected number of contacts by using empirical mapping functions. The neurons used for the validation studies were generated by our neuron simulator NETMORPH. An example is given of the estimation of average connectivity and Euclidean pre- and postsynaptic distance distributions in a network of neurons represented by their population mean density
Image correlates of crowding in natural scenes.
Wallis, Thomas S A; Bex, Peter J
2012-07-13
Visual crowding is the inability to identify visible features when they are surrounded by other structure in the peripheral field. Since natural environments are replete with structure and most of our visual field is peripheral, crowding represents the primary limit on vision in the real world. However, little is known about the characteristics of crowding under natural conditions. Here we examine where crowding occurs in natural images. Observers were required to identify which of four locations contained a patch of "dead leaves" (synthetic, naturalistic contour structure) embedded into natural images. Threshold size for the dead leaves patch scaled with eccentricity in a manner consistent with crowding. Reverse correlation at multiple scales was used to determine local image statistics that correlated with task performance. Stepwise model selection revealed that local RMS contrast and edge density at the site of the dead leaves patch were of primary importance in predicting the occurrence of crowding once patch size and eccentricity had been considered. The absolute magnitudes of the regression weights for RMS contrast at different spatial scales varied in a manner consistent with receptive field sizes measured in striate cortex of primate brains. Our results are consistent with crowding models that are based on spatial averaging of features in the early stages of the visual system, and allow the prediction of where crowding is likely to occur in natural images.
Face Value: Towards Robust Estimates of Snow Leopard Densities.
Alexander, Justine S; Gopalaswamy, Arjun M; Shi, Kun; Riordan, Philip
2015-01-01
When densities of large carnivores fall below certain thresholds, dramatic ecological effects can follow, leading to oversimplified ecosystems. Understanding the population status of such species remains a major challenge as they occur in low densities and their ranges are wide. This paper describes the use of non-invasive data collection techniques combined with recent spatial capture-recapture methods to estimate the density of snow leopards Panthera uncia. It also investigates the influence of environmental and human activity indicators on their spatial distribution. A total of 60 camera traps were systematically set up during a three-month period over a 480 km2 study area in Qilianshan National Nature Reserve, Gansu Province, China. We recorded 76 separate snow leopard captures over 2,906 trap-days, representing an average capture success of 2.62 captures/100 trap-days. We identified a total of 20 unique individuals from photographs and estimated snow leopard density at 3.31 (SE = 1.01) individuals per 100 km2. Results of our simulation exercise indicate that our estimates from the spatial capture-recapture models were not optimal with respect to bias and precision (RMSEs for density parameters less than or equal to 0.87). Our results underline the critical challenge in achieving sufficient sample sizes of snow leopard captures and recaptures. Possible performance improvements are discussed, principally by optimising effective camera capture and photographic data quality.
Corruption clubs: empirical evidence from kernel density estimates
Herzfeld, T.; Weiss, Ch.
2007-01-01
A common finding of many analytical models is the existence of multiple equilibria of corruption. Countries characterized by the same economic, social and cultural background do not necessarily experience the same levels of corruption. In this article, we use Kernel Density Estimation techniques to
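The kernel density estimation technique referenced above can be sketched generically in a few lines; the bimodal sample below is synthetic and purely illustrative, not the authors' cross-country corruption data.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic "corruption index" sample drawn from a bimodal mixture,
# mimicking the multiple-equilibria pattern described in the abstract.
rng = np.random.default_rng(0)
sample = np.concatenate([rng.normal(2.0, 0.5, 400),
                         rng.normal(7.0, 0.8, 600)])

kde = gaussian_kde(sample)            # bandwidth chosen by Scott's rule
grid = np.linspace(0, 10, 200)
density = kde(grid)

# Count interior local maxima of the estimated density on the grid;
# distinct clusters ("clubs") show up as separate modes.
peaks = np.sum((density[1:-1] > density[:-2]) & (density[1:-1] > density[2:]))
print(peaks)
```

A multimodal kernel density estimate of this kind is how "clubs" of countries appear in the data: each mode is a candidate equilibrium.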
Density estimation in tiger populations: combining information for strong inference.
Gopalaswamy, Arjun M; Royle, J Andrew; Delampady, Mohan; Nichols, James D; Karanth, K Ullas; Macdonald, David W
2012-07-01
A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture-recapture data. The model, which combined information, provided the most precise estimate of density (8.5 +/- 1.95 tigers/100 km2 [posterior mean +/- SD]) relative to a model that utilized only one data source (photographic, 12.02 +/- 3.02 tigers/100 km2 and fecal DNA, 6.65 +/- 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
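As a rough illustration of why combining sources helps, the two single-source estimates quoted above can be pooled by inverse-variance weighting. This is a textbook approximation, not the paper's joint spatial capture-recapture model, which works on the raw data rather than on summary estimates.

```python
import numpy as np

# Single-source estimates from the abstract (posterior mean, SD), tigers/100 km^2
photo = (12.02, 3.02)
fecal = (6.65, 2.37)

# Precision-weighted (inverse-variance) pooling of the two estimates
w = np.array([1 / photo[1]**2, 1 / fecal[1]**2])
means = np.array([photo[0], fecal[0]])
pooled_mean = np.sum(w * means) / np.sum(w)
pooled_sd = np.sqrt(1 / np.sum(w))
print(round(pooled_mean, 2), round(pooled_sd, 2))
```

This crude pooling lands near the paper's joint-model estimate of 8.5 +/- 1.95 tigers/100 km2, though only the joint model can properly account for dependence between the data sources.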
Optimization of volumetric breast density estimation in digital mammograms
Holland, K.; Gubern Merida, A.; Mann, R.M.; Karssemeijer, N.
2017-01-01
Fibroglandular tissue volume and percent density can be estimated in unprocessed mammograms using a physics-based method, which relies on an internal reference value representing the projection of fat only. However, pixels representing fat only may not be present in dense breasts, causing an
State of the Art in Photon-Density Estimation
DEFF Research Database (Denmark)
Hachisuka, Toshiya; Jarosz, Wojciech; Georgiev, Iliyan
2013-01-01
Photon-density estimation techniques are a popular choice for simulating light transport in scenes with complicated geometry and materials. This class of algorithms can be used to accurately simulate inter-reflections, caustics, color bleeding, scattering in participating media, and subsurface sc...
State of the Art in Photon Density Estimation
DEFF Research Database (Denmark)
Hachisuka, Toshiya; Jarosz, Wojciech; Bouchard, Guillaume
2012-01-01
Photon-density estimation techniques are a popular choice for simulating light transport in scenes with complicated geometry and materials. This class of algorithms can be used to accurately simulate inter-reflections, caustics, color bleeding, scattering in participating media, and subsurface sc...
Estimation of the space density of low surface brightness galaxies
Briggs, FH
1997-01-01
The space densities of low surface brightness and tiny gas-rich dwarf galaxies are estimated for two recent catalogs: the Arecibo Survey of Northern Dwarf and Low Surface Brightness Galaxies and the Catalog of Low Surface Brightness Galaxies, List II. The goals are (1) to evaluate the additions to the
State of the Art in Photon Density Estimation
DEFF Research Database (Denmark)
Hachisuka, Toshiya; Jarosz, Wojciech; Bouchard, Guillaume
2012-01-01
scattering. Since its introduction, photon-density estimation has been significantly extended in computer graphics with the introduction of: specialized techniques that intelligently modify the positions or bandwidths to reduce visual error using a small number of photons, approaches that eliminate error...
Regularized Regression and Density Estimation based on Optimal Transport
Burger, M.
2012-03-11
The aim of this paper is to investigate a novel nonparametric approach for estimating and smoothing density functions as well as probability densities from discrete samples based on a variational regularization method with the Wasserstein metric as a data fidelity. The approach allows a unified treatment of discrete and continuous probability measures and is hence attractive for various tasks. In particular, the variational model for special regularization functionals yields a natural method for estimating densities and for preserving edges in the case of total variation regularization. In order to compute solutions of the variational problems, a regularized optimal transport problem needs to be solved, for which we discuss several formulations and provide a detailed analysis. Moreover, we compute special self-similar solutions for standard regularization functionals and we discuss several computational approaches and results. © 2012 The Author(s).
Simplified large African carnivore density estimators from track indices
Directory of Open Access Journals (Sweden)
Christiaan W. Winterbach
2016-12-01
Background: The range, population size and trend of large carnivores are important parameters to assess their status globally and to plan conservation strategies. One can use linear models to assess population size and trends of large carnivores from track-based surveys on suitable substrates. The conventional linear model with intercept may not pass through zero, yet may fit the data better than a linear model through the origin. We assess whether a linear regression through the origin is more appropriate than a linear regression with intercept to model large African carnivore densities from track indices. Methods: We performed simple linear regression with intercept and simple linear regression through the origin, and used the confidence interval for β in the linear model y = αx + β, the Standard Error of Estimate, Mean Squares Residual and Akaike Information Criterion to evaluate the models. Results: The Lion on Clay and Low Density on Sand models with intercept were not significant (P > 0.05). The other four models with intercept and the six models through the origin were all significant (P < 0.05). The models using linear regression with intercept all included zero in the confidence interval for β, and the null hypothesis that β = 0 could not be rejected. All models showed that the linear model through the origin provided a better fit than the linear model with intercept, as indicated by the Standard Error of Estimate and Mean Square Residuals. The Akaike Information Criterion showed that linear models through the origin were better and that none of the linear models with intercept had substantial support. Discussion: Our results showed that linear regression through the origin is justified over the more typical linear regression with intercept for all models we tested. A general model can be used to estimate large carnivore densities from track densities across species and study areas. The formula observed track density = 3.26
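The model comparison described in this abstract can be reproduced generically with ordinary least squares. The data below are synthetic (generated through the origin with the abstract's slope of 3.26), not the survey's actual track indices.

```python
import numpy as np

rng = np.random.default_rng(1)
track_index = rng.uniform(0.5, 5.0, 30)                 # synthetic track densities
density = 3.26 * track_index + rng.normal(0, 0.5, 30)   # synthetic carnivore densities

# Model 1: y = a*x + b (with intercept); Model 2: y = a*x (through the origin)
X1 = np.column_stack([track_index, np.ones_like(track_index)])
X2 = track_index[:, None]
coef1, rss1, *_ = np.linalg.lstsq(X1, density, rcond=None)
coef2, rss2, *_ = np.linalg.lstsq(X2, density, rcond=None)

# AIC for least-squares fits: n*ln(RSS/n) + 2k, with k fitted parameters
n = len(density)
aic1 = n * np.log(rss1[0] / n) + 2 * 2
aic2 = n * np.log(rss2[0] / n) + 2 * 1
print(round(coef2[0], 2), round(aic1, 1), round(aic2, 1))
```

With data generated through the origin, the through-origin model typically attains the lower AIC, mirroring the survey's conclusion; on real data the comparison of course depends on the observed residuals.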
SVM for density estimation and application to medical image segmentation
Institute of Scientific and Technical Information of China (English)
ZHANG Zhao; ZHANG Su; ZHANG Chen-xi; CHEN Ya-zhu
2006-01-01
A method of medical image segmentation based on support vector machine (SVM) density estimation is presented. We used this estimator to construct a prior model of the image intensity and curvature profile of the structure from training images. When segmenting a novel image similar to the training images, the narrow-band level set method is used. The higher-dimensional surface evolution metric is defined by the prior model instead of by an energy minimization function. This method offers several advantages. First, SVM density estimation is consistent and its solution is sparse. Second, compared to traditional level set methods, this method incorporates shape information on the object to be segmented into the segmentation process. Segmentation results are demonstrated on synthetic images, MR images and ultrasonic images.
Evaluating lidar point densities for effective estimation of aboveground biomass
Wu, Zhuoting; Dye, Dennis G.; Stoker, Jason M.; Vogel, John M.; Velasco, Miguel G.; Middleton, Barry R.
2016-01-01
The U.S. Geological Survey (USGS) 3D Elevation Program (3DEP) was recently established to provide airborne lidar data coverage on a national scale. As part of a broader research effort of the USGS to develop an effective remote sensing-based methodology for the creation of an operational biomass Essential Climate Variable (Biomass ECV) data product, we evaluated the performance of airborne lidar data at various pulse densities against Landsat 8 satellite imagery in estimating aboveground biomass for forests and woodlands in a study area in east-central Arizona, U.S. High-point-density airborne lidar data were randomly sampled to produce five lidar datasets with reduced densities ranging from 0.5 to 8 point(s)/m2, corresponding to the point density range of 3DEP to provide national lidar coverage over time. Lidar-derived aboveground biomass estimate errors showed an overall decreasing trend as lidar point density increased from 0.5 to 8 points/m2. Landsat 8-based aboveground biomass estimates produced errors larger than the lowest lidar point density of 0.5 point/m2, and therefore Landsat 8 observations alone were ineffective relative to airborne lidar for generating a Biomass ECV product, at least for the forest and woodland vegetation types of the Southwestern U.S. While a national Biomass ECV product with optimal accuracy could potentially be achieved with 3DEP data at 8 points/m2, our results indicate that even lower-density lidar data could be sufficient to provide a national Biomass ECV product with accuracies significantly higher than that from Landsat observations alone.
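The density-reduction step described above (randomly thinning a high-density point cloud to target densities) can be sketched as follows. The tile size, densities and function name are illustrative, not the study's actual processing pipeline.

```python
import numpy as np

def thin_to_density(points, area_m2, target_density, rng):
    """Randomly subsample an (N, 3) point cloud to a target density
    (points per square metre) over a footprint of area_m2."""
    n_target = int(target_density * area_m2)
    if n_target >= len(points):
        return points                  # already at or below the target density
    idx = rng.choice(len(points), size=n_target, replace=False)
    return points[idx]

rng = np.random.default_rng(42)
# Synthetic 100 m x 100 m tile at 8 points/m^2
cloud = rng.uniform(0, 100, size=(80_000, 3))
for d in (0.5, 1, 2, 4, 8):           # density range matching the abstract
    thinned = thin_to_density(cloud, 100 * 100, d, rng)
    print(d, len(thinned))
```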
Estimation of Enceladus Plume Density Using Cassini Flight Data
Wang, Eric K.; Lee, Allan Y.
2011-01-01
The Cassini spacecraft was launched on October 15, 1997 by a Titan 4B launch vehicle. After an interplanetary cruise of almost seven years, it arrived at Saturn on June 30, 2004. In 2005, Cassini completed three flybys of Enceladus, a small, icy satellite of Saturn. Observations made during these flybys confirmed the existence of water vapor plumes in the south polar region of Enceladus. Five additional low-altitude flybys of Enceladus were successfully executed in 2008-9 to better characterize these watery plumes. During some of these Enceladus flybys, the spacecraft attitude was controlled by a set of three reaction wheels. When the disturbance torque imparted on the spacecraft was predicted to exceed the control authority of the reaction wheels, thrusters were used to control the spacecraft attitude. Using telemetry data of reaction wheel rates or thruster on-times collected from four low-altitude Enceladus flybys (in 2008-10), one can reconstruct the time histories of the Enceladus plume jet density. The 1 sigma uncertainty of the estimated density is 5.9-6.7% (depending on the density estimation methodology employed). These plume density estimates could be used to confirm measurements made by other onboard science instruments and to support the modeling of Enceladus plume jets.
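The underlying idea (recovering gas density from the disturbance it imparts on the spacecraft) can be illustrated with a simple free-molecular drag sketch. All numbers, the geometry and the drag coefficient below are hypothetical placeholders, not Cassini's actual reconstruction, which used reaction wheel rate and thruster on-time telemetry.

```python
def plume_density(torque, v, area, arm, c_d=2.2):
    """Invert a simple drag-torque relation for mass density:
    torque = arm * (1/2) * rho * v**2 * c_d * area  =>  rho in kg/m^3.
    All argument values here are hypothetical illustrations."""
    return 2.0 * torque / (arm * v**2 * c_d * area)

# Hypothetical flyby values: 14 km/s relative speed, 20 m^2 projected area,
# 2 m moment arm, 1e-3 N*m reconstructed disturbance torque
rho = plume_density(torque=1e-3, v=14_000, area=20.0, arm=2.0)
print(rho)
```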
Open-cluster density profiles derived using a kernel estimator
Seleznev, Anton F
2016-01-01
Surface and spatial radial density profiles in open clusters are derived using a kernel estimator method. Formulae are obtained for the contribution of every star into the spatial density profile. The evaluation of spatial density profiles is tested against open-cluster models from N-body experiments with N = 500. Surface density profiles are derived for seven open clusters (NGC 1502, 1960, 2287, 2516, 2682, 6819 and 6939) using Two-Micron All-Sky Survey data and for different limiting magnitudes. The selection of an optimal kernel half-width is discussed. It is shown that open-cluster radius estimates hardly depend on the kernel half-width. Hints of stellar mass segregation and structural features indicating cluster non-stationarity in the regular force field are found. A comparison with other investigations shows that the data on open-cluster sizes are often underestimated. The existence of an extended corona around the open cluster NGC 6939 was confirmed. A combined function composed of the King density pr...
A method for density estimation based on expectation identities
Peralta, Joaquín; Loyola, Claudia; Loguercio, Humberto; Davis, Sergio
2017-06-01
We present a simple and direct method for non-parametric estimation of a one-dimensional probability density, based on the application of the recent conjugate variables theorem. The method expands the logarithm of the probability density ln P(x|I) in terms of a complete basis and numerically solves for the coefficients of the expansion using a linear system of equations. No Monte Carlo sampling is needed. We present preliminary results that show the practical usefulness of the method for modeling statistical data.
Bayesian error estimation in density-functional theory
DEFF Research Database (Denmark)
Mortensen, Jens Jørgen; Kaasbjerg, Kristen; Frederiksen, Søren Lund
2005-01-01
We present a practical scheme for performing error estimates for density-functional theory calculations. The approach, which is based on ideas from Bayesian statistics, involves creating an ensemble of exchange-correlation functionals by comparing with an experimental database of binding energies for molecules and solids. Fluctuations within the ensemble can then be used to estimate errors relative to experiment on calculated quantities such as binding energies, bond lengths, and vibrational frequencies. It is demonstrated that the error bars on energy differences may vary by orders of magnitude...
Photo-z Estimation: An Example of Nonparametric Conditional Density Estimation under Selection Bias
Izbicki, Rafael; Freeman, Peter E
2016-01-01
Redshift is a key quantity for inferring cosmological model parameters. In photometric redshift estimation, cosmologists use the coarse data collected from the vast majority of galaxies to predict the redshift of individual galaxies. To properly quantify the uncertainty in the predictions, however, one needs to go beyond standard regression and instead estimate the full conditional density f(z|x) of a galaxy's redshift z given its photometric covariates x. The problem is further complicated by selection bias: usually only the rarest and brightest galaxies have known redshifts, and these galaxies have characteristics and measured covariates that do not necessarily match those of more numerous and dimmer galaxies of unknown redshift. Unfortunately, there is not much research on how to best estimate complex multivariate densities in such settings. Here we describe a general framework for properly constructing and assessing nonparametric conditional density estimators under selection bias, and for combining two o...
Large Scale Density Estimation of Blue and Fin Whales (LSD)
2014-09-30
interactions with human activity requires knowledge of how many animals are present in an area during a specific time period. Many marine mammal species ...Ocean at Wake Island will then be applied to the same species in the Indian Ocean at the CTBTO location at Diego Garcia. 1. Develop and implement...proposed density estimation method is also highly dependent on call rate inputs, which are used in the development of species specific multipliers for
Some Bayesian statistical techniques useful in estimating frequency and density
Johnson, D.H.
1977-01-01
This paper presents some elementary applications of Bayesian statistics to problems faced by wildlife biologists. Bayesian confidence limits for frequency of occurrence are shown to be generally superior to classical confidence limits. Population density can be estimated from frequency data if the species is sparsely distributed relative to the size of the sample plot. For other situations, limits are developed based on the normal distribution and prior knowledge that the density is non-negative, which ensures that the lower confidence limit is non-negative. Conditions are described under which Bayesian confidence limits are superior to those calculated with classical methods; examples are also given on how prior knowledge of the density can be used to sharpen inferences drawn from a new sample.
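The frequency-of-occurrence case mentioned above has a particularly compact Bayesian form: with a uniform Beta(1, 1) prior, observing k occupied plots out of n yields a Beta(k+1, n-k+1) posterior, whose quantiles give credible limits. The counts below are illustrative, not the paper's examples.

```python
from scipy.stats import beta

n, k = 50, 12                        # plots surveyed, plots where the species occurred
posterior = beta(k + 1, n - k + 1)   # Beta posterior under a uniform prior

lower, upper = posterior.ppf([0.025, 0.975])
print(f"frequency {k/n:.2f}, 95% credible interval ({lower:.3f}, {upper:.3f})")
```

Unlike a classical Wald interval, these limits are guaranteed to stay inside [0, 1], which is one reason the paper finds the Bayesian limits generally superior.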
Covariance and correlation estimation in electron-density maps.
Altomare, Angela; Cuocci, Corrado; Giacovazzo, Carmelo; Moliterni, Anna; Rizzi, Rosanna
2012-03-01
Quite recently two papers have been published [Giacovazzo & Mazzone (2011). Acta Cryst. A67, 210-218; Giacovazzo et al. (2011). Acta Cryst. A67, 368-382] which calculate the variance in any point of an electron-density map at any stage of the phasing process. The main aim of the papers was to associate a standard deviation to each pixel of the map, in order to obtain a better estimate of the map reliability. This paper deals with the covariance estimate between points of an electron-density map in any space group, centrosymmetric or non-centrosymmetric, no matter the correlation between the model and target structures. The aim is as follows: to verify if the electron density in one point of the map is amplified or depressed as an effect of the electron density in one or more other points of the map. High values of the covariances are usually connected with undesired features of the map. The phases are the primitive random variables of our probabilistic model; the covariance changes with the quality of the model and therefore with the quality of the phases. The conclusive formulas show that the covariance is also influenced by the Patterson map. Uncertainty on measurements may influence the covariance, particularly in the final stages of the structure refinement; a general formula is obtained taking into account both phase and measurement uncertainty, valid at any stage of the crystal structure solution.
Accurate photometric redshift probability density estimation - method comparison and application
Rau, Markus Michael; Brimioulle, Fabrice; Frank, Eibe; Friedrich, Oliver; Gruen, Daniel; Hoyle, Ben
2015-01-01
We introduce an ordinal classification algorithm for photometric redshift estimation, which vastly improves the reconstruction of photometric redshift probability density functions (PDFs) for individual galaxies and galaxy samples. As a use case we apply our method to CFHTLS galaxies. The ordinal classification algorithm treats distinct redshift bins as ordered values, which improves the quality of photometric redshift PDFs, compared with non-ordinal classification architectures. We also propose a new single-value point estimate of the galaxy redshift that can be used to estimate the full redshift PDF of a galaxy sample. This method is competitive in terms of accuracy with contemporary algorithms, which stack the full redshift PDFs of all galaxies in the sample, but requires orders of magnitude less storage space. The methods described in this paper greatly improve the log-likelihood of individual object redshift PDFs, when compared with a popular Neural Network code (ANNz). In our use case, this improvemen...
Lev, Maria; Yehezkel, Oren; Polat, Uri
2014-02-12
Visual crowding, a form of contextual modulation, reduces the ability to recognize objects in clutter and sets a fundamental limit on visual perception and object recognition. Crowding is generally considered not to exist in the fovea, and extensive efforts exploring crowding in the periphery have yielded various models that consider several aspects of spatial processing. Studies have shown that spatial and temporal crowding are correlated, suggesting a tradeoff between spatial and temporal processing in crowding. We hypothesized that limiting stimulus availability should decrease object recognition in clutter. Here we show, for the first time, that robust contour interactions exist in the fovea for much larger target-flanker spacing than reported previously: participants overcome crowded conditions for long presentation times but exhibit contour interaction effects for short presentation times. Thus, by enabling enough processing time in the fovea, contour interactions can be overcome, enabling object recognition. Our results suggest that contemporary models of context modulation should include both temporal and spatial processing.
Perceived positions determine crowding.
Maus, Gerrit W; Fischer, Jason; Whitney, David
2011-01-01
Crowding is a fundamental bottleneck in object recognition. In crowding, an object in the periphery becomes unrecognizable when surrounded by clutter or distractor objects. Crowding depends on the positions of target and distractors, both their eccentricity and their relative spacing. In all previous studies, position has been expressed in terms of retinal position. However, in a number of situations retinal and perceived positions can be dissociated. Does retinal or perceived position determine the magnitude of crowding? Here observers performed an orientation judgment on a target Gabor patch surrounded by distractors that drifted toward or away from the target, causing an illusory motion-induced position shift. Distractors in identical physical positions led to worse performance when they drifted towards the target (appearing closer) versus away from the target (appearing further). This difference in crowding corresponded to the difference in perceived positions. Further, the perceptual mislocalization was necessary for the change in crowding, and both the mislocalization and crowding scaled with drift speed. The results show that crowding occurs after perceived positions have been assigned by the visual system. Crowding does not operate in a purely retinal coordinate system; perceived positions need to be taken into account.
A projection and density estimation method for knowledge discovery.
Stanski, Adam; Hellwich, Olaf
2012-01-01
A key ingredient of modern data analysis is probability density estimation. However, it is well known that the curse of dimensionality prevents a proper estimation of densities in high dimensions. The problem is typically circumvented by using a fixed set of assumptions about the data, e.g., by assuming partial independence of features, data on a manifold or a customized kernel. These fixed assumptions limit the applicability of a method. In this paper we propose a framework that uses a flexible set of assumptions instead. It allows a model to be tailored to various problems by means of 1d-decompositions. The approach achieves a fast runtime and is not limited by the curse of dimensionality as all estimations are performed in 1d-space. The wide range of applications is demonstrated on two very different real-world examples. The first is data mining software that allows the fully automatic discovery of patterns. The software is publicly available for evaluation. As a second example an image segmentation method is realized. It achieves state-of-the-art performance on a benchmark dataset although it uses only a fraction of the training data and very simple features.
Technology for Simulating Crowd Evacuation Behaviors
Institute of Scientific and Technical Information of China (English)
Wen-Hu Qin; Guo-Hui Su; Xiao-Na Li
2009-01-01
This paper presents a model for simulating crowd evacuation and investigates three widely recognized problems. For the space continuity problem, this paper presents two computation algorithms: one uses grid space to evaluate the coordinates of an obstacle's bounding box and the other employs geometric rules to establish individual evacuation routes. For the problem of collision avoidance and overlap among individuals, this paper computes the generalized force and friction force and then modifies the direction of movement to obtain a speed model based on crowd density and real-time speed. For the exit selection problem, this paper establishes a method of selecting exits by combining each exit's crowd state with the individuals. Finally, a particle system is used to simulate the behavior of crowd evacuation and produces useful test results.
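The density-dependent speed model is not given explicitly in the abstract; one common choice is a linear fundamental diagram, sketched here with an assumed jam density of 5 persons/m² (both the functional form and the constant are illustrative, not from the paper):

```python
def density_adjusted_speed(v_desired, local_density, rho_max=5.0):
    """Speed falls linearly with local crowd density (a linear
    'fundamental diagram'), reaching zero at the jam density rho_max
    [persons/m^2]. v_desired is the agent's free-walking speed [m/s]."""
    return max(0.0, v_desired * (1.0 - local_density / rho_max))
```

An agent walking at 1.5 m/s in open space would slow to half speed at 2.5 persons/m² and stop entirely at the jam density under this model.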
Effect of Random Clustering on Surface Damage Density Estimates
Energy Technology Data Exchange (ETDEWEB)
Matthews, M J; Feit, M D
2007-10-29
Identification and spatial registration of laser-induced damage relative to incident fluence profiles is often required to characterize the damage properties of laser optics near damage threshold. Of particular interest in inertial confinement laser systems are large aperture beam damage tests (>1 cm²) where the number of initiated damage sites for φ > 14 J/cm² can approach 10^5 to 10^6, requiring automatic microscopy counting to locate and register individual damage sites. However, as was shown for the case of bacteria counting in biology decades ago, random overlapping or 'clumping' prevents accurate counting of Poisson-distributed objects at high densities, and must be accounted for if the underlying statistics are to be understood. In this work we analyze the effect of random clumping on damage initiation density estimates at fluences above damage threshold. The parameter ψ = aρ = ρ/ρ_0, where a = 1/ρ_0 is the mean damage site area and ρ is the mean number density, is used to characterize the onset of clumping, and approximations based on a simple model are used to derive an expression for clumped damage density vs. fluence and damage site size. The influence of the uncorrected ρ vs. φ curve on damage initiation probability predictions is also discussed.
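The clumping effect itself is easy to reproduce with a small Monte Carlo experiment (an illustrative sketch, not the authors' analytical model): scatter Poisson-distributed damage sites and count connected clumps instead of individual sites; the clump count falls below the true count as ψ grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def apparent_clump_density(true_density, site_radius, box=100.0):
    """Scatter a Poisson number of damage sites uniformly in a box and
    count 'clumps': groups of sites whose circles of radius site_radius
    overlap. Returns (true site count, clump count); clumps undercount
    the truth at high density, mimicking automated microscopy counting."""
    n = rng.poisson(true_density * box * box)
    pts = rng.uniform(0, box, size=(n, 2))
    parent = list(range(n))          # union-find over overlapping pairs
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if np.hypot(*(pts[i] - pts[j])) < 2 * site_radius:
                parent[find(i)] = find(j)
    clumps = len({find(i) for i in range(n)})
    return n, clumps
```

With a density of 0.05 sites per unit area and unit site radius (ψ of order 0.1), a noticeable fraction of sites merge into clumps.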
A Concept of Approximated Densities for Efficient Nonlinear Estimation
Directory of Open Access Journals (Sweden)
Virginie F. Ruiz
2002-10-01
Full Text Available This paper presents the theoretical development of a nonlinear adaptive filter based on a concept of filtering by approximated densities (FAD). The most common procedures for nonlinear estimation apply the extended Kalman filter. As opposed to conventional techniques, the proposed recursive algorithm does not require any linearisation. The prediction uses a maximum entropy principle subject to constraints. Thus, the densities created are of an exponential type and depend on a finite number of parameters. The filtering yields recursive equations involving these parameters. The update applies the Bayes theorem. Through simulation on a generic exponential model, the proposed nonlinear filter is implemented and the results prove to be superior to those of the extended Kalman filter and a class of nonlinear filters based on partitioning algorithms.
Estimation of probability densities using scale-free field theories.
Kinney, Justin B
2014-07-01
The question of how best to estimate a continuous probability density from finite data is an intriguing open problem at the interface of statistics and physics. Previous work has argued that this problem can be addressed in a natural way using methods from statistical field theory. Here I describe results that allow this field-theoretic approach to be rapidly and deterministically computed in low dimensions, making it practical for use in day-to-day data analysis. Importantly, this approach does not impose a privileged length scale for smoothness of the inferred probability density, but rather learns a natural length scale from the data due to the tradeoff between goodness of fit and an Occam factor. Open source software implementing this method in one and two dimensions is provided.
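The field-theory machinery itself is beyond a short example, but the core idea of letting the data pick the smoothing length scale, rather than fixing it in advance, can be illustrated with leave-one-out likelihood for an ordinary Gaussian kernel density estimator (an analogy, not the paper's method):

```python
import numpy as np

def loo_log_likelihood(x, h):
    """Leave-one-out log-likelihood of a 1-d Gaussian KDE with bandwidth h:
    each sample is scored against a density built from all other samples."""
    n = len(x)
    d = x[:, None] - x[None, :]
    k = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(k, 0.0)          # leave each point out of its own estimate
    f_loo = k.sum(axis=1) / (n - 1)
    return np.log(f_loo).sum()

rng = np.random.default_rng(1)
x = rng.normal(size=200)
hs = np.geomspace(0.05, 2.0, 40)      # candidate length scales
h_star = hs[np.argmax([loo_log_likelihood(x, h) for h in hs])]
```

Too small a bandwidth overfits (poor held-out likelihood), too large underfits; the maximum plays the role of the goodness-of-fit/Occam trade-off described in the abstract.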
Monaghan, Alison A
2017-12-01
Over significant areas of the UK and western Europe, anthropogenic alteration of the subsurface by mining of coal has occurred beneath highly populated areas which are now considering a multiplicity of 'low carbon' unconventional energy resources including shale gas and oil, coal bed methane, geothermal energy and energy storage. To enable decision making on the 3D planning, licensing and extraction of these resources requires reduced uncertainty around complex geology and hydrogeological and geomechanical processes. An exemplar from the Carboniferous of central Scotland, UK, illustrates how, in areas lacking hydrocarbon well production data and 3D seismic surveys, legacy coal mine plans and associated boreholes provide valuable data that can be used to reduce the uncertainty around geometry and faulting of subsurface energy resources. However, legacy coal mines also limit unconventional resource volumes since mines and associated shafts alter the stress and hydrogeochemical state of the subsurface, commonly forming pathways to the surface. To reduce the risk of subsurface connections between energy resources, an example of an adapted methodology is described for shale gas/oil resource estimation to include a vertical separation or 'stand-off' zone between the deepest mine workings, to ensure the hydraulic fracturing required for shale resource production would not intersect legacy coal mines. Whilst the size of such separation zones requires further work, developing the concept of 3D spatial separation and planning is key to utilising the crowded subsurface energy system, whilst mitigating against resource sterilisation and environmental impacts, and could play a role in positively informing public and policy debate. Copyright © 2017 British Geological Survey, a component institute of NERC. Published by Elsevier B.V. All rights reserved.
Trotter, Robert J.
1974-01-01
This article considers the effects of human crowding in light of recent tests and observations. Factors such as sex, age, culture, socio-economic standing, frustration, and interpersonal physical distance are examined. Results indicate that crowding contributes to social problems and crime. (TK)
DEFF Research Database (Denmark)
Thelle, Mikkel
2016-01-01
This article seeks to address the relation between crowds and public space as a question of appropriation. With the new liberal constitutions in Europe, several phenomena of crowding emerge in major cities, of which Copenhagen is taken as an example. By focusing on the crowd as an agglomeration of bod...
A Hierarchical Bayesian Model for Crowd Emotions
Urizar, Oscar J.; Baig, Mirza S.; Barakova, Emilia I.; Regazzoni, Carlo S.; Marcenaro, Lucio; Rauterberg, Matthias
2016-01-01
Estimation of emotions is an essential aspect in developing intelligent systems intended for crowded environments. However, emotion estimation in crowds remains a challenging problem due to the complexity with which human emotions are manifested and the capability of a system to perceive them in such conditions. This paper proposes a hierarchical Bayesian model to learn, in an unsupervised manner, the behavior of individuals and of the crowd as a single entity, and explores the relation between behavior and emotions to infer emotional states. Information about the motion patterns of individuals is described using a self-organizing map, and a hierarchical Bayesian network builds probabilistic models to identify behaviors and infer the emotional state of individuals and the crowd. This model is trained and tested using data produced from simulated scenarios that resemble real-life environments. The conducted experiments tested the efficiency of our method to learn, detect and associate behaviors with emotional states, yielding accuracy levels of 74% for individuals and 81% for the crowd, similar in performance to existing methods for pedestrian behavior detection but with novel concepts regarding the analysis of crowds. PMID:27458366
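The motion-pattern stage can be illustrated with a tiny self-organizing map. The sketch below is a generic 1-d SOM on feature vectors, with illustrative map size and learning schedules, not the paper's configuration:

```python
import numpy as np

def train_som(data, n_units=16, epochs=50, lr0=0.5, sigma0=4.0, seed=0):
    """Train a small 1-d self-organizing map on feature vectors (e.g.
    per-individual motion descriptors). Returns the learned prototypes.
    Each sample pulls its best-matching unit, and that unit's map
    neighbours, toward itself with a decaying rate and neighbourhood."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(n_units, data.shape[1]))
    idx = np.arange(n_units)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 1e-3
        for x in rng.permutation(data):
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))   # best-matching unit
            nbh = np.exp(-((idx - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * nbh[:, None] * (x - w)
    return w
```

After training, the prototypes quantize the motion-descriptor space, and each individual's trajectory can be summarized by its sequence of best-matching units.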
DEFF Research Database (Denmark)
Bondo Hansen, Kristian
This dissertation undertakes an explorative historical analysis of problems associated with crowd phenomena in the U.S. financial markets between 1890 and 1940. While a study of crowd-related problems in the financial markets invariably involves examinations of panics and crises, the dissertation shows that crowds were not exclusively seen as crisis phenomena, but were considered by many financial writers to be of much broader significance to the organisation and functioning of markets. The dissertation claims that it is necessary to explore the close connections between financial markets... The dissertation offers a broad, yet rigorously focused, historical perspective on crowd phenomena in financial markets. Furthermore, it explores how ideas about crowd action, imitation, herding and contagion were introduced to and became integral parts of the discourses on financial markets. Reiterations...
Change-in-ratio density estimator for feral pigs is less biased than closed mark-recapture estimates
Hanson, L.B.; Grand, J.B.; Mitchell, M.S.; Jolley, D.B.; Sparklin, B.D.; Ditchkoff, S.S.
2008-01-01
Closed-population capture-mark-recapture (CMR) methods can produce biased density estimates for species with low or heterogeneous detection probabilities. In an attempt to address such biases, we developed a density-estimation method based on the change in ratio (CIR) of survival between two populations where survival, calculated using an open-population CMR model, is known to differ. We used our method to estimate density for a feral pig (Sus scrofa) population on Fort Benning, Georgia, USA. To assess its validity, we compared it to an estimate of the minimum density of pigs known to be alive and two estimates based on closed-population CMR models. Comparison of the density estimates revealed that the CIR estimator produced a density estimate with low precision that was reasonable with respect to minimum known density. By contrast, density point estimates using the closed-population CMR models were less than the minimum known density, consistent with biases created by low and heterogeneous capture probabilities for species like feral pigs that may occur in low density or are difficult to capture. Our CIR density estimator may be useful for tracking broad-scale, long-term changes in species, such as large cats, for which closed CMR models are unlikely to work. © CSIRO 2008.
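The paper's estimator is survival-based, but the underlying change-in-ratio idea is easiest to see in its classic two-sample form. The Paulik-Robson-style formula below is the textbook version, shown for context only, not the authors' CIR variant:

```python
def change_in_ratio(p1, p2, removed_x, removed_total):
    """Classic two-sample change-in-ratio abundance estimator.
    p1, p2: proportions of x-type animals observed before and after a
    known removal of removed_x x-type animals out of removed_total.
    Returns the estimated pre-removal population size."""
    if p1 == p2:
        raise ValueError("proportions must change for CIR to be identifiable")
    return (removed_x - removed_total * p2) / (p1 - p2)
```

For example, a population of 1000 with 40% x-type animals, from which 250 x-type and 50 y-type animals are removed, leaves a proportion of 150/700 x-type; plugging these proportions and removals back in recovers the original 1000.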
Directory of Open Access Journals (Sweden)
D.O. Smallwood
1996-01-01
Full Text Available It is shown that the usual method for estimating the coherence functions (ordinary, partial, and multiple) for a general multiple-input/multiple-output problem can be expressed as a modified form of Cholesky decomposition of the cross-spectral density matrix of the input and output records. The results can be equivalently obtained using singular value decomposition (SVD) of the cross-spectral density matrix. Using SVD suggests a new form of fractional coherence. The formulation as an SVD problem also suggests a way to order the inputs when a natural physical order of the inputs is absent.
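Ordinary coherence can be computed directly from an averaged cross-spectral density matrix; a minimal numpy sketch follows (segment averaging only, with no windowing or overlap, and without the paper's fractional-coherence construction):

```python
import numpy as np

def csd_matrix(records, nseg=64):
    """Average cross-spectral density matrix over FFT segments.
    records: (n_channels, n_samples) array. Returns G[freq, i, j]."""
    n_ch, n = records.shape
    segs = records[:, : (n // nseg) * nseg].reshape(n_ch, -1, nseg)
    X = np.fft.rfft(segs, axis=-1)               # (channel, segment, freq)
    return np.einsum('isf,jsf->fij', X, X.conj()) / segs.shape[1]

def ordinary_coherence(G, i, j):
    """gamma^2_ij(f) = |G_ij(f)|^2 / (G_ii(f) * G_jj(f))."""
    return np.abs(G[:, i, j]) ** 2 / (G[:, i, i].real * G[:, j, j].real)
```

Two identical channels give coherence 1 at every frequency; independent noise channels give values well below 1 once enough segments are averaged.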
A bivalent scale for measuring crowding among deer hunters
Gigliotti, Larry M.; Chase, Loren
2014-01-01
One factor that may influence satisfaction in outdoor recreation is crowding, which historically has been defined as a negative evaluation of the density of other participants. While this definition is suitable for most scenarios, there are circumstances where encounters with others in the area are evaluated positively and thus contribute to the satisfaction of the participant. To adequately describe this phenomenon we suggest a more inclusive measurement of crowding that allows for both positive and negative evaluations of participant density to more accurately explore the relationship between crowding and satisfaction. We identified a sub-group of deer hunters who negatively evaluated the low density of other hunters, which reduced their satisfaction with their overall hunting experience. The methodology for measuring crowding in recreation research may have an important effect in identifying the relationship crowding has with other relevant variables as well as management implications.
Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging
Energy Technology Data Exchange (ETDEWEB)
Clark, G A
2004-09-21
The work reported here shows the proof of principle (using a small data set) for a suite of algorithms designed to estimate the probability density function of hyperspectral background data and compute the appropriate Constant False Alarm Rate (CFAR) matched filter decision threshold for a chemical plume detector. Future work will provide a thorough demonstration of the algorithms and their performance with a large data set. The LASI (Large Aperture Search Initiative) Project involves instrumentation and image processing for hyperspectral images of chemical plumes in the atmosphere. The work reported here involves research and development on algorithms for reducing the false alarm rate in chemical plume detection and identification algorithms operating on hyperspectral image cubes. The chemical plume detection algorithms to date have used matched filters designed using generalized maximum likelihood ratio hypothesis testing algorithms [1, 2, 5, 6, 7, 12, 10, 11, 13]. One of the key challenges in hyperspectral imaging research is the high false alarm rate that often results from the plume detector [1, 2]. The overall goal of this work is to extend the classical matched filter detector to apply Constant False Alarm Rate (CFAR) methods to reduce the false alarm rate, or probability of false alarm P_FA, of the matched filter [4, 8, 9, 12]. A detector designer is interested in minimizing the probability of false alarm while simultaneously maximizing the probability of detection P_D. This is summarized by the Receiver Operating Characteristic (ROC) curve [10, 11], which is actually a family of curves depicting P_D vs. P_FA, parameterized by varying levels of signal-to-noise (or clutter) ratio (SNR or SCR). Often, it is advantageous to be able to specify a desired P_FA and develop a ROC curve (P_D vs. decision threshold r_0) for that case. That is the purpose of this work. Specifically, this work develops a set of algorithms and MATLAB...
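A hedged sketch of the CFAR thresholding step: once the density of background matched-filter outputs has been estimated, the decision threshold r_0 for a desired P_FA is its upper quantile. Here both the Gaussian background statistics and the 10^-3 false-alarm rate are stand-ins, not values from the report:

```python
import numpy as np

def cfar_threshold(background_scores, pfa):
    """Decision threshold r0 achieving false-alarm probability pfa,
    taken as the (1 - pfa) empirical quantile of background
    matched-filter outputs (a nonparametric density-based estimate)."""
    return np.quantile(background_scores, 1.0 - pfa)

rng = np.random.default_rng(2)
scores = rng.normal(size=100_000)      # stand-in background statistics
r0 = cfar_threshold(scores, pfa=1e-3)  # for N(0,1) this lands near 3.09
```

Sweeping pfa and recording the corresponding detection rate of plume pixels would trace out the P_D vs. r_0 curve described in the abstract.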
2015-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Large Scale Density Estimation of Blue and Fin Whales... Utilizing Sparse Array Data to Develop and Implement a New Method for Estimating Blue and Fin Whale Density. Len Thomas & Danielle Harris, Centre... to develop and implement a new method for estimating blue and fin whale density that is effective over large spatial scales and is designed to cope...
Some asymptotic results on density estimators by wavelet projections
Varron, Davit
2012-01-01
Let $(X_i)_{i\geq 1}$ be an i.i.d. sample on $\mathbb{R}^d$ having density $f$. Given a real function $\phi$ on $\mathbb{R}^d$ with finite variation and given an integer-valued sequence $(j_n)$, let $\hat{f}_n$ denote the estimator of $f$ by wavelet projection based on $\phi$ and with multiresolution level equal to $j_n$. We provide exact rates of almost sure convergence to 0 of the quantity $\sup_{x\in H} |\hat{f}_n(x)-\mathbb{E}(\hat{f}_n)(x)|$, when $n2^{-dj_n}/\log n \rightarrow \infty$ and $H$ is a given hypercube of $\mathbb{R}^d$. We then show that, if $n2^{-dj_n}/\log n \rightarrow c$ for a constant $c>0$, then the quantity $\sup_{x\in H} |\hat{f}_n(x)-f(x)|$ almost surely fails to converge to 0.
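For the Haar scaling function the wavelet-projection estimator reduces to a histogram on $2^j$ dyadic cells, which gives a compact illustration (assuming, for simplicity, one dimension and data supported on [0, 1)):

```python
import numpy as np

def haar_projection_density(x, j, grid):
    """Wavelet-projection density estimate with the Haar scaling function
    phi = 1_[0,1): at resolution level j this reduces to a histogram with
    2^j equal cells, f_n(t) = 2^j * (fraction of samples in t's cell).
    Assumes samples x and evaluation points grid lie in [0, 1)."""
    nbins = 2 ** j
    counts = np.bincount(np.floor(x * nbins).astype(int), minlength=nbins)
    return nbins * counts[np.floor(grid * nbins).astype(int)] / len(x)
```

The abstract's condition $n2^{-dj_n}/\log n \rightarrow \infty$ corresponds here to the expected count per cell growing faster than $\log n$, which is what keeps the sup-norm fluctuation around the mean under control.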
An Adaptive Background Subtraction Method Based on Kernel Density Estimation
Directory of Open Access Journals (Sweden)
Mignon Park
2012-09-01
Full Text Available In this paper, a pixel-based background modeling method, which uses nonparametric kernel density estimation, is proposed. To reduce the burden of image storage, we modify the original KDE method by using the first frame to initialize it and update it subsequently at every frame by controlling the learning rate according to the situations. We apply an adaptive threshold method based on image changes to effectively subtract the dynamic backgrounds. The devised scheme allows the proposed method to automatically adapt to various environments and effectively extract the foreground. The method presented here exhibits good performance and is suitable for dynamic background environments. The algorithm is tested on various video sequences and compared with other state-of-the-art background subtraction methods so as to verify its performance.
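A minimal per-pixel KDE background model along these lines might look as follows. This is a sketch of the general approach with a plain FIFO sample buffer and a fixed bandwidth and threshold, not the paper's adaptive learning-rate or adaptive-threshold scheme:

```python
import numpy as np

class KDEBackground:
    """Per-pixel background model: keep the last n grayscale values per
    pixel and score new frames with a Gaussian kernel density estimate.
    Pixels whose background likelihood is low are flagged as foreground."""
    def __init__(self, first_frame, n=20, bandwidth=10.0):
        # initialize the sample buffer from the first frame, as in the paper
        self.samples = np.repeat(first_frame[None].astype(float), n, axis=0)
        self.h = bandwidth
        self.i = 0
    def foreground_mask(self, frame, threshold=1e-3):
        d = frame[None].astype(float) - self.samples
        p = np.exp(-0.5 * (d / self.h) ** 2).mean(axis=0) \
            / (self.h * np.sqrt(2 * np.pi))
        return p < threshold        # low background likelihood => foreground
    def update(self, frame):
        self.samples[self.i % len(self.samples)] = frame   # FIFO replacement
        self.i += 1
```

Calling `foreground_mask` then `update` on each incoming frame lets the buffer slowly absorb gradual background changes while transient objects stand out.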
Effects of Crowding in Prisons.
Paulus, Paul B.; And Others
Research on crowding in prisons is reviewed. Studies have shown that crowding in prisons can increase blood pressure, palmar sweat, illness complaints, and aggression. The number of people per housing unit appears to be more important than space per person. Tolerance for crowding was found to decrease with the experience of crowding. Research on…
Cortical cell and neuron density estimates in one chimpanzee hemisphere.
Collins, Christine E; Turner, Emily C; Sawyer, Eva Kille; Reed, Jamie L; Young, Nicole A; Flaherty, David K; Kaas, Jon H
2016-01-19
The density of cells and neurons in the neocortex of many mammals varies across cortical areas and regions. This variability is, perhaps, most pronounced in primates. Nonuniformity in the composition of cortex suggests regions of the cortex have different specializations. Specifically, regions with densely packed neurons contain smaller neurons that are activated by relatively few inputs, thereby preserving information, whereas regions that are less densely packed have larger neurons that have more integrative functions. Here we present the numbers of cells and neurons for 742 discrete locations across the neocortex in a chimpanzee. Using isotropic fractionation and flow fractionation methods for cell and neuron counts, we estimate that neocortex of one hemisphere contains 9.5 billion cells and 3.7 billion neurons. Primary visual cortex occupies 35 cm² of surface, 10% of the total, and contains 737 million densely packed neurons, 20% of the total neurons contained within the hemisphere. Other areas of high neuron packing include secondary visual areas, somatosensory cortex, and prefrontal granular cortex. Areas of low levels of neuron packing density include motor and premotor cortex. These values reflect those obtained from more limited samples of cortex in humans and other primates.
2014-08-12
movements and behaviors of "crowds" of people. This interest spans several scientific... movement in dense, heterogeneous crowds. The monograph is organized into different parts that consolidate... project. We developed solutions to a number of important and challenging problems related to visual analysis of crowds and modeling of...
US Fish and Wildlife Service, Department of the Interior — This Crowd Control Plan for Swan Lake National Wildlife Refuge outlines operational procedures in the event of a civil disorder on the Refuge. An inventory of...
DEFF Research Database (Denmark)
Mortensen, Michael Lind; Wallace, Byron C.; Kraska, Tim
Multi-criteria filtering of mixed open/closed-world data is a time-consuming task, requiring significant manual effort when latent open-world attributes are present. In this work we introduce a novel open-world filtering framework, CrowdFilter, enabling automatic UI generation and label elicitation for complex multi-criteria search problems through crowdsourcing. The CrowdFilter system is capable of supporting both criteria-level labels and n-gram rationales, capturing the human decision making process behind each filtering choice. Using the data provided through CrowdFilter we also introduce a novel multi-criteria active learning method, capable of incorporating labels and n-gram rationales per inclusion criteria, and thus capable of determining both clear includes/excludes, as well as complex borderline cases. By incorporating the active learning approach into the elicitation process of Crowd...
DEFF Research Database (Denmark)
Borch, Christian
The November Revolution in 1918 made manifest and further unleashed a political crisis in Germany, the consequences of which have been thoroughly examined. What has attracted less attention is how the Revolution also triggered a semantic crisis within sociology, namely with regard to conceptions of crowds and their alleged revolutionary aspirations. Interestingly, the sociological interest in crowds took off in the late nineteenth century as a reflection on modern political (dis)order, with the French Revolution and in particular the Paris Commune serving as key points of reference. This early semantics of crowds associated collective behaviour with irrationality, contagion and hypnotic suggestibility. Precisely this semantic repertoire was called into question after the November revolution: Weimar sociologists, with Theodor Geiger in a lead role, argued for an alternative conception of crowds...
Modelling asymmetric growth in crowded plant communities
DEFF Research Database (Denmark)
Damgaard, Christian
2010-01-01
A class of models that may be used to quantify the effect of size-asymmetric competition in crowded plant communities by estimating a community-specific degree of size-asymmetric growth for each species in the community is suggested. The model consists of two parts: an individual size-asymmetric...
2014-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Cheap DECAF: Density Estimation for Cetaceans from... cetaceans using passive fixed acoustics rely on large, dense arrays of cabled hydrophones and/or auxiliary information from animal tagging projects... estimating cetacean density. Therefore, the goal of Cheap DECAF is to focus on the development of cetacean density estimation methods using sensors that...
Institute of Scientific and Technical Information of China (English)
王志鹏; 李子奈
2004-01-01
By reconstructing the absolute crowding-in or crowding-out relationship between FDI and domestic investment, we correct the flaws of the existing relative crowding-in/crowding-out model. Using a larger sample of panel data and comparing the different estimates from the "absolute" and "relative" models, we find no significant crowding-in or crowding-out effects in China at the country level. However, further study reveals that the crowding effects show a clear regional pattern: crowding-out effects dominate in eastern China, crowding-in effects lead in central China, and crowding-out effects are insignificant in west China.
Modelling crowd-bridge dynamic interaction with a discretely defined crowd
Carroll, S. P.; Owen, J. S.; Hussein, M. F. M.
2012-05-01
This paper presents a novel method of modelling crowd-bridge interaction using discrete element theory (DET) to model the pedestrian crowd. DET, also known as agent-based modelling, is commonly used in the simulation of pedestrian movement, particularly in cases where building evacuation is critical or potentially problematic. Pedestrians are modelled as individual elements subject to global behavioural rules. In this paper a discrete element crowd model is coupled with a dynamic bridge model in a time-stepping framework. Feedback takes place between both models at each time-step. An additional pedestrian stimulus is introduced that is a function of bridge lateral dynamic behaviour. The pedestrians' relationship with the vibrating bridge as well as the pedestrians around them is thus simulated. The lateral dynamic behaviour of the bridge is modelled as a damped single degree of freedom (SDoF) oscillator. The excitation and mass enhancement of the dynamic system is determined as the sum of individual pedestrian contributions at each time-step. Previous crowd-structure interaction modelling has utilised a continuous hydrodynamic crowd model. Limitations inherent in this modelling approach are identified and results presented that demonstrate the ability of DET to address these limitations. Simulation results demonstrate the model's ability to consider low density traffic flows and inter-subject variability. The emergence of the crowd's velocity-density relationship is also discussed.
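The bridge side of the coupling, a damped SDoF oscillator driven by the summed pedestrian forces at each time-step, can be sketched as follows (the modal mass, natural frequency and damping ratio are illustrative values, not taken from the paper, and the pedestrian model is reduced to a given force series):

```python
import numpy as np

def simulate_bridge(ped_forces, m=2e5, f_n=1.0, zeta=0.005, dt=0.01):
    """Time-step the lateral displacement of a damped SDoF bridge mode
    driven by the total pedestrian force at each step (semi-implicit
    Euler). ped_forces: total lateral force per time step [N].
    m: modal mass [kg], f_n: natural frequency [Hz], zeta: damping ratio."""
    w = 2 * np.pi * f_n
    x, v = 0.0, 0.0
    out = np.empty(len(ped_forces))
    for i, F in enumerate(ped_forces):
        a = (F - 2 * zeta * w * m * v - w ** 2 * m * x) / m
        v += a * dt
        x += v * dt
        out[i] = x
    return out
```

In the full coupled scheme, each step's displacement and velocity would be fed back into the discrete pedestrian model to modify individual forcing, closing the interaction loop described in the abstract.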
Crowding by invisible flankers.
Directory of Open Access Journals (Sweden)
Cristy Ho
Full Text Available BACKGROUND: Human object recognition degrades sharply as the target object moves from central vision into peripheral vision. In particular, one's ability to recognize a peripheral target is severely impaired by the presence of flanking objects, a phenomenon known as visual crowding. Recent studies on how visual awareness of flanker existence influences crowding had shown mixed results. More importantly, it is not known whether conscious awareness of the existence of both the target and flankers are necessary for crowding to occur. METHODOLOGY/PRINCIPAL FINDINGS: Here we show that crowding persists even when people are completely unaware of the flankers, which are rendered invisible through the continuous flash suppression technique. Contrast threshold for identifying the orientation of a grating pattern was elevated in the flanked condition, even when the subjects reported that they were unaware of the perceptually suppressed flankers. Moreover, we find that orientation-specific adaptation is attenuated by flankers even when both the target and flankers are invisible. CONCLUSIONS: These findings complement the suggested correlation between crowding and visual awareness. What's more, our results demonstrate that conscious awareness and attention are not prerequisite for crowding.
Strategies in crowd and crowd structure
Gawronski, P; Krawczyk, M J; Malinowski, J; Kupczak, A; Sikora, W; Kulakowski, K; Was, J; Kantelhardt, J
2012-01-01
In an emergency situation, imitation of strategies of neighbours can lead to an order-disorder phase transition, where spatial clusters of pedestrians adopt the same strategy. We assume that there are two strategies, cooperating and competitive, which correspond to a smaller or larger desired velocity. The results of our simulations within the Social Force Model indicate that the ordered phase can be detected as an increase of spatial order of positions of the pedestrians in the crowd.
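The imitation mechanism can be caricatured with a majority-rule update on a grid; this is an illustration of strategy clustering only, not the Social Force Model simulation used in the paper:

```python
import numpy as np

def imitate_step(s):
    """One synchronous sweep of neighbour imitation on a periodic 2-d grid
    of binary strategies (0 = cooperative/slower, 1 = competitive/faster):
    adopt the majority strategy of the four nearest neighbours; 2-2 ties
    keep the current strategy."""
    nbh = (np.roll(s, 1, 0) + np.roll(s, -1, 0) +
           np.roll(s, 1, 1) + np.roll(s, -1, 1))
    new = s.copy()
    new[nbh >= 3] = 1
    new[nbh <= 1] = 0
    return new
```

Iterating this update on a random initial grid grows spatial clusters of like strategies, the kind of ordering the abstract reports detecting in pedestrian positions.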
Simulation of Cognitive Pedestrian Agents Crowds in Crisis Situations
Directory of Open Access Journals (Sweden)
Margaret Lyell
2006-06-01
Full Text Available In crisis situations in an urban environment, first responder teams often must deal with crowds of people. Consider the case of a building fire in a dense city environment. People may be injured; walkways may be blocked, with fire equipment attempting to reach the scene. Crowd behavior can become an issue when trying to reach the injured, ensure safety and restore conditions to normal. The motivations of pedestrians that form the crowd can vary. Some are there because they are curious about the crisis situation. Others, attending to their individual concerns, may have found themselves in the 'wrong' location. They may be trying to leave the area, but the density of people as well as the spatial layout of the walkways may be impeding their progress. Other individuals, unaware of the fire, may be attempting to reach their intended destinations that happen to be near the crisis area, thus adding to crowd congestion. With a model of crowd behavior, effective strategies for resource usage in managing crowd behavior can be developed. Our approach to this problem is that of agent-based modeling and simulation. We develop a cognitive pedestrian agent model. Utilizing this model, we simulate crowd behavior in a 'city fire' scenario. Characteristics of crowd behavior with different pedestrian personality mixes and a strategy for crowd management are investigated.
Stress and Incongruity Theory: Effects of Crowding,
1981-01-01
Estimating the mass density of neutral gas at $z < 1$
Natarajan, P; Natarajan, Priyamvada; Pettini, Max
1997-01-01
We use the relationships between galactic HI mass and B-band luminosity determined by Rao & Briggs to recalculate the mass density of neutral gas at the present epoch based on more recent measures of the galaxy luminosity function than were available to those authors. We find a value of $\Omega_{gas}(z=0)$ ... suggesting that this quantity is now reasonably secure. We then show that, if the scaling between H I mass and B-band luminosity has remained approximately constant since $z = 1$, the evolution of the luminosity function found by the Canada-France redshift survey translates to an increase of ... obtained quite independently from consideration of the luminosity function of Mg II absorbers at $z = 0.65$. By combining these new estimates with data from damped Ly$\alpha$ systems at higher redshift, it is possible to assemble a rough sketch of the evolution of $\Omega_{gas}$ over the last 90% of the age of the universe. The consumption of H I gas with time is in broad agreement with models of chemical evolution which inclu...
Freeman, Jeremy; Pelli, Denis G
2007-10-26
Crowding occurs when nearby flankers jumble the appearance of a target object, making it hard to identify. Crowding is feature integration over an inappropriately large region. What determines the size of that region? According to bottom-up proposals, the size is that of an anatomically determined isolation field. According to top-down proposals, the size is that of the spotlight of attention. Intriligator and Cavanagh (2001) proposed the latter, but we show that their conclusion rests on an implausible assumption. Here we investigate the role of attention in crowding using the change blindness paradigm. We measure capacity for widely and narrowly spaced letters during a change detection task, both with and without an interstimulus cue. We find that standard crowding manipulations (reducing spacing and adding flankers) severely impair uncued change detection but have no effect on cued change detection. Because crowded letters look less familiar, we must use longer internal descriptions (less compact representations) to remember them. Thus, fewer fit into working memory. The memory limit does not apply to the cued condition because the observer need remember only the cued letter. Cued performance escapes the effects of crowding, as predicted by a top-down account. However, our most parsimonious account of the results is bottom-up: Cued change detection is so easy that the observer can tolerate feature degradation and letter distortion, making the observer immune to crowding. The change detection task enhances the classic partial report paradigm by making the test easier (same/different instead of identifying one of many possible targets), which increases its sensitivity, so it can reveal degraded memory traces.
Evidence for Categorical Crowding
Directory of Open Access Journals (Sweden)
J Reuther
2014-08-01
Full Text Available An object easily recognised in isolation is hampered when other objects are situated close to it. This phenomenon is called crowding. It is generally thought that crowding affects object recognition only at the level of feature combination. However, recent studies have shown that if flankers and targets belong to different categories crowding is weaker, calling into question the above assertion. Nevertheless, these results can be explained in terms of featural differences between categories. The current study tests whether category-level (i.e., high-level) interference in crowding occurs when featural differences are controlled for. The first experiment used letters and numbers as targets and flankers in a two-by-two study design. We found lower critical spacing when targets and flankers belonged to different categories, replicating previous results. In a second experiment, using a font that ensured that both categories had the same feature-set, we observed the same, albeit weaker, category-dependent effect. This suggests that although featural differences can partly account for the reduction, category-level effects persist even when featural differences are fully controlled for. We conclude that crowding results from not only the well-documented feature-level interactions but also additional interactions at a level where objects are grouped by meaning.
Institute of Scientific and Technical Information of China (English)
张路平; 王鲁平; 李飚; 赵明
2015-01-01
In order to improve the performance of the particle filter (PF) based probability hypothesis density (PHD) algorithm in terms of number estimation and state extraction of multiple targets, a new probability hypothesis density filter algorithm based on marginalized particles and kernel density estimation is proposed, which utilizes the idea of the marginalized particle filter to enhance the estimating performance of the PHD. The state variables are decomposed into linear and non-linear parts. The particle filter is adopted to predict and estimate the nonlinear states of multiple targets after dimensionality reduction, while the Kalman filter is applied to estimate the linear parts under a linear Gaussian condition. Embedding the information of the linear states into the estimated nonlinear states helps to reduce the estimation variance and improve the accuracy of target number estimation. The mean-shift kernel density estimation, being of the inherent nature of searching for a peak value via an adaptive gradient ascent iteration, is introduced to cluster particles and extract target states; it is independent of the target number and can converge to the local peak position of the PHD distribution while avoiding the errors due to inaccuracy in modeling and parameter estimation. Experiments show that the proposed algorithm can obtain higher tracking accuracy when using fewer sampling particles and is of lower computational complexity compared with the PF-PHD.
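The mean-shift step this abstract relies on can be illustrated in isolation. Below is a minimal one-dimensional sketch (my own illustration, not the paper's multi-target tracker): starting from a guess, a point is moved repeatedly to the Gaussian-kernel-weighted mean of the samples until it settles on a local peak of the kernel density estimate, which is how particles can be clustered without fixing the number of targets in advance.

```python
import numpy as np

def mean_shift_mode(x, start, bw=0.5, tol=1e-6, max_iter=200):
    """Gaussian mean-shift iteration: repeatedly move `start` to the
    kernel-weighted mean of the samples in x until it converges to a
    local peak of the kernel density estimate (bandwidth bw)."""
    x = np.asarray(x, dtype=float)
    m = float(start)
    for _ in range(max_iter):
        w = np.exp(-0.5 * ((x - m) / bw) ** 2)   # Gaussian kernel weights
        m_new = float(np.dot(w, x) / w.sum())    # weighted mean = shift target
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m
```

Starting the iteration from several scattered particles and grouping those that converge to the same peak yields the clustering-by-modes behavior described in the abstract.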
Nonparametric estimation of population density for line transect sampling using Fourier series
Crain, B.R.; Burnham, K.P.; Anderson, D.R.; Lake, J.L.
1979-01-01
A nonparametric, robust density estimation method is explored for the analysis of right-angle distances from a transect line to the objects sighted. The method is based on the Fourier series expansion of a probability density function over an interval. With only mild assumptions, a general population density estimator of wide applicability is obtained.
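The Fourier series estimator described here is short enough to sketch. The following is a minimal illustration of the standard cosine-expansion form (coefficients estimated from the sighting distances on a truncation interval [0, W]); the smoothing parameter `m` and the half-normal test data are my own choices, not from the paper.

```python
import numpy as np

def fourier_density(x, W, m=4):
    """Fourier-series density estimate on [0, W] for line-transect
    right-angle distances x.

    f_hat(t) = 1/W + sum_k a_k cos(k*pi*t/W), with
    a_k = (2 / (n*W)) * sum_i cos(k*pi*x_i/W).
    `m` is the number of cosine terms (the smoothing parameter).
    Returns a callable evaluating the estimate on an array of points.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = np.arange(1, m + 1)
    a = (2.0 / (n * W)) * np.cos(np.outer(k, x) * np.pi / W).sum(axis=1)

    def f_hat(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        return 1.0 / W + (a[:, None] * np.cos(np.outer(k, t) * np.pi / W)).sum(axis=0)

    return f_hat
```

By construction each cosine term integrates to zero over [0, W], so the estimate always integrates to one; in line-transect work the value at zero distance, f_hat(0), is the quantity used to estimate population density.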
Firms, crowds, and innovation.
Felin, Teppo; Lakhani, Karim R; Tushman, Michael L
2017-05-01
The purpose of this article is to suggest a (preliminary) taxonomy and research agenda for the topic of "firms, crowds, and innovation" and to provide an introduction to the associated special issue. We specifically discuss how various crowd-related phenomena and practices (for example, crowdsourcing, crowdfunding, user innovation, and peer production) relate to theories of the firm, with particular attention on "sociality" in firms and markets. We first briefly review extant theories of the firm and then discuss three theoretical aspects of sociality related to crowds in the context of strategy, organizations, and innovation: (1) the functions of sociality (sociality as extension of rationality, sociality as sensing and signaling, sociality as matching and identity), (2) the forms of sociality (independent/aggregate and interacting/emergent forms of sociality), and (3) the failures of sociality (misattribution and misapplication). We conclude with an outline of future research directions and introduce the special issue papers and essays.
Entropic Geometry of Crowd Dynamics
Ivancevic, Vladimir G
2008-01-01
We propose an entropic geometrical model of psycho-physical crowd dynamics (with dissipative crowd kinematics), using the Feynman action-amplitude formalism that operates on three synergetic levels: macro, meso and micro. Its most natural statistical descriptor is crowd entropy $S$, which satisfies Prigogine's extended second law of thermodynamics, $\partial_t S \geq 0$ (for any nonisolated multicomponent system). Qualitative similarities and superpositions between individual and crowd configuration manifolds motivate our claim that goal-directed crowd movement operates under entropy conservation, $\partial_t S = 0$, while natural crowd dynamics operates under a (monotonically) increasing entropy function, $\partial_t S > 0$. Between these two distinct topological phases lies a phase transition with a chaotic inter-phase. Both inertial crowd dynamics and its dissipative kinematics represent diffusion processes on the crowd manifold governed by the Ricci flow, with the associated Perelman entropy-action. Keywords: Cr...
DEFF Research Database (Denmark)
Buch-Kromann, Tine; Nielsen, Jens
2012-01-01
This paper introduces a multivariate density estimator for truncated and censored data with special emphasis on extreme values based on survival analysis. A local constant density estimator is considered. We extend this estimator by means of tail flattening transformation, dimension reducing prio...
Spectral density estimation for symmetric stable p-adic processes
Directory of Open Access Journals (Sweden)
Rachid Sabre
2013-05-01
Full Text Available Applications of p-adic numbers are becoming increasingly important, especially in the field of applied physics. The objective of this work is to study the estimation of the spectral density of p-adic stable processes. An estimator formed by a smoothing periodogram is constructed. It is shown that this estimator is asymptotically unbiased and consistent. Rates of convergence are also examined.
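The smoothed-periodogram idea behind this estimator can be sketched for an ordinary real-valued time series (the p-adic setting of the paper is well beyond a few lines). This is my own illustration: average the raw periodogram over neighboring Fourier frequencies with a moving (Daniell) window, trading a little bias for a large variance reduction.

```python
import numpy as np

def smoothed_periodogram(x, half_width=5):
    """Periodogram of x smoothed with a moving (Daniell) window.

    Convention used here: the spectral density of white noise with
    variance s2 is s2 / (2*pi). Returns the interior Fourier
    frequencies in (0, pi) and the smoothed estimate at them.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # raw periodogram I(w_j) = |DFT(x)_j|^2 / (2*pi*n)
    dft = np.fft.rfft(x)
    I = (np.abs(dft) ** 2) / (2 * np.pi * n)
    freqs = np.arange(len(I)) * 2 * np.pi / n
    # moving-average smoothing: the Daniell kernel
    kernel = np.ones(2 * half_width + 1) / (2 * half_width + 1)
    S = np.convolve(I, kernel, mode="same")
    return freqs[1:-1], S[1:-1]
```

For white noise the smoothed estimate fluctuates around the flat level s2/(2*pi), illustrating the asymptotic unbiasedness claimed in the abstract; widening the window (slowly, as n grows) is what delivers consistency.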
Directory of Open Access Journals (Sweden)
Hendra Gunawan
2014-06-01
Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic (Bouguer) density estimation by the Nettleton approach is based on a minimum correlation of Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of Bouguer gravity anomaly and Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under the assumption of a free-air anomaly consisting of an effect of topography, an intracrustal effect, and an isostatic compensation. Based on simulation results, Bouguer density estimates were then investigated for a gravity survey of 2005 in the La Soufriere Volcano-Guadeloupe area (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except the edifice area, where average topography density estimates are 2.21 g/cm3 and Bouguer density estimates from a previous gravity survey of 1975 are 2.67 g/cm3. The uncertainty of the Bouguer density estimate for La Soufriere Volcano is about 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
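The Nettleton approach lends itself to a compact sketch. Below is my own minimal illustration, using the simple infinite-slab Bouguer correction 2*pi*G*rho*h: scan candidate densities and keep the one whose Bouguer anomaly is least correlated with topography. The synthetic profile and candidate grid are assumptions for the example, not data from the paper.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, SI units

def nettleton_density(free_air, topo, densities):
    """Nettleton estimate: the density whose simple Bouguer slab
    correction (2*pi*G*rho*h) makes the resulting Bouguer anomaly
    least correlated (in absolute value) with topography.

    free_air : free-air anomaly along a profile (m/s^2)
    topo     : topographic height along the same profile (m)
    densities: candidate densities to scan (kg/m^3)
    """
    best_rho, best_corr = None, np.inf
    for rho in densities:
        bouguer = free_air - 2 * np.pi * G * rho * topo
        c = abs(np.corrcoef(bouguer, topo)[0, 1])
        if c < best_corr:
            best_rho, best_corr = rho, c
    return best_rho
```

The Parasnis approach mentioned in the abstract is the regression counterpart of the same idea: the slope of free-air anomaly against the slab term 2*pi*G*h gives the density directly.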
Booth, Alan; Edwards, John N.
1976-01-01
The effect of household and neighborhood crowding on the relations between spouses, those between parents and children, and the relations among children are examined; a sample of urban families residing in conditions ranging from open to highly compressed provided the data for the investigation and multiple regression was used to analyze the…
EuroMInd-D: A Density Estimate of Monthly Gross Domestic Product for the Euro Area
DEFF Research Database (Denmark)
Proietti, Tommaso; Marczak, Martyna; Mazzi, Gianluigi
EuroMInd-D is a density estimate of monthly gross domestic product (GDP) constructed according to a bottom-up approach, pooling the density estimates of eleven GDP components, by output and expenditure type. The components' density estimates are obtained from a medium-size dynamic factor model ... parameters, and conditional simulation filters for simulating from the predictive distribution of GDP. Both algorithms process the data sequentially as they become available in real time. The GDP density estimates for the output and expenditure approach are combined using alternative weighting schemes ...
Rigorous home range estimation with movement data: a new autocorrelated kernel density estimator.
Fleming, C H; Fagan, W F; Mueller, T; Olson, K A; Leimgruber, P; Calabrese, J M
2015-05-01
Quantifying animals' home ranges is a key problem in ecology and has important conservation and wildlife management applications. Kernel density estimation (KDE) is a workhorse technique for range delineation problems that is both statistically efficient and nonparametric. KDE assumes that the data are independent and identically distributed (IID). However, animal tracking data, which are routinely used as inputs to KDEs, are inherently autocorrelated and violate this key assumption. As we demonstrate, using realistically autocorrelated data in conventional KDEs results in grossly underestimated home ranges. We further show that the performance of conventional KDEs actually degrades as data quality improves, because autocorrelation strength increases as movement paths become more finely resolved. To remedy these flaws with the traditional KDE method, we derive an autocorrelated KDE (AKDE) from first principles to use autocorrelated data, making it perfectly suited for movement data sets. We illustrate the vastly improved performance of AKDE using analytical arguments, relocation data from Mongolian gazelles, and simulations based upon the gazelle's observed movement process. By yielding better minimum area estimates for threatened wildlife populations, we believe that future widespread use of AKDE will have significant impact on ecology and conservation biology.
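The full AKDE derivation is beyond a short sketch, but the core problem the abstract identifies, autocorrelation shrinking the information content of tracking data, can be illustrated with a crude stand-in of my own: estimate the lag-1 autocorrelation, convert it to an AR(1)-style effective sample size, and use that in place of N in Silverman's bandwidth rule so the kernels widen accordingly. This is an assumption-laden simplification, not the authors' estimator.

```python
import numpy as np

def effective_sample_size(x):
    """Effective sample size of an AR(1)-like series:
    N_eff = N * (1 - rho) / (1 + rho), with rho the lag-1
    autocorrelation (clipped at zero). A crude stand-in for the
    full autocorrelated-KDE treatment."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    rho = float(np.dot(x[:-1], x[1:]) / np.dot(x, x))
    rho = max(0.0, rho)
    return len(x) * (1 - rho) / (1 + rho)

def silverman_bandwidth(x, n_eff=None):
    """Silverman's rule-of-thumb bandwidth, with N optionally
    replaced by the effective sample size so that autocorrelated
    data get wider kernels (and hence larger range estimates)."""
    x = np.asarray(x, dtype=float)
    n = n_eff if n_eff is not None else len(x)
    return 1.06 * np.std(x) * n ** (-1 / 5)
```

For strongly autocorrelated tracks the effective sample size is a small fraction of the nominal one, which is exactly why a conventional IID kernel density estimate, fed the nominal N, underestimates the home range.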
Estimation of current density distribution under electrodes for external defibrillation
Directory of Open Access Journals (Sweden)
Papazov Sava P
2002-12-01
Full Text Available Abstract Background Transthoracic defibrillation is the most common life-saving technique for the restoration of the heart rhythm of cardiac arrest victims. The procedure requires adequate application of large electrodes on the patient chest, to ensure low-resistance electrical contact. The current density distribution under the electrodes is non-uniform, leading to muscle contraction and pain, or risks of burning. The recent introduction of automatic external defibrillators and even wearable defibrillators, presents new demanding requirements for the structure of electrodes. Method and Results Using the pseudo-elliptic differential equation of Laplace type with appropriate boundary conditions and applying finite element method modeling, electrodes of various shapes and structure were studied. The non-uniformity of the current density distribution was shown to be moderately improved by adding a low resistivity layer between the metal and tissue and by a ring around the electrode perimeter. The inclusion of openings in long-term wearable electrodes additionally disturbs the current density profile. However, a number of small-size perforations may result in acceptable current density distribution. Conclusion The current density distribution non-uniformity of circular electrodes is about 30% less than that of square-shaped electrodes. The use of an interface layer of intermediate resistivity, comparable to that of the underlying tissues, and a high-resistivity perimeter ring, can further improve the distribution. The inclusion of skin aeration openings disturbs the current paths, but an appropriate selection of number and size provides a reasonable compromise.
Crowding during restricted and free viewing.
Wallace, Julian M; Chiu, Michael K; Nandy, Anirvan S; Tjan, Bosco S
2013-05-24
Crowding impairs the perception of form in peripheral vision. It is likely to be a key limiting factor of form vision in patients without central vision. Crowding has been extensively studied in normally sighted individuals, typically with a stimulus duration of a few hundred milliseconds to avoid eye movements. These restricted testing conditions do not reflect the natural behavior of a patient with central field loss. Could unlimited stimulus duration and unrestricted eye movements change the properties of crowding in any fundamental way? We studied letter identification in the peripheral vision of normally sighted observers in three conditions: (i) a fixation condition with a brief stimulus presentation of 250 ms, (ii) another fixation condition but with an unlimited viewing time, and (iii) an unrestricted eye movement condition with an artificial central scotoma and an unlimited viewing time. In all conditions, contrast thresholds were measured as a function of target-to-flanker spacing, from which we estimated the spatial extent of crowding in terms of critical spacing. We found that presentation duration beyond 250 ms had little effect on critical spacing with stable gaze. With unrestricted eye movements and a simulated central scotoma, we found a large variability in critical spacing across observers, but more importantly, the variability in critical spacing was well correlated with the variability in target eccentricity. Our results assure that the large body of findings on crowding made with briefly presented stimuli remains relevant to conditions where viewing time is unconstrained. Our results further suggest that impaired oculomotor control associated with central vision loss can confound peripheral form vision beyond the limits imposed by crowding.
Directory of Open Access Journals (Sweden)
Rongda Chen
Full Text Available Recovery rate is essential to the estimation of a portfolio's loss and economic capital. Neglecting the randomness of the distribution of the recovery rate may underestimate risk. This study introduces two models of the distribution, Beta distribution estimation and kernel density estimation, to simulate the distribution of recovery rates of corporate loans and bonds. Models based on the Beta distribution are in common use, such as CreditMetrics by J.P. Morgan, Portfolio Manager by KMV and LossCalc by Moody's. However, the Beta distribution has a fatal defect: it cannot fit bimodal or multimodal distributions, such as the recovery rates of corporate loans and bonds, as Moody's new data show. To overcome this flaw, kernel density estimation is introduced, and we compare the simulation results of the histogram, Beta distribution estimation and kernel density estimation to reach the conclusion that the Gaussian kernel density estimate better imitates the distribution of bimodal or multimodal data samples of corporate loans and bonds. Finally, a Chi-square test of the Gaussian kernel density estimate proves that it can fit the curve of recovery rates of loans and bonds. Thus, using the kernel density estimate to precisely delineate the bimodal recovery rates of bonds is optimal in credit risk management.
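The contrast the abstract draws can be reproduced in a few lines. Below is my own minimal sketch, not the study's code: a method-of-moments Beta fit (which by construction has at most one interior mode) alongside a plain Gaussian kernel density estimate, evaluated on a synthetic bimodal recovery-rate sample; the mixture used for testing is an assumption for illustration.

```python
import numpy as np

def beta_moments_fit(x):
    """Method-of-moments Beta(a, b) fit for data on (0, 1):
    match the sample mean and variance."""
    m, v = float(np.mean(x)), float(np.var(x))
    common = m * (1 - m) / v - 1
    return m * common, (1 - m) * common

def gaussian_kde_pdf(x, grid, bw=0.05):
    """Plain Gaussian kernel density estimate of the sample x,
    evaluated at the points in `grid` with bandwidth bw."""
    x = np.asarray(x, dtype=float)
    grid = np.asarray(grid, dtype=float)
    z = (grid[:, None] - x[None, :]) / bw
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(x) * bw * np.sqrt(2 * np.pi))
```

On a sample with modes near 0.2 and 0.8 the kernel estimate shows both peaks, while any single Beta density, however fitted, cannot place two interior modes there, which is the defect the study highlights.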
Density and hazard rate estimation for censored and α-mixing data using gamma kernels
2006-01-01
In this paper we consider the nonparametric estimation of the density and hazard rate function for right-censored α-mixing survival time data using kernel smoothing techniques. Since survival times are positive, with potentially a high concentration at zero, one has to take into account the bias problems when the functions are estimated in the boundary region. In this paper, gamma kernel estimators of the density and the hazard rate function are proposed. The estimators use adaptive weights depe...
EXACT MINIMAX ESTIMATION OF THE PREDICTIVE DENSITY IN SPARSE GAUSSIAN MODELS.
Mukherjee, Gourab; Johnstone, Iain M
We consider estimating the predictive density under Kullback-Leibler loss in an ℓ0 sparse Gaussian sequence model. Explicit expressions of the first order minimax risk along with its exact constant, asymptotically least favorable priors and optimal predictive density estimates are derived. Compared to the sparse recovery results involving point estimation of the normal mean, new decision theoretic phenomena are seen. Suboptimal performance of the class of plug-in density estimates reflects the predictive nature of the problem and optimal strategies need diversification of the future risk. We find that minimax optimal strategies lie outside the Gaussian family but can be constructed with threshold predictive density estimates. Novel minimax techniques involving simultaneous calibration of the sparsity adjustment and the risk diversification mechanisms are used to design optimal predictive density estimates.
Divisive latent class modeling as a density estimation method for categorical data
van der Palm, D.W.; van der Ark, L.A.; Vermunt, J.K.
2016-01-01
Traditionally, latent class (LC) analysis is used by applied researchers as a tool for identifying substantively meaningful clusters. More recently, LC models have also been used as a density estimation tool for categorical variables. We introduce a divisive LC (DLC) model as a density estimation tool.
Inference-less Density Estimation using Copula Bayesian Networks
Elidan, Gal
2012-01-01
We consider learning continuous probabilistic graphical models in the face of missing data. For non-Gaussian models, learning the parameters and structure of such models depends on our ability to perform efficient inference, and can be prohibitive even for relatively modest domains. Recently, we introduced the Copula Bayesian Network (CBN) density model - a flexible framework that captures complex high-dimensional dependency structures while offering direct control over the univariate marginals, leading to improved generalization. In this work we show that the CBN model also offers significant computational advantages when training data is partially observed. Concretely, we leverage the specialized form of the model to derive a computationally amenable learning objective that is a lower bound on the log-likelihood function. Importantly, our energy-like bound circumvents the need for costly inference of an auxiliary distribution, thus facilitating practical learning of high-dimensional densities. We demonstr...
Confidence estimates in simulation of phase noise or spectral density.
Ashby, Neil
2017-02-13
In this paper we apply the method of discrete simulation of power law noise, developed in [1],[3],[4], to the problem of simulating phase noise for a combination of power law noises. We derive analytic expressions for the probability of observing a value of phase noise L(f), or of any of the one-sided spectral densities S(f), Sy(f), or Sx(f), for arbitrary superpositions of power law noise.
Optimisation of in-situ dry density estimation
Directory of Open Access Journals (Sweden)
Morvan Mathilde
2016-01-01
Full Text Available Nowadays, field experiments are mostly used to determine the resistance and settlement of a soil before building. The devices needed were heavy, so they could not be used in every situation. That is the reason why Gourves et al. (1998) developed a light dynamic penetrometer called Panda. For this penetrometer, a standardized hammer is blown on the head of the piston. For each blow, it measures the driving energy as well as the driving depth of the cone into the soil. The obtained penetrogram gives the variation of cone resistance with depth. For homogeneous soils, three parameters can be determined: the critical depth zc, the initial cone resistance qd0 and the cone resistance at depth qd1. In parallel with the improvement of this apparatus, research was conducted to obtain a relationship between the dry density of the soil and the cone resistance at depth qd1. Knowing the dry density of a soil allows, for example, the evaluation of compaction efficiency. To achieve this, a database of soils was initiated. Each of these soils was tested and classified using laboratory tests, among others grain size distribution, Proctor results and Atterberg limits. Penetrometer tests were also performed for three to five densities and for three to five water contents. Using this database, Chaigneau managed to obtain a logarithmic relation linking qd1 and dry density. But this relation varies with the water content. This article presents our recent research on a means of obtaining a unified relation using water content, saturation degree or suction. To this end, we first studied the CNR silt responses with saturation degree and water content. Its water retention curve was realised using the filter paper method so that suction could be obtained. Then we verified the conclusions of this study on seven soils of the database to validate our hypotheses.
Private Giving Crowding Government Funding in Public Higher Education
Directory of Open Access Journals (Sweden)
G. Thomas Sav
2010-01-01
Full Text Available Problem statement: Private giving and government funding are critical revenue sources for public colleges and universities. If increased private giving reduces government funding, then that type and extent of crowding out carries important managerial and public policy implications. Approach: The study used a government funding reaction function and an instrumental variable approach to empirically estimate the potential for crowding out. Results: The study examined the extent to which private giving reduces or crowds out state government funding of public colleges and universities. Government free riding was at question and investigated to determine how active it is in terms of private donations partially or wholly displacing state government funding. The findings suggested that the rate of crowding out was 43% on the dollar. That compares to the 45% political substitution of the 1960s but is much diminished from the 1980s dollar-for-dollar crowding out. Those are aggregate comparisons for all public institutions. A disaggregated approach in this study additionally revealed that doctoral universities were victims of the same 43% crowd out but that at two other levels, master degree granting and associate degree granting colleges, there was the opposite effect of crowding in. Those colleges received state funding augmentations of 32-92% on their dollar of privately provided donations. Conclusion/Recommendations: The study's finding of the existence of both crowding out and crowding in can carry important policy implications for college and university funding. Future managerial and public policy decision making should take that into account. However, political sustainability and the economy-wide and localized effects of crowding out and in over time could prove fruitful avenues of inquiry for future research.
Volumetric breast density estimation from full-field digital mammograms.
Engeland, S. van; Snoeren, P.R.; Huisman, H.J.; Boetes, C.; Karssemeijer, N.
2006-01-01
A method is presented for estimation of dense breast tissue volume from mammograms obtained with full-field digital mammography (FFDM). The thickness of dense tissue mapping to a pixel is determined by using a physical model of image acquisition. This model is based on the assumption that the breast
Energy Technology Data Exchange (ETDEWEB)
Singh, Harpreet; Arvind; Dorai, Kavita, E-mail: kavita@iisermohali.ac.in
2016-09-07
Estimation of quantum states is an important step in any quantum information processing experiment. A naive reconstruction of the density matrix from experimental measurements can often give density matrices which are not positive, and hence not physically acceptable. How do we ensure that at all stages of reconstruction, we keep the density matrix positive? Recently a method has been suggested based on maximum likelihood estimation, wherein the density matrix is guaranteed to be positive definite. We experimentally implement this protocol on an NMR quantum information processor. We discuss several examples and compare with the standard method of state estimation. - Highlights: • State estimation using maximum likelihood method was performed on an NMR quantum information processor. • Physically valid density matrices were obtained every time in contrast to standard quantum state tomography. • Density matrices of several different entangled and separable states were reconstructed for two and three qubits.
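The positivity problem this abstract describes has a well-known lightweight repair that is simpler than the paper's maximum-likelihood protocol and useful for seeing why naive tomography fails. The sketch below, my own illustration rather than the authors' method, projects a reconstructed Hermitian matrix onto the set of physical density matrices by clipping negative eigenvalues and renormalizing the trace.

```python
import numpy as np

def nearest_physical_state(rho):
    """Repair a naively reconstructed density matrix: symmetrize to
    enforce Hermiticity, clip negative eigenvalues to zero, and
    renormalize to unit trace. A simple fix-up, not the full
    maximum-likelihood reconstruction discussed in the paper."""
    rho = 0.5 * (rho + rho.conj().T)         # enforce Hermiticity
    vals, vecs = np.linalg.eigh(rho)
    vals = np.clip(vals, 0.0, None)           # remove negative populations
    rho_fixed = (vecs * vals) @ vecs.conj().T
    return rho_fixed / np.trace(rho_fixed).real
```

The result is always positive semidefinite with unit trace, so every downstream quantity (fidelities, entanglement measures) is computed from a physically valid state, which is the guarantee the maximum-likelihood approach provides by construction.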
Niching method using clustering crowding
Institute of Scientific and Technical Information of China (English)
GUO Guan-qi; GUI Wei-hua; WU Min; YU Shou-yi
2005-01-01
This study analyzes drift phenomena of deterministic crowding and probabilistic crowding by using an equivalence class model and expectation proportion equations. It is proved that the replacement errors of deterministic crowding cause the population to converge to a single individual, thus resulting in premature stagnation or the loss of optional optima, and that probabilistic crowding can maintain multiple subpopulations in equilibrium provided the population size is adequately large. An improved niching method using clustering crowding is proposed. By analyzing the topology of the fitness landscape using a hill-valley function and extending the search space for similarity analysis, clustering crowding determines the locality of the search space more accurately, thus greatly decreasing the replacement errors of crowding. The integration of deterministic and probabilistic replacement increases the capacity for both parallel local hill climbing and maintaining multiple subpopulations. The experimental results of optimizing various multimodal functions show that the performances of clustering crowding, such as the number of effective peaks maintained, average peak ratio and global optimum ratio, are uniformly superior to those of evolutionary algorithms using fitness sharing, simple deterministic crowding and probabilistic crowding.
Crowding by a repeating pattern.
Rosen, Sarah; Pelli, Denis G
2015-01-01
The inability to recognize a peripheral target among flankers is called crowding. For a foveal target, crowding can be distinguished from overlap masking by its sparing of detection, linear scaling with eccentricity, and invariance with target size. Crowding depends on the proximity and similarity of the flankers to the target. Flankers that are far from or dissimilar to the target do not crowd it. On a gray page, text whose neighboring letters have different colors, alternately black and white, has enough dissimilarity that it might escape crowding. Since reading speed is normally limited by crowding, escape from crowding should allow faster reading. Yet reading speed is unchanged (Chung & Mansfield, 2009). Why? A recent vernier study found that using alternating-color flankers produces strong crowding (Manassi, Sayim, & Herzog, 2012). Might that effect occur with letters and reading? Critical spacing is the minimum center-to-center target-flanker spacing needed to correctly identify the target. We measure it for a target letter surrounded by several equidistant flanker letters of the same polarity, opposite polarity, or mixed polarity: alternately white and black. We find strong crowding in the alternating condition, even though each flanker letter is beyond its own critical spacing (as measured in a separate condition). Thus a periodic repeating pattern can produce crowding even when the individual elements do not. Further, in all conditions we find that, once a periodic pattern repeats (two cycles), further repetition does not affect critical spacing of the innermost flanker.
Institute of Scientific and Technical Information of China (English)
Stephanie Teufel; Bernd Teufel
2014-01-01
After the Fukushima disaster, European politicians began to reassess the energy strategy for their countries. The focus is now on renewable energy sources and, as a result, on decentralization. The decentralized generation, storage, and of course consumption of energy is the central point. With the new developments under the roof of the energy turnaround, the way back from the centralized architecture of our energy system to a more decentralized one is predetermined. Decentralization implies a change in the role of today's consumers: they become energy prosumers. This is the basis for the crowd energy concept. In this position paper the crowd energy concept is introduced and necessary research fields are identified.
DEFF Research Database (Denmark)
Boudreau, Kevin J.; Jeppesen, Lars Bo
2014-01-01
on network effects and strategies to attract large numbers of complementors remain advisable in such contexts? We test hypotheses related to these issues using data from 85 online multi-player game platforms with unpaid complementors. We find that complementor development responds to platform growth even … without sales incentives, but that attracting complementors has a net zero effect on on-going development and fails to stimulate network effects. We discuss conditions under which a strategy of using unpaid crowd complementors remains advantageous. …
Estimated global nitrogen deposition using NO2 column density
Lu, Xuehe; Jiang, Hong; Zhang, Xiuying; Liu, Jinxun; Zhang, Zhen; Jin, Jiaxin; Wang, Ying; Xu, Jianhui; Cheng, Miaomiao
2013-01-01
Global nitrogen deposition has increased over the past 100 years. Monitoring and simulation studies of nitrogen deposition have evaluated nitrogen deposition at both the global and regional scale. With the development of remote-sensing instruments, tropospheric NO2 column density retrieved from Global Ozone Monitoring Experiment (GOME) and Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) sensors now provides us with a new opportunity to understand changes in reactive nitrogen in the atmosphere. The concentration of NO2 in the atmosphere has a significant effect on atmospheric nitrogen deposition. Following the general nitrogen deposition calculation method, we use the principal component regression method to evaluate global nitrogen deposition based on global NO2 column density and meteorological data. In terms of simulation accuracy, about 70% of the land area of the Earth passed a significance test of regression. In addition, NO2 column density has a significant influence on regression results over 44% of global land. The simulated results show that global average nitrogen deposition was 0.34 g m−2 yr−1 from 1996 to 2009 and is increasing at about 1% per year. Our simulated results show that China, Europe, and the USA are the three hotspots of nitrogen deposition, consistent with previous research findings. In this study, Southern Asia was found to be another hotspot of nitrogen deposition (about 1.58 g m−2 yr−1 and maintaining a high growth rate). As nitrogen deposition increases, the number of regions threatened by high nitrogen deposition is also increasing. With N emissions continuing to increase in the future, the area whose ecosystems are affected by high levels of nitrogen deposition will grow.
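The principal component regression step described above can be sketched in a few lines; the function name and the synthetic predictor/response data below are illustrative stand-ins, not the study's actual NO2 and meteorological inputs.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: project predictors onto their
    leading principal components, then fit ordinary least squares."""
    Xc = X - X.mean(axis=0)
    # Principal directions from the SVD of the centered predictor matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T            # (p, k) loading matrix
    scores = Xc @ V                    # (n, k) component scores
    coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
    # Map component coefficients back to the original predictor space.
    beta = V @ coef
    intercept = y.mean() - X.mean(axis=0) @ beta
    return beta, intercept

# Illustrative use: four synthetic covariates standing in for NO2
# column density plus meteorological variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X @ np.array([0.8, 0.1, 0.0, 0.05]) + rng.normal(scale=0.1, size=200)
beta, b0 = pcr_fit(X, y, n_components=2)
pred = X @ beta + b0
```

Regressing on a reduced set of components rather than the raw covariates is what makes the method robust to collinearity among meteorological predictors.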
METAPHOR: Probability density estimation for machine learning based photometric redshifts
Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.
2017-06-01
We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as internal engine to derive photometric galaxy redshifts, but giving the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDF. We present here the results about a validation test of the workflow on the galaxies from SDSS-DR9, showing also the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test include also a comparison with the PDF's derived from a traditional SED template fitting method (Le Phare).
Felin, Teppo; Lakhani, Karim R; Tushman, Michael L
2017-01-01
The purpose of this article is to suggest a (preliminary) taxonomy and research agenda for the topic of “firms, crowds, and innovation” and to provide an introduction to the associated special issue. We specifically discuss how various crowd-related phenomena and practices—for example, crowdsourcing, crowdfunding, user innovation, and peer production—relate to theories of the firm, with particular attention on “sociality” in firms and markets. We first briefly review extant theories of the firm and then discuss three theoretical aspects of sociality related to crowds in the context of strategy, organizations, and innovation: (1) the functions of sociality (sociality as extension of rationality, sociality as sensing and signaling, sociality as matching and identity), (2) the forms of sociality (independent/aggregate and interacting/emergent forms of sociality), and (3) the failures of sociality (misattribution and misapplication). We conclude with an outline of future research directions and introduce the special issue papers and essays. PMID:28690428
Temporal crowding and its interplay with spatial crowding.
Yeshurun, Yaffa; Rashal, Einat; Tkacz-Domb, Shira
2015-03-18
Spatial crowding refers to impaired target identification when the target is surrounded by other stimuli in space; temporal crowding refers to impaired target identification when the target is surrounded by other stimuli in time. Previously, when spatial and temporal crowding were measured in the fovea, they were interrelated for amblyopic observers but almost absent for normal observers (Bonneh, Sagi, & Polat, 2007). In the current study we examined whether reliable temporal crowding can be found for normal observers with peripheral presentation (9° of eccentricity), and whether similar relations between temporal and spatial crowding would emerge. To that end, we presented a sequence of three displays separated by a varying interstimulus interval (ISI). Each display included either one letter (Experiments 1a, 1b, 1c) or three letters separated by a varying interletter spacing (Experiments 2a, 2b). One of these displays included an oriented T. Observers indicated the T's orientation. As expected, we found spatial crowding: accuracy improved as the interletter spacing increased. Critically, we also found temporal crowding: in all experiments accuracy increased as the ISI increased, even when only stimulus-onset asynchronies (SOAs) larger than 150 ms were included, ensuring this effect does not reflect mere ordinary masking. Thus, with peripheral presentation, temporal crowding also emerged for normal observers. However, only a weak interaction between temporal and spatial crowding was found.
Directory of Open Access Journals (Sweden)
Ashot Davtian
2011-05-01
Two methods for the estimation of the number per unit volume, NV, of spherical particles are discussed: the physical disector (Sterio, 1984) and Saltykov's estimator (Saltykov, 1950; Fullman, 1953). A modification of Saltykov's estimator is proposed which reduces the variance. Formulae for bias and variance are given for both the disector and the improved Saltykov estimator for the case of randomly positioned particles. They enable the comparison of the two estimators with respect to their precision in terms of mean squared error.
Molecular crowding and protein enzymatic dynamics.
Echeverria, Carlos; Kapral, Raymond
2012-05-21
The effects of molecular crowding on the enzymatic conformational dynamics and transport properties of adenylate kinase are investigated. This tridomain protein undergoes large scale hinge motions in the course of its enzymatic cycle and serves as a prototype for the study of crowding effects on the cyclic conformational dynamics of proteins. The study is carried out at a mesoscopic level where both the protein and the solvent in which it is dissolved are treated in a coarse-grained fashion. The amino acid residues in the protein are represented by a network of beads and the solvent dynamics is described by multiparticle collision dynamics that includes effects due to hydrodynamic interactions. The system is crowded by a stationary random array of hard spherical objects. Protein enzymatic dynamics is investigated as a function of the obstacle volume fraction and size. In addition, for comparison, results are presented for a modification of the dynamics that suppresses hydrodynamic interactions. Consistent with expectations, simulations of the dynamics show that the protein prefers a closed conformation for high volume fractions. This effect becomes more pronounced as the obstacle radius decreases for a given volume fraction, since the average void size in the obstacle array is smaller for smaller radii. At high volume fractions for small obstacle radii, the average enzymatic cycle time and characteristic times of internal conformational motions of the protein deviate substantially from their values in solution or in systems with a low density of obstacles. The transport properties of the protein are strongly affected by molecular crowding. Diffusive motion adopts a subdiffusive character and the effective diffusion coefficients can change by more than an order of magnitude. The orientational relaxation time of the protein is also significantly altered by crowding.
Efficient estimation of dynamic density functions with an application to outlier detection
Qahtan, Abdulhakim Ali Ali
2012-01-01
In this paper, we propose a new method to estimate the dynamic density over data streams, named KDE-Track as it is based on a conventional and widely used Kernel Density Estimation (KDE) method. KDE-Track can efficiently estimate the density with linear complexity by using interpolation on a kernel model, which is incrementally updated upon the arrival of streaming data. Both theoretical analysis and experimental validation show that KDE-Track outperforms traditional KDE and a baseline method Cluster-Kernels on estimation accuracy of the complex density structures in data streams, computing time and memory usage. KDE-Track is also demonstrated on timely catching the dynamic density of synthetic and real-world data. In addition, KDE-Track is used to accurately detect outliers in sensor data and compared with two existing methods developed for detecting outliers and cleaning sensor data. © 2012 ACM.
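KDE-Track itself maintains an interpolated kernel model updated on arrival of each point; as a minimal illustration of the underlying streaming-KDE idea, here is a sliding-window Gaussian kernel density estimator (the class name, window policy, and parameters are our simplifications, not the paper's algorithm).

```python
import math
import random
from collections import deque

class SlidingWindowKDE:
    """Gaussian kernel density estimate over the most recent `window`
    points of a stream. A simplified stand-in for estimators such as
    KDE-Track, which instead interpolate on an incrementally updated
    kernel model to avoid scanning the buffer at query time."""
    def __init__(self, bandwidth, window=500):
        self.h = bandwidth
        self.buf = deque(maxlen=window)

    def update(self, x):
        self.buf.append(x)  # oldest point falls out automatically

    def density(self, x):
        if not self.buf:
            return 0.0
        z = sum(math.exp(-0.5 * ((x - xi) / self.h) ** 2) for xi in self.buf)
        return z / (len(self.buf) * self.h * math.sqrt(2 * math.pi))

# Feed a stream drawn near 0, then query the estimate.
random.seed(1)
kde = SlidingWindowKDE(bandwidth=0.3, window=200)
for _ in range(1000):
    kde.update(random.gauss(0.0, 1.0))
```

Note that each `density` query here costs O(window) kernel evaluations; replacing the buffer scan with interpolation on a fixed grid of resampling points is precisely what gives KDE-Track its linear complexity.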
Energy Technology Data Exchange (ETDEWEB)
Humbert, Ludovic, E-mail: ludohumberto@gmail.com [Galgo Medical, Barcelona 08036 (Spain); Hazrati Marangalou, Javad; Rietbergen, Bert van [Orthopaedic Biomechanics, Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven 5600 MB (Netherlands); Río Barquero, Luis Miguel del [CETIR Centre Medic, Barcelona 08029 (Spain); Lenthe, G. Harry van [Biomechanics Section, KU Leuven–University of Leuven, Leuven 3001 (Belgium)
2016-04-15
Purpose: Cortical thickness and density are critical components in determining the strength of bony structures. Computed tomography (CT) is one possible modality for analyzing the cortex in 3D. In this paper, a model-based approach for measuring the cortical bone thickness and density from clinical CT images is proposed. Methods: Density variations across the cortex were modeled as a function of the cortical thickness and density, location of the cortex, density of surrounding tissues, and imaging blur. High resolution micro-CT data of cadaver proximal femurs were analyzed to determine a relationship between cortical thickness and density. This thickness-density relationship was used as prior information to be incorporated in the model to obtain accurate measurements of cortical thickness and density from clinical CT volumes. The method was validated using micro-CT scans of 23 cadaver proximal femurs. Simulated clinical CT images with different voxel sizes were generated from the micro-CT data. Cortical thickness and density were estimated from the simulated images using the proposed method and compared with measurements obtained using the micro-CT images to evaluate the effect of voxel size on the accuracy of the method. Then, 19 of the 23 specimens were imaged using a clinical CT scanner. Cortical thickness and density were estimated from the clinical CT images using the proposed method and compared with the micro-CT measurements. Finally, a case-control study including 20 patients with osteoporosis and 20 age-matched controls with normal bone density was performed to evaluate the proposed method in a clinical context. Results: Cortical thickness (density) estimation errors were 0.07 ± 0.19 mm (−18 ± 92 mg/cm³) using the simulated clinical CT volumes with the smallest voxel size (0.33 × 0.33 × 0.5 mm³), and 0.10 ± 0.24 mm (−10 ± 115 mg/cm³) using the volumes with the largest voxel size (1.0 × 1.0 × 3.0 mm³). A trend for the
Impact of Building Heights on 3d Urban Density Estimation from Spaceborne Stereo Imagery
Peng, Feifei; Gong, Jianya; Wang, Le; Wu, Huayi; Yang, Jiansi
2016-06-01
In urban planning and design applications, visualization of built up areas in three dimensions (3D) is critical for understanding building density, but the accurate building heights required for 3D density calculation are not always available. To solve this problem, spaceborne stereo imagery is often used to estimate building heights; however, estimated building heights may include errors. These errors vary between local areas within a study area and are related to the heights of the buildings themselves, distorting 3D density estimation. The impact of building height accuracy on 3D density estimation must be determined across and within a study area. In our research, accurate planar information from city authorities is used during 3D density estimation as reference data, to avoid the errors inherent to planar information extracted from remotely sensed imagery. Our experimental results show that underestimation of building heights is correlated with underestimation of the Floor Area Ratio (FAR). In local areas, experimental results show that land use blocks with low FAR values often have small errors due to small building height errors for low buildings in the blocks, and blocks with high FAR values often have large errors due to large building height errors for high buildings in the blocks. Our study reveals that the accuracy of 3D density estimated from spaceborne stereo imagery is correlated with the heights of buildings in a scene; therefore building heights must be considered when spaceborne stereo imagery is used to estimate 3D density, to improve precision.
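The Floor Area Ratio used above is straightforward to compute from footprints and heights; a minimal sketch follows, in which the per-floor height constant is an illustrative assumption, not a value from the paper.

```python
def floor_area_ratio(footprints_m2, heights_m, block_area_m2, floor_height_m=3.0):
    """FAR = total floor area / block area, with floor counts
    approximated from building heights (floor_height_m is an
    assumed constant, typically around 3 m per storey)."""
    total_floor_area = sum(
        fp * max(1, round(h / floor_height_m))
        for fp, h in zip(footprints_m2, heights_m)
    )
    return total_floor_area / block_area_m2

# Two buildings on a 10,000 m2 block; underestimating the tall
# building's height (30 m measured as 24 m) directly lowers the FAR.
true_far = floor_area_ratio([800, 600], [30.0, 9.0], 10_000)
low_far  = floor_area_ratio([800, 600], [24.0, 9.0], 10_000)
```

This makes the paper's point concrete: a stereo-derived height error on a tall building propagates multiplicatively through the floor count, so high-FAR blocks accumulate larger FAR errors than low-FAR blocks.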
An Efficient Acoustic Density Estimation Method with Human Detectors Applied to Gibbons in Cambodia.
Directory of Open Access Journals (Sweden)
Darren Kidney
Some animal species are hard to see but easy to hear. Standard visual methods for estimating population density for such species are often ineffective or inefficient, but methods based on passive acoustics show more promise. We develop spatially explicit capture-recapture (SECR) methods for territorial vocalising species, in which humans act as an acoustic detector array. We use SECR and estimated bearing data from a single-occasion acoustic survey of a gibbon population in northeastern Cambodia to estimate the density of calling groups. The properties of the estimator are assessed using a simulation study, in which a variety of survey designs are also investigated. We then present a new form of the SECR likelihood for multi-occasion data which accounts for the stochastic availability of animals. In the context of gibbon surveys this allows model-based estimation of the proportion of groups that produce territorial vocalisations on a given day, thereby enabling the density of groups, instead of the density of calling groups, to be estimated. We illustrate the performance of this new estimator by simulation. We show that it is possible to estimate density reliably from human acoustic detections of visually cryptic species using SECR methods. For gibbon surveys we also show that incorporating observers' estimates of bearings to detected groups substantially improves estimator performance. Using the new form of the SECR likelihood we demonstrate that estimates of availability, in addition to population density and detection function parameters, can be obtained from multi-occasion data, and that the detection function parameters are not confounded with the availability parameter. This acoustic SECR method provides a means of obtaining reliable density estimates for territorial vocalising species. It is also efficient in terms of data requirements since it only requires routine survey data. We anticipate that the low-tech field requirements will
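SECR models rest on a detection function; a common choice is the half-normal form g(d) = g0·exp(−d²/2σ²). As a hedged sketch (the parameter values and post layout below are illustrative, not the gibbon survey's fitted values), the probability that a calling group is heard by at least one listening post can be computed as:

```python
import math

def halfnormal_g(d, g0=0.9, sigma=500.0):
    """Half-normal detection function: probability that a single human
    detector at distance d (metres) hears a vocalising group."""
    return g0 * math.exp(-d * d / (2.0 * sigma * sigma))

def p_detected(group_xy, posts_xy, g0=0.9, sigma=500.0):
    """Probability that at least one post in the array detects the
    group, assuming detections are independent across posts."""
    p_miss = 1.0
    for px, py in posts_xy:
        d = math.hypot(group_xy[0] - px, group_xy[1] - py)
        p_miss *= 1.0 - halfnormal_g(d, g0, sigma)
    return 1.0 - p_miss

# Three listening posts; a nearby group is far more likely to be
# detected by the array than a distant one.
posts = [(0.0, 0.0), (1000.0, 0.0), (500.0, 800.0)]
near = p_detected((450.0, 300.0), posts)
far  = p_detected((5000.0, 5000.0), posts)
```

The SECR likelihood integrates such detection probabilities over all possible group locations, which is how density can be estimated without knowing where any group actually is.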
A generalized model for estimating the energy density of invertebrates
James, Daniel A.; Csargo, Isak J.; Von Eschen, Aaron; Thul, Megan D.; Baker, James M.; Hayer, Cari-Ann; Howell, Jessica; Krause, Jacob; Letvin, Alex; Chipps, Steven R.
2012-01-01
Invertebrate energy density (ED) values are traditionally measured using bomb calorimetry. However, many researchers rely on a few published literature sources to obtain ED values because of time and sampling constraints on measuring ED with bomb calorimetry. Literature values often do not account for spatial or temporal variability associated with invertebrate ED. Thus, these values can be unreliable for use in models and other ecological applications. We evaluated the generality of the relationship between invertebrate ED and proportion of dry-to-wet mass (pDM). We then developed and tested a regression model to predict ED from pDM based on a taxonomically, spatially, and temporally diverse sample of invertebrates representing 28 orders in aquatic (freshwater, estuarine, and marine) and terrestrial (temperate and arid) habitats from 4 continents and 2 oceans. Samples included invertebrates collected in all seasons over the last 19 y. Evaluation of these data revealed a significant relationship between ED and pDM (r2 = 0.96, p calorimetry approaches. This model should prove useful for a wide range of ecological studies because it is unaffected by taxonomic, seasonal, or spatial variability.
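A regression of energy density on the dry-to-wet mass proportion of the kind described can be sketched with ordinary least squares; the data and the linear trend below are synthetic placeholders, not the paper's fitted model or its reported coefficients.

```python
import numpy as np

# Synthetic (pDM, ED) pairs standing in for measured invertebrate data;
# ED in J/g wet mass, pDM dimensionless. The 22,000 slope and noise
# level are assumptions for illustration only.
rng = np.random.default_rng(42)
pdm = rng.uniform(0.05, 0.4, size=60)
ed = 22_000 * pdm + rng.normal(scale=300, size=60)

slope, intercept = np.polyfit(pdm, ed, 1)            # simple linear fit
predicted = slope * pdm + intercept
r2 = 1 - np.sum((ed - predicted) ** 2) / np.sum((ed - ed.mean()) ** 2)
```

Once such a model is fitted, estimating ED for a new sample requires only a drying oven and a balance, which is the practical advantage over bomb calorimetry that the abstract emphasizes.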
Crowding-induced Cooperativity in DNA Surface Hybridization
Lei, Qun-li; Ren, Chun-lai; Su, Xiao-hang; Ma, Yu-qiang
2015-01-01
High density DNA brush is not only used to model cellular crowding, but also has wide application in DNA-functionalized materials. Experiments have shown complicated cooperative hybridization/melting phenomena in these systems, raising the question of how molecular crowding influences DNA hybridization. In this work, a theoretical model including all possible inter- and intramolecular interactions, as well as molecular details for different species, is proposed. We find that molecular crowding can lead to two distinct cooperative behaviours: negatively cooperative hybridization marked by a broader transition width, and positively cooperative hybridization with a sharper transition, well reconciling the experimental findings. Moreover, a phase transition as a result of positive cooperativity is also found. Our study provides new insights into crowding and compartmentation in the cell, and has potential value for controlling surface morphologies of DNA-functionalized nano-particles. PMID:25875056
Colour, Luminance and Crowding
Directory of Open Access Journals (Sweden)
BJ Jennings
2013-10-01
Three experiments were performed to assess the effect backgrounds have on object discrimination. Experiment 1 investigated the discrimination of foveally presented Gaborised objects and non-objects with and without a surrounding background. Thresholds were obtained by modulating the Gabor patches in 7 different directions, either isolating the L-M, S-(L+M) and L+M geniculate mechanisms, or stimulating these mechanisms in combination. The spacing between background Gabor elements and the object contour was chosen so as to not cause crowding, on the basis of previously published work with luminance stimuli. No differences were found between the Michelson contrasts required for threshold with or without a background, except when signals in the S-(L+M) and L+M mechanisms were combined. The signals were combined at an elevation of 30° in DKL colour space, which resulted in a mixture with a proportionally strong chromatic signal. Experiment 2 investigated this finding further using three background conditions: no background, a sparse background and a densely populated background. Object vs. non-object discrimination thresholds were obtained for the L+M and S-(L+M) isolating directions, along with two conditions that combined them at DKL luminance elevations of 30° and 60°. In the 60° combination, the proportion of the chromatic signal was lower than in the 30° combination. Thresholds were found to be largely stable across chromatic and luminance conditions and background class, again with the exception of the combination at 30° elevation. The final experiment examined Gabor orientation discrimination over the same conditions as experiment 2 using a classical crowding paradigm, with a peripheral target and a set of three target-flanker separations. Crowding was most pronounced in the 30° combination. We conclude that when S-(L+M) signals above a certain level are combined with luminance signals, an increase in crowding results. This is likely to underlie the
Crowd simulation and visualization
Perez, Hugo; Rudomin, Isaac; Ayguadé Parra, Eduard; Hernandez, Benjamin; Espinosa-Oviedo, Javier A.; Vargas-Solar, Genoveva
2015-01-01
This paper presents a methodology to simulate and visualize crowds. Our goal is to represent the most realistic scenarios possible in a city. Due to the high demand for resources, a GPU cluster is used. We use real data from which we identify the behavior of the masses, applying statistical and artificial intelligence techniques. In order to take advantage of the processing power of the GPU cluster we use the following programming models during the character simulation: MPI, OmpSs and CUDA. We d...
School Crowding, Year-Round Schooling, and Mobile Classroom Use: Evidence from North Carolina
McMullen, Steven C.; Rouse, Kathryn E.
2012-01-01
This study exploits a unique policy environment and a large panel dataset to evaluate the impact of school crowding on student achievement in Wake County, NC. We also estimate the effects of two education policy initiatives that are often used to address crowding: multi-track year-round calendars and mobile classrooms. We estimate a multi-level…
EnviroAtlas Estimated Intersection Density of Walkable Roads Web Service
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in each EnviroAtlas community....
EnviroAtlas - Paterson, NJ - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Minneapolis/St. Paul, MN - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - New Bedford, MA - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Pittsburgh, PA - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
National Research Council Canada - National Science Library
Williamson, Laura D; Brookes, Kate L; Scott, Beth E; Graham, Isla M; Bradbury, Gareth; Hammond, Philip S; Thompson, Paul M; McPherson, Jana
2016-01-01
...‐based visual surveys. Surveys of cetaceans using acoustic loggers or digital cameras provide alternative methods to estimate relative density that have the potential to reduce cost and provide a verifiable record of all detections...
EnviroAtlas - New York, NY - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Memphis, TN - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Cleveland, OH - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Fresno, CA - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Green Bay, WI - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Tampa, FL - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Portland, ME - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Phoenix, AZ - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Des Moines, IA - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Austin, TX - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Woodbine, IA - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Milwaukee, WI - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Portland, OR - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
EnviroAtlas - Durham, NC - Estimated Intersection Density of Walkable Roads
U.S. Environmental Protection Agency — This EnviroAtlas dataset estimates the intersection density of walkable roads within a 750 meter radius of any given 10 meter pixel in the community. Intersections...
Holistic crowding of Mooney faces.
Farzin, Faraz; Rivera, Susan M; Whitney, David
2009-06-29
An object or feature is generally more difficult to identify when other objects are presented nearby, an effect referred to as crowding. Here, we used Mooney faces (C. M. Mooney, 1957) to examine whether crowding can also occur within and between holistic face representations. Mooney faces are ideal stimuli for this test because no cues exist to distinguish facial features in a Mooney face; to find any facial feature, such as an eye or a nose, one must first holistically perceive the image as a face. Through a series of six experiments we tested the effect of crowding on Mooney face recognition. Our results demonstrate crowding between and within Mooney faces and fulfill the diagnostic criteria for crowding, including eccentricity dependence and lack of crowding in the fovea, critical flanker spacing consistent with less than half the eccentricity of the target, and inner-outer flanker asymmetry. Further, our results show that recognition of an upright Mooney face is more strongly impaired by upright Mooney face flankers than inverted ones. Taken together, these results suggest crowding can occur selectively between high-level representations of faces and that crowding must occur at multiple levels in the visual system.
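The critical-spacing diagnostic above follows Bouma's rule of thumb, critical spacing ≈ 0.5 × eccentricity. A toy check of whether a flanker falls inside the crowding zone can be written as follows (the 0.5 factor is the conventional approximation and varies across observers and stimuli; the function name is ours):

```python
def is_crowded(eccentricity_deg, spacing_deg, bouma_factor=0.5):
    """True if the centre-to-centre target-flanker spacing falls below
    the (approximate) critical spacing at this eccentricity."""
    critical_spacing = bouma_factor * eccentricity_deg
    return spacing_deg < critical_spacing

# At 10 deg eccentricity the crowding zone extends roughly 5 deg:
inside = is_crowded(10.0, 3.0)   # flanker well inside the zone
outside = is_crowded(10.0, 6.0)  # flanker beyond critical spacing
```

The linear dependence on eccentricity is why crowding is absent at the fovea (critical spacing shrinks toward zero) yet dominates object recognition in the periphery.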
Macaque monkeys experience visual crowding.
Crowder, Erin A; Olson, Carl R
2015-01-01
In peripheral vision, objects that are easily discriminated on their own become less discriminable in the presence of surrounding clutter. This phenomenon is known as crowding. The neural mechanisms underlying crowding are not well understood. Better insight might come from single-neuron recording in nonhuman primates, provided they exhibit crowding; however, previous demonstrations of crowding have been confined to humans. In the present study, we set out to determine whether crowding occurs in rhesus macaque monkeys. We found that animals trained to identify a target letter among flankers displayed three hallmarks of crowding as established in humans. First, at a given eccentricity, increasing the spacing between the target and the flankers improved recognition accuracy. Second, the critical spacing, defined as the minimal spacing at which target discrimination was reliable, was proportional to eccentricity. Third, the critical spacing was largely unaffected by object size. We conclude that monkeys, like humans, experience crowding. These findings open the door to studies of crowding at the neuronal level in the monkey visual system.
Illusory contour formation survives crowding.
Lau, Jonathan Siu Fung; Cheung, Sing-Hang
2012-06-12
Flanked objects are difficult to identify using peripheral vision due to visual crowding, which limits conscious access to target identity. Nonetheless, certain types of visual information have been shown to survive crowding. Such resilience to crowding provides valuable information about the underlying neural mechanism of crowding. Here we ask whether illusory contour formation survives crowding of the inducers. We manipulated the presence of illusory contours through the (mis)alignment of the four inducers of a Kanizsa square. In the inducer-aligned condition, the observers judged the perceived shape (thin vs. fat) of the illusory Kanizsa square, manipulated by small rotations of the inducers. In the inducer-misaligned condition, three of the four inducers (all except the upper-left) were rotated 90°. The observers judged the orientation of the upper-left inducer. Crowding of the inducers worsened observers' performance significantly only in the inducer-misaligned condition. Our findings suggest that information for illusory contour formation survives crowding of the inducers. Crowding happens at a stage where the low-level featural information is integrated for inducer orientation discrimination, but not at a stage where the same information is used for illusory contour formation.
Pedestrian, Crowd, and Evacuation Dynamics
Helbing, Dirk
2013-01-01
This contribution describes efforts to model the behavior of individual pedestrians and their interactions in crowds, which generate certain kinds of self-organized patterns of motion. Moreover, this article focusses on the dynamics of crowds in panic or evacuation situations, methods to optimize building designs for egress, and factors potentially causing the breakdown of orderly motion.
Crowd funding voor juridische fijnproevers [Crowd funding for legal connoisseurs]
I. van der Vlies
2012-01-01
Crowd funding is a contemporary instrument for collecting money from willing donors via social media. In crowd funding, the 'crowd' need not exist as a group or community. It is a collection of individuals who have one thing in common: giving money for a particular purpose. The old media played
Cetacean Density Estimation from Novel Acoustic Datasets by Acoustic Propagation Modeling
2014-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. OBJECTIVES: The objectives of this research are to apply existing methods for cetacean density estimation from passive acoustic recordings made by single … sensors to novel data sets and cetacean species, as well as to refine the existing techniques in order to develop a more generalized model that can be
Compressive and Noncompressive Power Spectral Density Estimation from Periodic Nonuniform Samples
Lexa, Michael A; Thompson, John S
2011-01-01
This paper presents a novel power spectral density estimation technique for bandlimited, wide-sense stationary signals from sub-Nyquist sampled data. The technique employs multi-coset sampling and applies to spectrally sparse and nonsparse power spectra alike. For sparse density functions, we apply compressed sensing theory and the resulting compressive estimates exhibit better tradeoffs among the estimator's resolution, system complexity, and average sampling rate compared to their noncompressive counterparts. Both compressive and noncompressive estimates, however, can be computed at arbitrarily low sampling rates. The estimator does not require signal reconstruction and can be directly obtained from solving either a least squares or a nonnegative least squares problem. The estimates are piecewise constant approximations whose resolutions (width of the piecewise constant segments) are controlled by the periodicity of the multi-coset sampling. The estimates are also statistically consistent. This method is wi...
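For a concrete baseline, the classical Welch estimator (which works on Nyquist-rate samples, unlike the paper's sub-Nyquist multi-coset scheme) can be sketched in a few lines with SciPy. The test signal and parameters below are illustrative:

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                               # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
# Test signal: a 50 Hz tone buried in white noise
x = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)

# Welch's method: average periodograms over overlapping windowed segments
f, pxx = welch(x, fs=fs, nperseg=512)
peak_freq = f[np.argmax(pxx)]             # should sit near 50 Hz
```

The segment length `nperseg` plays a role loosely analogous to the paper's piecewise-constant resolution: longer segments give finer frequency resolution but noisier estimates.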
The importance of spatial models for estimating the strength of density dependence
DEFF Research Database (Denmark)
Thorson, James T.; Skaug, Hans J.; Kristensen, Kasper;
2014-01-01
Identifying the existence and magnitude of density dependence is one of the oldest concerns in ecology. Ecologists have aimed to estimate density dependence in population and community data by fitting a simple autoregressive (Gompertz) model for density dependence to time series of abundance … for an entire population. However, it is increasingly recognized that spatial heterogeneity in population densities has implications for population and community dynamics. We therefore adapt the Gompertz model to approximate local densities over continuous space instead of population-wide abundance …, and to allow productivity to vary spatially. Using simulated data generated from a spatial model, we show that the conventional (nonspatial) Gompertz model will result in biased estimates of density dependence, e.g., identifying oscillatory dynamics when not present. By contrast, the spatial Gompertz model …
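The nonspatial Gompertz model referred to above is a linear autoregression on log-abundance, x_{t+1} = a + b·x_t + ε_t, where b < 1 indicates density dependence. A minimal simulate-and-fit sketch (parameter values assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
a, b, sigma = 1.0, 0.7, 0.1     # b < 1 implies density dependence
T = 500
x = np.empty(T)
x[0] = a / (1 - b)              # start at the equilibrium log-density
for t in range(T - 1):
    x[t + 1] = a + b * x[t] + sigma * rng.standard_normal()

# Estimate (a, b) by least-squares regression of x[t+1] on x[t]
X = np.column_stack([np.ones(T - 1), x[:-1]])
coef, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
a_hat, b_hat = coef
```

The paper's point is that when density varies over space, fitting this model to a population-wide total biases b_hat; the sketch above only illustrates the well-mixed case where the estimator behaves.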
Directory of Open Access Journals (Sweden)
YU Wenhao
2015-01-01
The distribution pattern and density of urban facility POIs are of great significance for infrastructure planning and urban spatial analysis. Kernel density estimation, commonly used to express these spatial characteristics, is superior to other density estimation methods (such as quadrat analysis or Voronoi-based methods) because it accounts for regional influence in line with the first law of geography. However, traditional kernel density estimation is based on Euclidean space, ignoring the fact that the service functions and interrelations of urban facilities operate over network path distance rather than conventional Euclidean distance. Hence, this research proposes a computational model of network kernel density estimation, together with an extended form of the model that incorporates constraints. The work also discusses the impact of the distance-decay threshold and kernel height extremes on the representation of kernel density. A large-scale experiment on real data, analyzing different POI distribution patterns (random, sparse, regional-intensive, and linear-intensive), examines the spatial distribution characteristics, influencing factors, and service functions of POI infrastructure in the city.
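As a point of comparison, the conventional Euclidean-space KDE that the abstract argues against can be sketched with SciPy's Gaussian KDE. The point pattern below is synthetic; a network variant would replace Euclidean distance with shortest-path distance along the street network:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic point locations (e.g. facility POIs in planar coordinates)
rng = np.random.default_rng(1)
points = np.vstack([
    rng.normal(0.0, 1.0, size=(200, 2)),   # one broad cluster at the origin
    rng.normal(5.0, 0.5, size=(100, 2)),   # a tighter second cluster
])

# Planar Gaussian KDE; bandwidth chosen automatically by Scott's rule
kde = gaussian_kde(points.T)

# Evaluate the density surface on a regular grid
xs, ys = np.meshgrid(np.linspace(-3, 8, 50), np.linspace(-3, 8, 50))
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
```

The density is high inside the clusters and low between them, which is exactly the behavior that becomes misleading when the true "distance" between facilities is path distance along a street network.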
Kimura, Satoko; Akamatsu, Tomonari; Li, Songhai; Dong, Shouyue; Dong, Lijun; Wang, Kexiong; Wang, Ding; Arai, Nobuaki
2010-09-01
A method is presented to estimate the density of finless porpoises using stationary passive acoustic monitoring. The number of click trains detected by stereo acoustic data loggers (A-tags) was converted to an estimate of the density of porpoises. First, an automated off-line filter was developed to detect click trains among noise, and the detection and false-alarm rates were calculated. Second, a density estimation model was proposed. The cue-production rate was measured in biologging experiments. The probability of detecting a cue and the size of the detection area were calculated from the source level, beam patterns, and a sound-propagation model. The effect of group size on the cue-detection rate was examined. Third, the proposed model was applied to estimate the density of finless porpoises at four locations from the Yangtze River to the inside of Poyang Lake. The estimated mean daily density of porpoises decreased from the main stream to the lake. Long-term monitoring over 466 days from June 2007 to May 2009 showed variation in density from 0 to 4.79 porpoises/km². However, the density was below 1 porpoise/km² during 94% of the period. These results suggest a potential gap and seasonal migration of the population in the bottleneck of Poyang Lake.
KDE-Track: An Efficient Dynamic Density Estimator for Data Streams
Qahtan, Abdulhakim Ali Ali
2016-11-08
Recent developments in sensors, global positioning system devices and smart phones have increased the availability of spatiotemporal data streams. Developing models for mining such streams is challenged by the huge amount of data that cannot be stored in the memory, the high arrival speed and the dynamic changes in the data distribution. Density estimation is an important technique in stream mining for a wide variety of applications. The construction of kernel density estimators is well studied and documented. However, existing techniques are either expensive or inaccurate and unable to capture the changes in the data distribution. In this paper, we present a method called KDE-Track to estimate the density of spatiotemporal data streams. KDE-Track can efficiently estimate the density function with linear time complexity using interpolation on a kernel model, which is incrementally updated upon the arrival of new samples from the stream. We also propose an accurate and efficient method for selecting the bandwidth value for the kernel density estimator, which increases its accuracy significantly. Both theoretical analysis and experimental validation show that KDE-Track outperforms a set of baseline methods on the estimation accuracy and computing time of complex density structures in data streams.
Using gravity data to estimate the density of surface rocks of Taiwan region
Lo, Y. T.; Horng-Yen, Y.
2016-12-01
Surface rock density is one of the key parameters in the terrain correction step for obtaining a Bouguer anomaly map. In past work, the Bouguer anomaly map was obtained using a single average correction density over the whole study area. In this study, we instead estimate a correction density for each observation point; a correction density consistent with the surface geology improves the accuracy of the Bouguer anomaly map. Two statistical methods estimate the correction density from gravity data: the g-H relationship and the Nettleton density profile method. These methods share two advantages. First, the density is estimated from existing gravity observations, avoiding the need to measure rock density directly. Second, once the absolute gravity value, latitude, longitude, and elevation of each measuring point are stored in a database, that information can be combined with terrain data to calculate the average rock density over any area. In addition, the data for each measuring point and each terrain mesh cell are independent, so if more accurate gravity or terrain data become available, only the affected records need to be updated, without rebuilding the entire database. The resulting density distribution map broadly follows the geological divisions of Taiwan: the average density of the backbone mountain region is about 2.5-2.6 g/cm³, that of the eastern Central Mountain Range and Hsuehshan Range about 2.3-2.5 g/cm³, compared with 2.1-2.3 g/cm³ in the western foothills and 1.8-2.0 g/cm³ in the western plains.
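Nettleton's idea can be illustrated with a toy computation: among candidate correction densities, choose the one that minimizes the correlation between the corrected anomaly and the topography. All numbers below are synthetic, and the correction is reduced to a simple Bouguer slab term:

```python
import numpy as np

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2
rho_true = 2400.0                  # "true" surface density, kg/m^3 (assumed)
rng = np.random.default_rng(7)
h = rng.uniform(0.0, 1000.0, 200)  # station elevations, m

# Synthetic observed anomaly: Bouguer slab effect of topography plus noise
g_obs = 2 * np.pi * G * rho_true * h + 1e-6 * rng.standard_normal(h.size)

# Nettleton's criterion: the best correction density leaves a corrected
# anomaly that is uncorrelated with the topography
candidates = np.arange(1800.0, 2800.0, 50.0)
corrs = [abs(np.corrcoef(g_obs - 2 * np.pi * G * rho * h, h)[0, 1])
         for rho in candidates]
rho_best = candidates[int(np.argmin(corrs))]
```

With a wrong density, the residual still "sees" the hills; at the right density the residual is just noise, so its correlation with elevation collapses.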
An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index
DEFF Research Database (Denmark)
Dierckx, Goedele; Goegebeur, Yuri; Guillou, Armelle
2013-01-01
We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency...
Nonparametric estimate of spectral density functions of sample covariance matrices: A first step
2012-01-01
The density function of the limiting spectral distribution of general sample covariance matrices is usually unknown. We propose to use kernel estimators which are proved to be consistent. A simulation study is also conducted to show the performance of the estimators.
Technical Summary Objectives: Determine the effect of body mass index (BMI) on the accuracy of body density (Db) estimated with skinfold thickness (SFT) measurements compared to air displacement plethysmography (ADP) in adults. Subjects/Methods: We estimated Db with SFT and ADP in 131 healthy men an...
An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index
DEFF Research Database (Denmark)
Dierckx, G.; Goegebeur, Y.; Guillou, A.
2013-01-01
We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency and as… by a small simulation experiment involving both uncontaminated and contaminated samples.
Kocovsky, Patrick M.; Rudstam, Lars G.; Yule, Daniel L.; Warner, David M.; Schaner, Ted; Pientka, Bernie; Deller, John W.; Waterfield, Holly A.; Witzel, Larry D.; Sullivan, Patrick J.
2013-01-01
Standardized methods of data collection and analysis ensure quality and facilitate comparisons among systems. We evaluated the importance of three recommendations from the Standard Operating Procedure for hydroacoustics in the Laurentian Great Lakes (GLSOP) on density estimates of target species: noise subtraction; setting volume backscattering strength (Sv) thresholds from user-defined minimum target strength (TS) of interest (TS-based Sv threshold); and calculations of an index for multiple targets (Nv index) to identify and remove biased TS values. Eliminating noise had the predictable effect of decreasing density estimates in most lakes. Using the TS-based Sv threshold decreased fish densities in the middle and lower layers in the deepest lakes with abundant invertebrates (e.g., Mysis diluviana). Correcting for biased in situ TS increased measured density up to 86% in the shallower lakes, which had the highest fish densities. The current recommendations by the GLSOP significantly influence acoustic density estimates, but the degree of importance is lake dependent. Applying GLSOP recommendations, whether in the Laurentian Great Lakes or elsewhere, will improve our ability to compare results among lakes. We recommend further development of standards, including minimum TS and analytical cell size, for reducing the effect of biased in situ TS on density estimates.
Rojas-Lima, J. E.; Domínguez-Pacheco, A.; Hernández-Aguilar, C.; Cruz-Orea, A.
2016-09-01
Given the need for alternative photothermal approaches to characterize nonhomogeneous materials such as maize seeds, the objective of this work was to statistically analyze the amplitude variations of photopyroelectric signals, by means of nonparametric techniques such as the histogram and the kernel density estimator, and to obtain the probability density function of the amplitude variations for two maize genotypes with different pigmentations and structural components: crystalline and floury. The histogram was first computed to determine whether the probability density function had a known parametric form; it did not, so the kernel density estimator with a Gaussian kernel, with an efficiency of 95% in density estimation, was used to obtain the probability density function. The results indicated that maize seeds could be differentiated in terms of the statistics of the amplitude variations of the photopyroelectric signals: for floury and crystalline seeds respectively, the mean (93.11, 159.21), variance (1.64 × 10³, 1.48 × 10³), and standard deviation (40.54, 38.47) in the histogram approach. For the kernel density estimator, the seeds can be differentiated by the kernel bandwidth (smoothing constant) h of 9.85 and 6.09 for floury and crystalline seeds, respectively.
A method to estimate plant density and plant spacing heterogeneity: application to wheat crops.
Liu, Shouyang; Baret, Fred; Allard, Denis; Jin, Xiuliang; Andrieu, Bruno; Burger, Philippe; Hemmerlé, Matthieu; Comar, Alexis
2017-01-01
Plant density and its non-uniformity drive the competition among plants as well as with weeds, so they need to be estimated with small uncertainty. An optimal sampling method is proposed to estimate plant density in wheat crops from plant counts while reaching a given precision. Three experiments were conducted in 2014, resulting in 14 plots across varied sowing densities, cultivars, and environmental conditions. The coordinates of the plants along the row were measured on high-resolution RGB images taken from ground level. Results show that the spacings between consecutive plants along the row direction are independent and follow a gamma distribution under the varied conditions experienced. A gamma count model was then derived to define the optimal sample size required to estimate plant density with a given precision. Results suggest that measuring the length of segments containing 90 plants achieves a precision better than 10%, independently of the plant density. This approach is more efficient than the usual method based on fixed-length segments in which the number of plants is counted: there, the optimal length for a given precision depends on the actual plant density. The gamma count model parameters may also be used to quantify the heterogeneity of plant spacing along the row by exploiting the variability between replicated samples. Results show that to achieve 10% precision on the estimates of the two parameters of the gamma model, 200 elementary samples, each corresponding to the spacing between two consecutive plants, should be measured. This method provides an optimal sampling strategy to estimate plant density and quantify plant spacing heterogeneity along the row.
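The segment-based precision claim can be checked with a small simulation: if spacings are gamma distributed, the length of a segment containing 90 plants is a sum of gamma variables, and the resulting density estimate has a coefficient of variation below 10%. The gamma parameters below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
shape, mean_spacing = 2.0, 0.10       # gamma spacing model (assumed), metres
scale = mean_spacing / shape
true_density = 1.0 / mean_spacing     # plants per metre of row

n_plants = 90                         # plants counted per segment
n_reps = 2000
# Length of a segment containing n_plants plants = sum of n_plants spacings
lengths = rng.gamma(shape, scale, size=(n_reps, n_plants)).sum(axis=1)
density_est = n_plants / lengths      # one density estimate per segment

cv = density_est.std() / density_est.mean()   # relative precision
```

For a gamma sum of shape·n terms the theoretical CV is roughly 1/sqrt(shape·n) ≈ 7.5% here, consistent with the under-10% claim.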
Ago, Yukio; Tanaka, Tatsunori; Ota, Yuki; Kitamoto, Mari; Imoto, Emina; Takuma, Kazuhiro; Matsuda, Toshio
2014-08-15
Rearing in crowded conditions is a psychosocial stressor that affects biological functions. The effects of continuous crowding for many days have been studied, but those of crowding over a limited time have not. In this study, we examined the effects of night-time or daytime crowding over 2 weeks on behavior in adolescent and adult mice. Crowding (20 mice/cage) in either the night-time or daytime did not affect locomotor activity in the open field test or cognitive function in the fear conditioning test. In contrast, night-time crowding, but not daytime crowding, had an anxiolytic effect in the elevated plus-maze test and increased social interaction in adolescent mice, but not in adult mice. The first night-time, but not daytime, crowding increased plasma corticosterone levels in adolescent mice, although night-time crowding over 2 weeks did not affect the corticosterone levels. Furthermore, no significant effects of the first crowding were observed in adult mice. In a second crowding condition (six mice/small cage), the anxiolytic-like effects of night-time crowding and the change in plasma corticosterone levels were not observed, suggesting that the density of mice is not important for the behavioral consequences of crowding. Night-time crowding did not affect neurotrophic/growth factor levels and hippocampal neurogenesis in adolescent mice. These findings suggest that night-time crowding leads to anxiolytic-like behaviors in adolescent mice, and imply that night-time crowding stress in adolescence may be beneficial to brain functions.
Relationships among phenotypic traits of sweet corn and tolerance to crowding stress
Crowding stress tolerance is defined as the extent to which the crop maintains yield per unit area as plant population density increases beyond standard levels. Sweet corn (Zea mays L.) hybrids grown for processing vary widely in tolerance to crowding stress; however, the mechanisms involved in crow...
Glacial density and GIA in Alaska estimated from ICESat, GPS and GRACE measurements
Jin, Shuanggen; Zhang, T. Y.; Zou, F.
2017-01-01
The density of glacial volume change in Alaska is a key factor in estimating glacier mass loss from altimetry observations. However, the density of Alaskan glaciers has large uncertainty due to the lack of in situ measurements. In this paper, using measurements from the Ice, Cloud, and land Elevation Satellite (ICESat), the Global Positioning System (GPS), and the Gravity Recovery and Climate Experiment (GRACE) from 2003 to 2009, an optimal density of glacial volume change of 750 kg/m³ is estimated for the first time to fit the measurements. The glacier mass loss is -57.5 ± 6.5 Gt when converting the volumetric change from ICESat with the estimated density of 750 kg/m³. Based on the empirical relation, depth-density profiles are constructed, which show how glacial density varies with depth in Alaska. By separating the glacier mass loss from glacial isostatic adjustment (GIA) effects in GPS uplift rates and GRACE total water storage trends, the GIA uplift rates are estimated for Alaska. The best-fitting model consists of a 60 km elastic lithosphere and a 110 km thick asthenosphere with a viscosity of 2.0 × 10¹⁹ Pa s over a two-layer mantle.
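The volume-to-mass conversion underlying this result is simple arithmetic; the implied volume change below is back-calculated from the quoted mass loss and density, not a figure stated in the abstract:

```python
# Converting between altimetry-derived volume change and mass change
rho = 750.0                  # estimated density of volume change, kg/m^3
mass_change_gt = -57.5       # glacier mass loss from the paper, Gt

# Implied volume change in km^3 (1 Gt = 1e12 kg, 1 km^3 = 1e9 m^3)
volume_change_km3 = mass_change_gt * 1e12 / rho / 1e9
```

Note how sensitive the mass estimate is to the density assumption: using ice density (917 kg/m³) instead of 750 kg/m³ on the same volume would inflate the mass loss by over 20%.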
Estimating population density and connectivity of American mink using spatial capture-recapture
Fuller, Angela K.; Sutherland, Christopher S.; Royle, Andy; Hare, Matthew P.
2016-01-01
Estimating the abundance or density of populations is fundamental to the conservation and management of species, and as landscapes become more fragmented, maintaining landscape connectivity has become one of the most important challenges for biodiversity conservation. Yet these two issues have never been formally integrated together in a model that simultaneously models abundance while accounting for connectivity of a landscape. We demonstrate an application of using capture–recapture to develop a model of animal density using a least-cost path model for individual encounter probability that accounts for non-Euclidean connectivity in a highly structured network. We utilized scat detection dogs (Canis lupus familiaris) as a means of collecting non-invasive genetic samples of American mink (Neovison vison) individuals and used spatial capture–recapture models (SCR) to gain inferences about mink population density and connectivity. Density of mink was not constant across the landscape, but rather increased with increasing distance from city, town, or village centers, and mink activity was associated with water. The SCR model allowed us to estimate the density and spatial distribution of individuals across a 388 km2 area. The model was used to investigate patterns of space usage and to evaluate covariate effects on encounter probabilities, including differences between sexes. This study provides an application of capture–recapture models based on ecological distance, allowing us to directly estimate landscape connectivity. This approach should be widely applicable to provide simultaneous direct estimates of density, space usage, and landscape connectivity for many species.
Wavelet Optimal Estimations for Density Functions under Severely Ill-Posed Noises
Directory of Open Access Journals (Sweden)
Rui Li
2013-01-01
Motivated by the work of Lounici and Nickl (2011), this paper considers the problem of estimating a density f from an independent and identically distributed sample Y1, …, Yn drawn from g = f ∗ φ. We establish wavelet-optimal estimation of a density function over a Besov ball B^s_{r,q}(L) under L^p risk (1 ≤ p < ∞) in the presence of severely ill-posed noise. A wavelet linear estimator is first presented. We then prove a lower bound, which shows that our wavelet estimator is optimal; in other words, nonlinear wavelet estimation is not needed in this case. Our results extend theorems of Pensky and Vidakovic (1999) as well as Fan and Koo (2002).
Maccione, Alessandro; Garofalo, Matteo; Nieus, Thierry; Tedesco, Mariateresa; Berdondini, Luca; Martinoia, Sergio
2012-06-15
We used electrophysiological signals recorded by CMOS Micro Electrode Arrays (MEAs) at high spatial resolution to estimate the functional-effective connectivity of sparse hippocampal neuronal networks in vitro by applying a cross-correlation (CC) based method and ad hoc developed spatio-temporal filtering. Low-density cultures were recorded by a recently introduced CMOS-MEA device providing simultaneous multi-site acquisition at high-spatial (21 μm inter-electrode separation) as well as high-temporal resolution (8 kHz per channel). The method is applied to estimate functional connections in different cultures and it is refined by applying spatio-temporal filters that allow pruning of those functional connections not compatible with signal propagation. This approach permits to discriminate between possible causal influence and spurious co-activation, and to obtain detailed maps down to cellular resolution. Further, a thorough analysis of the links strength and time delays (i.e., amplitude and peak position of the CC function) allows characterizing the inferred interconnected networks and supports a possible discrimination of fast mono-synaptic propagations, and slow poly-synaptic pathways. By focusing on specific regions of interest we could observe and analyze microcircuits involving connections among a few cells. Finally, the use of the high-density MEA with low density cultures analyzed with the proposed approach enables to compare the inferred effective links with the network structure obtained by staining procedures.
Royle, J. Andrew; Chandler, Richard B.; Gazenski, Kimberly D.; Graves, Tabitha A.
2013-01-01
Population size and landscape connectivity are key determinants of population viability, yet no methods exist for simultaneously estimating density and connectivity parameters. Recently developed spatial capture–recapture (SCR) models provide a framework for estimating density of animal populations but thus far have not been used to study connectivity. Rather, all applications of SCR models have used encounter probability models based on the Euclidean distance between traps and animal activity centers, which implies that home ranges are stationary, symmetric, and unaffected by landscape structure. In this paper we devise encounter probability models based on “ecological distance,” i.e., the least-cost path between traps and activity centers, which is a function of both Euclidean distance and animal movement behavior in resistant landscapes. We integrate least-cost path models into a likelihood-based estimation scheme for spatial capture–recapture models in order to estimate population density and parameters of the least-cost encounter probability model. Therefore, it is possible to make explicit inferences about animal density, distribution, and landscape connectivity as it relates to animal movement from standard capture–recapture data. Furthermore, a simulation study demonstrated that ignoring landscape connectivity can result in negatively biased density estimators under the naive SCR model.
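The notion of "ecological distance" used above can be sketched as a least-cost path on a resistance grid, plugged into the standard half-normal encounter model p(d) = p0·exp(−d²/2σ²) common in SCR work. The grid, resistance values, and parameters below are hypothetical:

```python
import heapq
import numpy as np

def least_cost_distance(resistance, start, goal):
    """Dijkstra on a grid: moving into a cell costs that cell's resistance."""
    rows, cols = resistance.shape
    dist = np.full((rows, cols), np.inf)
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + resistance[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist[goal]

# A landscape where the middle column is costly to cross
res = np.ones((5, 5))
res[:, 2] = 10.0
d_eco = least_cost_distance(res, (2, 0), (2, 4))   # ecological distance

# Half-normal encounter probability on ecological rather than Euclidean
# distance (p0 and sigma are made-up illustration values)
sigma, p0 = 4.0, 0.5
p_encounter = p0 * np.exp(-d_eco**2 / (2 * sigma**2))
```

The Euclidean distance between the two cells is 4 grid units, but the costly middle column makes the ecological distance much larger, which is precisely the asymmetry between Euclidean and least-cost encounter models that the paper exploits.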
Effect of compression paddle tilt correction on volumetric breast density estimation.
Kallenberg, Michiel G J; van Gils, Carla H; Lokate, Mariëtte; den Heeten, Gerard J; Karssemeijer, Nico
2012-08-21
For the acquisition of a mammogram, a breast is compressed between a compression paddle and a support table. When compression is applied with a flexible compression paddle, the upper plate may be tilted, which results in variation in breast thickness from the chest wall to the breast margin. Paddle tilt has been recognized as a major problem in volumetric breast density estimation methods. In previous work, we developed a fully automatic method to correct the image for the effect of compression paddle tilt. In this study, we investigated in three experiments the effect of paddle tilt and its correction on volumetric breast density estimation. Results showed that paddle tilt considerably affected accuracy of volumetric breast density estimation, but that effect could be reduced by tilt correction. By applying tilt correction, a significant increase in correspondence between mammographic density estimates and measurements on MRI was established. We argue that in volumetric breast density estimation, tilt correction is both feasible and essential when mammographic images are acquired with a flexible compression paddle.
Guided crowd dynamics via modified social force model
Yang, Xiaoxia; Dong, Hairong; Wang, Qianling; Chen, Yao; Hu, Xiaoming
2014-10-01
Pedestrian dynamics is of great theoretical significance for strategy design of emergency evacuation. Modification of pedestrian dynamics based on the social force model is presented to better reflect pedestrians' behavioral characteristics in emergency. Specifically, the modified model can be used for guided crowd dynamics in large-scale public places such as subway stations and stadiums. This guided crowd model is validated by explicitly comparing its density-speed and density-flow diagrams with fundamental diagrams. Some social phenomena such as gathering, balance and conflicts are clearly observed in simulation, which further illustrate the effectiveness of the proposed modeling method. Also, time delay for pedestrians with time-dependent desired velocities is observed and explained using the established model in this paper. Furthermore, this guided crowd model is applied to the simulation system of Beijing South Railway Station for predictive evacuation experiments.
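A minimal social force step (a driving force toward a goal plus pairwise exponential repulsion) can be sketched as follows. This is a bare-bones Helbing-style model, not the paper's modified version, and all parameter values are assumed:

```python
import numpy as np

def social_force_step(pos, vel, goal, dt=0.05, tau=0.5, v0=1.3, A=2.0, B=0.3):
    """One explicit-Euler step of a minimal social force model.

    pos, vel: (N, 2) arrays; goal: (2,) point shared by all agents.
    tau: relaxation time; v0: desired speed; A, B: repulsion strength/range.
    """
    n = pos.shape[0]
    to_goal = goal - pos
    desired = v0 * to_goal / np.linalg.norm(to_goal, axis=1, keepdims=True)
    force = (desired - vel) / tau          # driving force toward the goal
    for i in range(n):                     # pairwise exponential repulsion
        diff = pos[i] - pos
        d = np.linalg.norm(diff, axis=1)
        mask = d > 0                       # exclude self-interaction
        force[i] += np.sum(A * np.exp(-d[mask, None] / B)
                           * diff[mask] / d[mask, None], axis=0)
    vel = vel + dt * force
    pos = pos + dt * vel
    return pos, vel

# Two pedestrians heading for the same exit
pos = np.array([[0.0, 0.0], [0.5, 0.0]])
vel = np.zeros((2, 2))
goal = np.array([10.0, 0.0])
for _ in range(100):
    pos, vel = social_force_step(pos, vel, goal)
```

Guidance, as in the paper, would enter through per-agent (possibly time-dependent) desired velocities rather than a single shared goal.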
Estimation of tiger densities in India using photographic captures and recaptures
Karanth, U.; Nichols, J.D.
1998-01-01
Previously applied methods for estimating tiger (Panthera tigris) abundance using total counts based on tracks have proved unreliable. In this paper we use a field method proposed by Karanth (1995), combining camera-trap photography to identify individual tigers based on stripe patterns with capture-recapture estimators. We developed a sampling design for camera-trapping and used the approach to estimate tiger population size and density in four representative tiger habitats in different parts of India. The field method worked well and provided data suitable for analysis using closed capture-recapture models. The results suggest the potential for applying this methodology for estimating abundances, survival rates and other population parameters in tigers and other low-density, secretive animal species with distinctive coat patterns or other external markings. Estimated probabilities of photo-capturing tigers present in the study sites ranged from 0.75 to 1.00. The estimated mean tiger densities ranged from 4.1 (SE = 1.31) to 11.7 (SE = 1.93) tigers/100 km². The results support the previous suggestions of Karanth and Sunquist (1995) that densities of tigers and other large felids may be primarily determined by prey community structure at a given site.
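The simplest closed-population capture-recapture estimator, the Lincoln-Petersen estimator in Chapman's bias-corrected form, illustrates the principle behind the richer closed models used in the paper. The camera-trap counts below are hypothetical:

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate.

    n1: individuals captured in session 1
    n2: individuals captured in session 2
    m2: individuals captured in both sessions (recaptures)
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical numbers: 20 tigers photographed in the first session,
# 18 in the second, 12 identified (by stripe pattern) in both
N_hat = chapman_estimate(20, 18, 12)
```

Dividing such an abundance estimate by the effectively sampled area then gives the density; the full closed models additionally handle heterogeneous and behaviorally varying capture probabilities.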
Estimating detection and density of the Andean cat in the high Andes
Reppucci, Juan; Gardner, Beth; Lucherini, Mauro
2011-01-01
The Andean cat (Leopardus jacobita) is one of the most endangered, yet least known, felids. Although the Andean cat is considered at risk of extinction, rigorous quantitative population studies are lacking. Because physical observations of the Andean cat are difficult to make in the wild, we used a camera-trapping array to photo-capture individuals. The survey was conducted in northwestern Argentina at an elevation of approximately 4,200 m during October–December 2006 and April–June 2007. In each year we deployed 22 pairs of camera traps, which were strategically placed. To estimate detection probability and density we applied models for spatial capture–recapture using a Bayesian framework. Estimated densities were 0.07 and 0.12 individuals/km² for 2006 and 2007, respectively. Mean baseline detection probability was estimated at 0.07. By comparison, densities of the Pampas cat (Leopardus colocolo), another poorly known felid that shares its habitat with the Andean cat, were estimated at 0.74–0.79 individuals/km² in the same study area for 2006 and 2007, and its detection probability was estimated at 0.02. Despite having greater detectability, the Andean cat is rarer in the study region than the Pampas cat. Properly accounting for the detection probability is important in making reliable estimates of density, a key parameter in conservation and management decisions for any species.
Crowding and the Educational Process
Baron, Reuben M.
1975-01-01
Important conceptual and practical problems are involved in specifying the conditions under which individuals or groups are likely to experience a sense of crowding. An ecological/environmental perspective can help educators determine humane space utilization. (Author/DW)
Mobility, Fertility, and Residential Crowding
Morris, Earl W.
1977-01-01
Regression analyses predicting fertility and mobility in a sample of a metropolitan county in New York State indicate that residential mobility serves to release the negative pressure that residential crowding might exert on fertility behavior. (Author)
Effects of tissue heterogeneity on the optical estimate of breast density
Taroni, Paola; Pifferi, Antonio; Quarto, Giovanna; Spinelli, Lorenzo; Torricelli, Alessandro; Abbate, Francesca; Balestreri, Nicola; Ganino, Serena; Menna, Simona; Cassano, Enrico; Cubeddu, Rinaldo
2012-01-01
Breast density is a recognized strong and independent risk factor for developing breast cancer. At present, breast density is assessed based on the radiological appearance of breast tissue, thus relying on the use of ionizing radiation. We have previously obtained encouraging preliminary results with our portable instrument for time domain optical mammography performed at 7 wavelengths (635–1060 nm). In that case, information was averaged over four images (cranio-caudal and oblique views of both breasts) available for each subject. In the present work, we tested the effectiveness of just one or few point measurements, to investigate if tissue heterogeneity significantly affects the correlation between optically derived parameters and mammographic density. Data show that parameters estimated through a single optical measurement correlate strongly with mammographic density estimated by using BIRADS categories. A central position is optimal for the measurement, but its exact location is not critical. PMID:23082283
[Estimation of Hunan forest carbon density based on spectral mixture analysis of MODIS data].
Yan, En-ping; Lin, Hui; Wang, Guang-xing; Chen, Zhen-xiong
2015-11-01
With the fast development of remote sensing technology, combining forest inventory sample plot data and remotely sensed images has become a widely used method to map forest carbon density. However, the existence of mixed pixels often impedes the improvement of forest carbon density mapping, especially when low spatial resolution images such as MODIS are used. In this study, MODIS images and national forest inventory sample plot data were used to estimate forest carbon density. Linear spectral mixture analysis with and without constraint, and nonlinear spectral mixture analysis, were compared to derive the fractions of different land use and land cover (LULC) types. Then the sequential Gaussian co-simulation algorithm, with and without the fraction images from the spectral mixture analyses, was employed to estimate the forest carbon density of Hunan Province. Results showed that 1) linear spectral mixture analysis with constraint, leading to a mean RMSE of 0.002, estimated the fractions of LULC types more accurately than unconstrained linear and nonlinear spectral mixture analyses; 2) integrating the spectral mixture analysis model and the sequential Gaussian co-simulation algorithm increased the estimation accuracy of forest carbon density from 74.1% to 81.5%, and decreased the RMSE from 7.26 to 5.18; and 3) the mean forest carbon density for the province was 30.06 t · hm⁻², ranging from 0.00 to 67.35 t · hm⁻². This implies that spectral mixture analysis has great potential to increase the estimation accuracy of forest carbon density at the regional and global levels.
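The constrained linear unmixing step described in this abstract can be sketched in a few lines. The endmember spectra and the mixed pixel below are illustrative values, not data from the study; the sum-to-one constraint is imposed in closed form via a Lagrange multiplier.

```python
import numpy as np

# Hypothetical endmember spectra (bands x classes): e.g. forest, crop, water.
# Values are illustrative only.
E = np.array([[0.05, 0.10, 0.02],
              [0.30, 0.25, 0.03],
              [0.45, 0.20, 0.02],
              [0.40, 0.35, 0.01]])

def unmix_constrained(pixel, E):
    """Linear spectral unmixing with a sum-to-one constraint on the
    fractions, solved in closed form via a Lagrange multiplier."""
    EtE_inv = np.linalg.inv(E.T @ E)
    f_ls = EtE_inv @ E.T @ pixel          # unconstrained least squares
    ones = np.ones(E.shape[1])
    lam = (1.0 - ones @ f_ls) / (ones @ EtE_inv @ ones)
    return f_ls + lam * (EtE_inv @ ones)  # enforce sum(f) == 1

# A mixed pixel composed of 60% / 30% / 10% of the three endmembers:
true_f = np.array([0.6, 0.3, 0.1])
pixel = E @ true_f
f = unmix_constrained(pixel, E)
print(np.round(f, 3))
```

On noise-free input the constrained solution recovers the true fractions; on real pixels the constraint keeps the fractions physically interpretable.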
LSTA, Rawane Samb
2010-01-01
This thesis deals with the nonparametric estimation of the density f of the regression error term E of the model Y = m(X) + E, assuming its independence from the covariate X. The difficulty of this study lies in the fact that the regression error E is not observed. In such a setup, it would be unwise, for estimating f, to use a conditional approach based upon the probability distribution function of Y given X. Indeed, this approach is affected by the curse of dimensionality, so that the resulting estimator of the residual term E would have a considerably slow rate of convergence if the dimension of X is very high. Two approaches are proposed in this thesis to avoid the curse of dimensionality. The first approach uses the estimated residuals, while the second integrates a nonparametric conditional density estimator of Y given X. While proceeding in this way circumvents the curse of dimensionality, a challenging issue is to evaluate the impact of the estimated residuals on the final estimator of the density f. We will also at...
Fast and accurate probability density estimation in large high dimensional astronomical datasets
Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.
2015-01-01
Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but it is usually implemented with multi-dimensional arrays, leading to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as that of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
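The BASH-table idea, binning backed by a hash table so that only occupied bins cost memory, can be sketched as follows. This is a Python analogue of the authors' C++ implementation; the bin width and synthetic data are illustrative assumptions.

```python
import numpy as np
from collections import defaultdict

def bash_density(points, bin_width):
    """Sparse histogram density estimate: bins live in a hash table
    keyed by integer bin coordinates, so memory grows with the number
    of OCCUPIED bins rather than bins_per_axis ** n_dims."""
    table = defaultdict(int)
    for p in points:
        key = tuple(np.floor(p / bin_width).astype(int))
        table[key] += 1
    n = len(points)
    vol = bin_width ** points.shape[1]  # volume of one bin

    def density(q):
        """Density at query point q = count in its bin / (N * bin volume)."""
        key = tuple(np.floor(q / bin_width).astype(int))
        return table[key] / (n * vol)

    return density, table

rng = np.random.default_rng(0)
pts = rng.normal(size=(10000, 5))        # 10k points in 5 dimensions
density, table = bash_density(pts, 1.0)
print(len(table))                        # occupied bins only
```

A dense 5-D array covering the same range would allocate every bin up front; here the table holds at most one entry per data point.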
Estimation Prospects of the Source Number Density of Ultra-high-energy Cosmic Rays
Takami, Hajime; Sato, Katsuhiko
2007-01-01
We discuss the possibility of accurately estimating the source number density of ultra-high-energy cosmic rays (UHECRs) using small-scale anisotropy in their arrival distribution. The arrival distribution has information on their source and source distribution. We calculate the propagation of UHE protons in a structured extragalactic magnetic field (EGMF) and simulate their arrival distribution at the Earth using our previously developed method. The source number density that can best reprodu...
Bulk density estimation using a 3-dimensional image acquisition and analysis system
Directory of Open Access Journals (Sweden)
Heyduk Adam
2016-01-01
The paper presents a concept of dynamic bulk density estimation of a particulate matter stream using a 3-d image analysis system and a conveyor belt scale. A method of image acquisition should be adjusted to the type of scale. The paper presents some laboratory results of static bulk density measurements using the MS Kinect time-of-flight camera and OpenCV/Matlab software. Measurements were made for several different size classes.
Statistical Analysis of the Spectral Density Estimate Obtained via Coifman Scaling Function
2007-01-01
The spectral density, defined as the Fourier transform of the covariance sequence of a stationary random process, determines the characteristics of the process and facilitates analysis of its structure. Thus, one of the main problems in time series analysis is constructing consistent estimates of the spectral density from successive observations of a stationary random process taken at equal intervals of time. This article is devoted to the investigation of problems dealing with the application of wavelet anal...
Estimate of the density of Eucalyptus grandis W. Hill ex Maiden using near infrared spectroscopy
Directory of Open Access Journals (Sweden)
Silviana Rosso
2013-12-01
This study aimed to analyze the use of near infrared spectroscopy (NIRS) to estimate the wood density of Eucalyptus grandis. For that, 66 27-year-old trees were logged and central planks were removed from each log. Test pieces 2.5 x 2.5 x 5.0 cm in size were removed from the base of each plank, in the pith-bark direction, and subjected to determination of bulk and basic density at 12% moisture (dry basis), followed by spectral readings in the radial, tangential and transverse directions using a Bruker Tensor 37 infrared spectrophotometer. The calibration to estimate wood density was developed based on the matrix of spectra obtained from the radial face, containing 216 samples. The partial least squares regression to estimate bulk wood density of Eucalyptus grandis provided a coefficient of determination of validation of 0.74 and a ratio performance deviation of 2.29. Statistics relating to the predictive models had adequate magnitudes for estimating wood density from unknown samples, indicating that the above technique has potential for use in replacement of conventional testing.
Estimating the amount and distribution of radon flux density from the soil surface in China.
Zhuo, Weihai; Guo, Qiuju; Chen, Bo; Cheng, Guan
2008-07-01
Based on an idealized model, both the annual and the seasonal radon (²²²Rn) flux densities from the soil surface at 1099 sites in China were estimated by linking a database of soil ²²⁶Ra content and a global ecosystems database. Digital maps of the ²²²Rn flux density in China were constructed at a spatial resolution of 25 km × 25 km by interpolation among the estimated data. An area-weighted annual average ²²²Rn flux density from the soil surface across China was estimated to be 29.7 ± 9.4 mBq m⁻² s⁻¹. Both regional and seasonal variations in the ²²²Rn flux densities are significant in China. Annual average flux densities in southeastern and northwestern China are generally higher than those in other regions of China, because of the high soil ²²⁶Ra content in the southeastern area and the high soil aridity in the northwestern one. The seasonal average flux density is generally higher in summer/spring than in winter, since relatively higher soil temperature and lower soil water saturation in summer/spring than in other seasons are common in China.
PEDO-TRANSFER FUNCTIONS FOR ESTIMATING SOIL BULK DENSITY IN CENTRAL AMAZONIA
Directory of Open Access Journals (Sweden)
Henrique Seixas Barros
2015-04-01
Under field conditions in the Amazon forest, soil bulk density is difficult to measure. Rigorous methodological criteria must be applied to obtain reliable inventories of C stocks and soil nutrients, making this process expensive and sometimes unfeasible. This study aimed to generate models to estimate soil bulk density based on parameters that can be easily and reliably measured in the field and that are available in many soil-related inventories. Stepwise regression models to predict bulk density were developed using data on soil C content, clay content and pH in water from 140 permanent plots in terra firme (upland) forests near Manaus, Amazonas State, Brazil. The model results were interpreted according to the coefficient of determination (R2) and Akaike information criterion (AIC) and were validated with a dataset consisting of 125 plots different from those used to generate the models. The model with best performance in estimating soil bulk density under the conditions of this study included clay content and pH in water as independent variables and had R2 = 0.73 and AIC = -250.29. The performance of this model for predicting soil density was compared with that of models from the literature. The results showed that the locally calibrated equation was the most accurate for estimating soil bulk density for upland forests in the Manaus region.
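As a sketch of the kind of pedo-transfer function described here, the snippet below fits a bulk-density model with clay content and pH in water as predictors by ordinary least squares. The soil values are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical plot data: clay content (%), pH in water, and measured
# bulk density (Mg m^-3). Values are illustrative only.
clay = np.array([55., 62., 70., 48., 66., 58., 73., 51.])
ph   = np.array([4.1, 4.3, 3.9, 4.5, 4.0, 4.2, 3.8, 4.4])
bd   = np.array([0.95, 0.88, 0.80, 1.02, 0.84, 0.91, 0.78, 0.99])

# Fit bd = b0 + b1*clay + b2*pH, mirroring the final model form
# (clay content + pH in water) reported in the abstract.
X = np.column_stack([np.ones_like(clay), clay, ph])
coef, *_ = np.linalg.lstsq(X, bd, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((bd - pred) ** 2) / np.sum((bd - bd.mean()) ** 2)
print(np.round(coef, 3), round(r2, 2))
```

In practice the fitted equation would then be validated on held-out plots, as the authors do with their 125 independent plots.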
The walking behaviour of pedestrian social groups and its impact on crowd dynamics
Moussaid, Mehdi; Garnier, Simon; Helbing, Dirk; Theraulaz, Guy
2010-01-01
Human crowd motion is mainly driven by self-organized processes based on local interactions among pedestrians. While most studies of crowd behavior consider only interactions among isolated individuals, it turns out that up to 70% of people in a crowd are actually moving in groups, such as friends, couples, or families walking together. These groups constitute medium-scale aggregated structures and their impact on crowd dynamics is still largely unknown. In this work, we analyze the motion of approximately 1500 pedestrian groups under natural conditions, and show that social interactions among group members generate typical group walking patterns that influence crowd dynamics. At low density, group members tend to walk side by side, forming a line perpendicular to the walking direction. As the density increases, however, the linear walking formation is bent forward, turning it into a V-like pattern. These spatial patterns can be well described by a model based on social communication between group members. We ...
Williams, C R; Johnson, P H; Ball, T S; Ritchie, S A
2013-09-01
New mosquito control strategies centred on the modification of populations require knowledge of existing population densities at release sites and an understanding of breeding site ecology. Using a quantitative pupal survey method, we investigated production of the dengue vector Aedes aegypti (L.) (Stegomyia aegypti) (Diptera: Culicidae) in Cairns, Queensland, Australia, and found that garden accoutrements represented the most common container type. Deliberately placed 'sentinel' containers were set at seven houses and sampled for pupae over 10 weeks during the wet season. Pupal production was approximately constant; tyres and buckets represented the most productive container types. Sentinel tyres produced the largest female mosquitoes, but were relatively rare in the field survey. We then used field-collected data to make estimates of per premises population density using three different approaches. Estimates of female Ae. aegypti abundance per premises made using the container-inhabiting mosquito simulation (CIMSiM) model [95% confidence interval (CI) 18.5-29.1 females] agreed reasonably well with estimates obtained using a standing crop calculation based on pupal collections (95% CI 8.8-22.5) and using BG-Sentinel traps and a sampling rate correction factor (95% CI 6.2-35.2). By first describing local Ae. aegypti productivity, we were able to compare three separate population density estimates which provided similar results. We anticipate that this will provide researchers and health officials with several tools with which to make estimates of population densities.
Marques, Tiago A; Thomas, Len; Ward, Jessica; DiMarzio, Nancy; Tyack, Peter L
2009-04-01
Methods are developed for estimating the size/density of cetacean populations using data from a set of fixed passive acoustic sensors. The methods convert the number of detected acoustic cues into animal density by accounting for (i) the probability of detecting cues, (ii) the rate at which animals produce cues, and (iii) the proportion of false positive detections. Additional information is often required for estimation of these quantities, for example, from an acoustic tag applied to a sample of animals. Methods are illustrated with a case study: estimation of Blainville's beaked whale density over a 6 day period in spring 2005, using an 82 hydrophone wide-baseline array located in the Tongue of the Ocean, Bahamas. To estimate the required quantities, additional data are used from digital acoustic tags, attached to five whales over 21 deep dives, where cues recorded on some of the dives are associated with those received on the fixed hydrophones. Estimated density was 25.3 or 22.5 animals/1000 km(2), depending on assumptions about false positive detections, with 95% confidence intervals 17.3-36.9 and 15.4-32.9. These methods are potentially applicable to a wide variety of marine and terrestrial species that are hard to survey using conventional visual methods.
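The conversion from detected acoustic cues to animal density described above can be sketched as a single formula. The function name and every numeric input below are illustrative assumptions, not the study's actual values.

```python
import math

def cue_density(n_cues, false_pos_rate, n_sensors, max_range_km,
                p_detect, hours, cue_rate_per_hour):
    """Cue-counting density estimator: discount false positives, then
    divide the remaining cues by the effective area monitored, the
    recording time, and the per-animal cue production rate.
    Returns animals per km^2."""
    true_cues = n_cues * (1.0 - false_pos_rate)
    # Effective area = sensors * circle of max detection range,
    # scaled by the average probability of detecting a cue within it.
    area_km2 = n_sensors * math.pi * max_range_km ** 2 * p_detect
    return true_cues / (area_km2 * hours * cue_rate_per_hour)

# Illustrative inputs only (not the study's estimates):
d = cue_density(n_cues=20000, false_pos_rate=0.2, n_sensors=82,
                max_range_km=6.0, p_detect=0.5, hours=144,
                cue_rate_per_hour=1500.0)
print(round(d * 1000, 3), "animals per 1000 km^2")
```

The auxiliary quantities (detection probability, cue rate, false-positive proportion) are exactly the ones the abstract says must come from additional data such as acoustic tags.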
High-order ionospheric effects on electron density estimation from Fengyun-3C GPS radio occultation
Li, Junhai; Jin, Shuanggen
2017-03-01
GPS radio occultation can estimate ionospheric electron density and total electron content (TEC) with high spatial resolution, e.g., China's recent Fengyun-3C GPS radio occultation. However, high-order ionospheric delays are normally ignored. In this paper, the high-order ionospheric effects on electron density estimation from the Fengyun-3C GPS radio occultation data are estimated and investigated using the NeQuick2 ionosphere model and the IGRF12 (International Geomagnetic Reference Field, 12th generation) geomagnetic model. Results show that the high-order ionospheric delays have large effects on electron density estimation with up to 800 el cm-3, which should be corrected in high-precision ionospheric density estimation and applications. The second-order ionospheric effects are more significant, particularly at 250-300 km, while third-order ionospheric effects are much smaller. Furthermore, the high-order ionospheric effects are related to the location, the local time, the radio occultation azimuth and the solar activity. The large high-order ionospheric effects are found in the low-latitude area and in the daytime as well as during strong solar activities. The second-order ionospheric effects have a maximum positive value when the radio occultation azimuth is around 0-20°, and a maximum negative value when the radio occultation azimuth is around -180 to -160°. Moreover, the geomagnetic storm also affects the high-order ionospheric delay, which should be carefully corrected.
Brassine, Eléanor; Parker, Daniel
2015-01-01
Camera trapping studies have become increasingly popular to produce population estimates of individually recognisable mammals. Yet, monitoring techniques for rare species which occur at extremely low densities are lacking. Additionally, species which have unpredictable movements may make obtaining reliable population estimates challenging due to low detectability. Our study explores the effectiveness of intensive camera trapping for estimating cheetah (Acinonyx jubatus) numbers. Using both a more traditional, systematic grid approach and pre-determined, targeted sites for camera placement, the cheetah population of the Northern Tuli Game Reserve, Botswana was sampled between December 2012 and October 2013. Placement of cameras in a regular grid pattern yielded very few (n = 9) cheetah images and these were insufficient to estimate cheetah density. However, pre-selected cheetah scent-marking posts provided 53 images of seven adult cheetahs (0.61 ± 0.18 cheetahs/100 km²). While increasing the length of the camera trapping survey from 90 to 130 days increased the total number of cheetah images obtained (from 53 to 200), no new individuals were recorded and the estimated population density remained stable. Thus, our study demonstrates that targeted camera placement (irrespective of survey duration) is necessary for reliably assessing cheetah densities where populations are naturally very low or dominated by transient individuals. Significantly our approach can easily be applied to other rare predator species.
A hierarchical model for estimating density in camera-trap studies
Royle, J. Andrew; Nichols, James D.; Karanth, K.Ullas; Gopalaswamy, Arjun M.
2009-01-01
Estimating animal density using capture–recapture data from arrays of detection devices such as camera traps has been problematic due to the movement of individuals and heterogeneity in capture probability among them induced by differential exposure to trapping. We develop a spatial capture–recapture model for estimating density from camera-trapping data which contains explicit models for the spatial point process governing the distribution of individuals and their exposure to and detection by traps. We adopt a Bayesian approach to analysis of the hierarchical model using the technique of data augmentation. The model is applied to photographic capture–recapture data on tigers Panthera tigris in Nagarahole reserve, India. Using this model, we estimate the density of tigers to be 14.3 animals per 100 km² during 2004. Synthesis and applications: Our modelling framework largely overcomes several weaknesses in conventional approaches to the estimation of animal density from trap arrays. It effectively deals with key problems such as individual heterogeneity in capture probabilities, movement of traps, presence of potential 'holes' in the array and ad hoc estimation of sample area. The formulation, thus, greatly enhances flexibility in the conduct of field surveys as well as in the analysis of data, from studies that may involve physical, photographic or DNA-based 'captures' of individual animals.
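A core ingredient of such spatial capture–recapture models is an encounter model in which detection probability decays with the distance between an animal's activity center and a trap. A minimal sketch, assuming a half-normal kernel with illustrative parameters (p0 and sigma are not values from the paper):

```python
import math

def detection_prob(p0, sigma, activity_center, trap):
    """Half-normal encounter model commonly used in spatial
    capture-recapture: baseline detection probability p0 decays
    with squared distance from the animal's activity center."""
    d2 = sum((a - t) ** 2 for a, t in zip(activity_center, trap))
    return p0 * math.exp(-d2 / (2.0 * sigma ** 2))

# An animal centered at the origin is seen more often by a near trap:
p_near = detection_prob(0.3, 1.5, (0.0, 0.0), (0.5, 0.0))
p_far = detection_prob(0.3, 1.5, (0.0, 0.0), (5.0, 0.0))
print(round(p_near, 3), round(p_far, 4))
```

It is this distance-dependence that lets the model separate true density from the heterogeneity in capture probability caused by differential exposure to the trap array.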
Estimation of current density distribution of PAFC by analysis of cell exhaust gas
Energy Technology Data Exchange (ETDEWEB)
Kato, S.; Seya, A. [Fuji Electric Co., Ltd., Ichihara-shi (Japan); Asano, A. [Fuji Electric Corporate, Ltd., Yokosuka-shi (Japan)
1996-12-31
Estimating the distributions of current densities, voltages, gas concentrations, etc., in phosphoric acid fuel cell (PAFC) stacks is very important for producing fuel cells of higher quality. In this work, we have developed a numerical simulation tool to map out these distributions in a PAFC stack. In particular, to study the current density distribution in the reaction area of the cell, we analyzed gas composition at several positions inside a gas outlet manifold of the PAFC stack. By comparing these measured data with calculated data, the current density distribution in a cell plane calculated by the simulation was verified.
Scent Lure Effect on Camera-Trap Based Leopard Density Estimates.
Directory of Open Access Journals (Sweden)
Alexander Richard Braczkowski
Density estimates for large carnivores derived from camera surveys often have wide confidence intervals due to low detection rates. Such estimates are of limited value to authorities, which require precise population estimates to inform conservation strategies. Using lures can potentially increase detection, improving the precision of estimates. However, by altering the spatio-temporal patterning of individuals across the camera array, lures may violate closure, a fundamental assumption of capture-recapture. Here, we test the effect of scent lures on the precision and veracity of density estimates derived from camera-trap surveys of a protected African leopard population. We undertook two surveys (a 'control' and a 'treatment' survey) on Phinda Game Reserve, South Africa. Survey design remained consistent except that a scent lure was applied at camera-trap stations during the treatment survey. Lures did not affect the maximum movement distances (p = 0.96) or temporal activity of female (p = 0.12) or male leopards (p = 0.79), and the assumption of geographic closure was met for both surveys (p > 0.05). The numbers of photographic captures were also similar for control and treatment surveys (p = 0.90). Accordingly, density estimates were comparable between surveys, although estimates derived using non-spatial methods (7.28-9.28 leopards/100 km²) were considerably higher than estimates from spatially explicit methods (3.40-3.65 leopards/100 km²). The precision of estimates from the control and treatment surveys was also comparable, and this applied to both non-spatial and spatial methods of estimation. Our findings suggest that, at least in the context of leopard research in productive habitats, the use of lures is not warranted.
Retrieval of mesospheric electron densities using an optimal estimation inverse method
Grant, J.; Grainger, R. G.; Lawrence, B. N.; Fraser, G. J.; von Biel, H. A.; Heuff, D. N.; Plank, G. E.
2004-03-01
We present a new method to determine mesospheric electron densities from partially reflected medium frequency radar pulses. The technique uses an optimal estimation inverse method and retrieves both an electron density profile and a gradient electron density profile. As well as accounting for the absorption of the two magnetoionic modes formed by ionospheric birefringence of each radar pulse, the forward model of the retrieval parameterises possible Fresnel scatter of each mode by fine electronic structure, phase changes of each mode due to Faraday rotation and the dependence of the amplitudes of the backscattered modes upon pulse width. Validation results indicate that known profiles can be retrieved and that χ2 tests upon retrieval parameters satisfy validity criteria. Application to measurements shows that retrieved electron density profiles are consistent with accepted ideas about seasonal variability of electron densities and their dependence upon nitric oxide production and transport.
Crowding increases salivary cortisol but not self-directed behavior in captive baboons.
Pearson, Brandon L; Reeder, DeeAnn M; Judge, Peter G
2015-04-01
Reduced space can lead to crowding in social animals. Crowding increases the risk of agonistic interactions that, in turn, may require additional physiological defensive coping mechanisms affecting health. To determine the stress induced from increased social density in a group of nineteen baboons living in an indoor/outdoor enclosure, saliva cortisol levels and rates of anxiety-related behavior were analyzed across two unique crowding episodes. Initially, mean salivary cortisol levels when animals were restricted to their indoor quarters were compared to those when they also had access to their larger outdoor enclosure. Then, mean cortisol levels were compared before, during, and after two distinct crowding periods of long and short duration. Crowding resulted in significantly elevated cortisol during crowding periods compared to non-crowded periods. Cortisol levels returned to baseline following two crowding episodes contrasting in their length and ambient climate conditions. These cortisol elevations indicate greater metabolic costs of maintaining homeostasis under social stress resulting from reduced space. Self-directed behavior, conversely, was not reliably elevated during crowding. Results suggest that the potential for negative social interactions, and/or the uncertainty associated with social threat can cause physiological stress responses detected by salivary cortisol. Self-directed behavioral measures of stress may constitute inadequate indicators of social stress in colony-housed monkeys or represent subjective emotional arousal unrelated to hypothalamic-pituitary adrenal axis activation.
Variational estimation of the drift for stochastic differential equations from the empirical density
Batz, Philipp; Ruttor, Andreas; Opper, Manfred
2016-08-01
We present a method for the nonparametric estimation of the drift function of certain types of stochastic differential equations from the empirical density. It is based on a variational formulation of the Fokker-Planck equation. The minimization of an empirical estimate of the variational functional using kernel based regularization can be performed in closed form. We demonstrate the performance of the method on second order, Langevin-type equations and show how the method can be generalized to other noise models.
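The relation underlying such estimators can be sketched for the simplest one-dimensional, equilibrium case (this is a standard identity, not the paper's full variational construction). The stationary Fokker-Planck equation for $dX = f(X)\,dt + \sigma\,dW$ reads

```latex
0 = -\frac{d}{dx}\bigl[f(x)\,p(x)\bigr] + \frac{\sigma^2}{2}\,\frac{d^2 p(x)}{dx^2},
```

and integrating once under a zero-flux boundary condition gives

```latex
f(x)\,p(x) = \frac{\sigma^2}{2}\,\frac{dp(x)}{dx}
\quad\Longrightarrow\quad
f(x) = \frac{\sigma^2}{2}\,\frac{d}{dx}\ln p(x),
```

so the drift follows directly from the stationary density $p$. The kernel-based variational formulation in the paper regularizes this relation when $p$ is only available as an empirical density estimated from samples.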
Application of Density Estimation Methods to Datasets Collected From a Glider
2015-09-30
... buoyancy. The methodology employed in this study to estimate population density of marine mammals is based on the works of Zimmer et al. (2008), Marques ... estimation modalities (Thomas and Marques, 2012), such as individual or group counting. In this sense, bearings to received sounds on both hydrophones will ... the sea trial. [Figure 2, left: image showing the area of the REP14-MED sea trial (red box) in the context of the Western Mediterranean Sea.]
Shi, Xiaomeng; Ye, Zhirui; Shiwakoti, Nirajan; Tang, Dounan; Wang, Chao; Wang, Wei
2016-10-01
A recent crowd stampede during a New Year's Eve celebration in Shanghai, China resulted in 36 fatalities and over 49 serious injuries. Many such tragic crowd accidents around the world have resulted from complex multi-directional crowd movement such as merging behavior. Although there are a few studies on merging crowd behavior, none of them has conducted a systematic analysis considering the impact of both merging angle and flow direction on the safety of pedestrian crowd movement. In this study, a series of controlled laboratory experiments was conducted to examine the safety constraints of merging pedestrian crowd movements considering merging angle (60°, 90° and 180°) and flow direction under slow-running and blocked-vision conditions. Then, macroscopic and microscopic properties of crowd dynamics are obtained and visualized through the analysis of pedestrian crowd trajectory data derived from video footage. It was found that merging angle had a significant influence on the fluctuations of pedestrian flows, which is important in a critical situation such as an emergency evacuation. As the merging angle increased, mean velocity and mean flow at the measuring region in the exit corridors decreased, while mean density increased. A similar trend was observed for the number of weaving and overtaking conflicts, which resulted in an increase of mean headway. Further, flow direction had a significant impact on the outflow of individuals, while blocked vision had an influence on pedestrian crowd interactions and the merging process. Finally, this paper discusses safety assessments of crowd merging behaviors along with some recommendations for future research. Findings from this study can assist in the development and validation of pedestrian crowd simulation models as well as the organization and control of crowd events.
Breast percent density estimation from 3D reconstructed digital breast tomosynthesis images
Bakic, Predrag R.; Kontos, Despina; Carton, Ann-Katherine; Maidment, Andrew D. A.
2008-03-01
Breast density is an independent factor of breast cancer risk. In mammograms breast density is quantitatively measured as percent density (PD), the percentage of dense (non-fatty) tissue. To date, clinical estimates of PD have varied significantly, in part due to the projective nature of mammography. Digital breast tomosynthesis (DBT) is a 3D imaging modality in which cross-sectional images are reconstructed from a small number of projections acquired at different x-ray tube angles. Preliminary studies suggest that DBT is superior to mammography in tissue visualization, since superimposed anatomical structures present in mammograms are filtered out. We hypothesize that DBT could also provide a more accurate breast density estimation. In this paper, we propose to estimate PD from reconstructed DBT images using a semi-automated thresholding technique. Preprocessing is performed to exclude the image background and the area of the pectoral muscle. Threshold values are selected manually from a small number of reconstructed slices; a combination of these thresholds is applied to each slice throughout the entire reconstructed DBT volume. The proposed method was validated using images of women with recently detected abnormalities or with biopsy-proven cancers; only contralateral breasts were analyzed. The Pearson correlation and kappa coefficients between the breast density estimates from DBT and the corresponding digital mammogram indicate moderate agreement between the two modalities, comparable with our previous results from 2D DBT projections. Percent density appears to be a robust measure for breast density assessment in both 2D and 3D x-ray breast imaging modalities using thresholding.
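The slice-wise thresholding scheme described in this abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation; the array shapes, the `breast_mask` input and the per-slice threshold list are assumptions:

```python
import numpy as np

def percent_density(volume, breast_mask, thresholds):
    """Slice-wise percent density (PD) for a reconstructed DBT volume.

    volume:      (slices, rows, cols) reconstructed intensities
    breast_mask: boolean array of the same shape, True inside the
                 breast (background and pectoral muscle excluded)
    thresholds:  one dense-tissue threshold per slice, e.g. chosen
                 manually on a few slices and propagated
    """
    dense = 0
    total = 0
    for slice_img, mask, t in zip(volume, breast_mask, thresholds):
        total += mask.sum()                       # breast voxels
        dense += ((slice_img >= t) & mask).sum()  # dense (non-fatty) voxels
    return 100.0 * dense / total

# Tiny synthetic example: one 2x2 slice, threshold 2.0
vol = np.array([[[0.0, 1.0], [2.0, 3.0]]])
mask = np.ones_like(vol, dtype=bool)
pd = percent_density(vol, mask, [2.0])
```

With per-slice thresholds propagated through the volume, PD reduces to the dense-voxel fraction of the segmented breast.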
Crowding of molecular motors determines microtubule depolymerization
Reese, Louis; Frey, Erwin
2011-01-01
Assembly and disassembly dynamics of microtubules (MTs) is tightly controlled by MT associated proteins. Here, we investigate how plus-end-directed depolymerases of the kinesin-8 family regulate MT depolymerization dynamics. Employing an individual-based model, we reproduce experimental findings. Moreover, crowding is identified as the key regulatory mechanism of depolymerization dynamics. Our analysis gives two qualitatively distinct regimes. For motor densities above a particular threshold, a macroscopic traffic jam emerges at the plus-end and the MT dynamics become independent of the motor concentration. Below this threshold, microscopic traffic jams at the tip arise which cancel out the effect of the depolymerization kinetics such that the depolymerization speed is solely determined by the motor density. Because this density changes over the MT length, length-dependent regulation is possible. Remarkably, motor cooperativity does not affect the depolymerization speed but only the end-residence time of depo...
Estimating abundance and density of Amur tigers along the Sino-Russian border.
Xiao, Wenhong; Feng, Limin; Mou, Pu; Miquelle, Dale G; Hebblewhite, Mark; Goldberg, Joshua F; Robinson, Hugh S; Zhao, Xiaodan; Zhou, Bo; Wang, Tianming; Ge, Jianping
2016-07-01
As an apex predator, the Amur tiger (Panthera tigris altaica) could play a pivotal role in maintaining the integrity of forest ecosystems in Northeast Asia. Due to habitat loss and harvest over the past century, tigers rapidly declined in China and are now restricted to the Russian Far East and bordering habitat in nearby China. To facilitate restoration of the tiger in its historical range, reliable estimates of population size are essential to assess the effectiveness of conservation interventions. Here we used camera trap data collected in Hunchun National Nature Reserve from April to June 2013 and 2014 to estimate tiger density and abundance using both maximum likelihood and Bayesian spatially explicit capture-recapture (SECR) methods. A minimum of 8 individuals were detected in both sample periods, and the documentation of marking behavior and reproduction suggests the presence of a resident population. Using Bayesian SECR modeling within the 11,400 km² state space, density estimates were 0.33 and 0.40 individuals/100 km² in 2013 and 2014, respectively, corresponding to an estimated abundance of 38 and 45 animals for this transboundary Sino-Russian population. In a maximum likelihood framework, we estimated densities of 0.30 and 0.24 individuals/100 km², corresponding to abundances of 34 and 27, in 2013 and 2014, respectively. These density estimates are comparable to other published estimates for resident Amur tiger populations in the Russian Far East. This study reveals promising signs of tiger recovery in Northeast China, and demonstrates the importance of connectivity between the Russian and Chinese populations for recovering tigers in Northeast China.
Goldenshluger, Alexander
2010-01-01
We address the problem of density estimation with $L_p$-loss by selection of kernel estimators. We develop a selection procedure and derive corresponding $L_p$-risk oracle inequalities. It is shown that the proposed selection rule leads to the minimax estimator that is adaptive over a scale of the anisotropic Nikol'ski classes. The main technical tools used in our derivations are uniform bounds on the $L_p$-norms of empirical processes developed recently in Goldenshluger and Lepski (2010).
Multi-objective mixture-based iterated density estimation evolutionary algorithms
Thierens, D.; Bosman, P.A.N.
2001-01-01
We propose an algorithm for multi-objective optimization using a mixture-based iterated density estimation evolutionary algorithm (MIDEA). The MIDEA algorithm is a probabilistic model-building evolutionary algorithm that constructs at each generation a mixture of factorized probability…
ks: Kernel Density Estimation and Kernel Discriminant Analysis for Multivariate Data in R
Directory of Open Access Journals (Sweden)
Tarn Duong
2007-09-01
Kernel smoothing is one of the most widely used non-parametric data smoothing techniques. We introduce a new R package ks for multivariate kernel smoothing. Currently it contains functionality for kernel density estimation and kernel discriminant analysis. It is a comprehensive package for bandwidth matrix selection, implementing a wide range of data-driven diagonal and unconstrained bandwidth selectors.
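As a sketch of the kind of estimator ks implements (here in Python rather than R, and with a hand-picked bandwidth matrix H in place of the data-driven selectors that are the package's real contribution):

```python
import numpy as np

def gaussian_kde(points, H):
    """Multivariate Gaussian kernel density estimate with an
    unconstrained (full) bandwidth matrix H."""
    n, d = points.shape
    H_inv = np.linalg.inv(H)
    norm = 1.0 / (n * np.sqrt((2.0 * np.pi) ** d * np.linalg.det(H)))

    def pdf(x):
        diff = x - points                                   # (n, d)
        quad = np.einsum('ni,ij,nj->n', diff, H_inv, diff)  # Mahalanobis terms
        return norm * np.exp(-0.5 * quad).sum()

    return pdf

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 2))                  # N(0, I) sample
f = gaussian_kde(data, H=np.array([[0.3, 0.0],
                                   [0.0, 0.3]]))
```

In ks itself, H would come from a plug-in or cross-validation bandwidth selector (diagonal or fully unconstrained), which is the hard part the package automates.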
Gao, Nuo; Zhu, S A; He, Bin
2005-06-01
We have developed a new algorithm for magnetic resonance electrical impedance tomography (MREIT), which uses only one component of the magnetic flux density to reconstruct the electrical conductivity distribution within the body. The radial basis function (RBF) network and simplex method are used in the present approach to estimate the conductivity distribution by minimizing the errors between the 'measured' and model-predicted magnetic flux densities. Computer simulations were conducted in a realistic-geometry head model to test the feasibility of the proposed approach. Single-variable and three-variable simulations were performed to estimate the brain-skull conductivity ratio and the conductivity values of the brain, skull and scalp layers. When SNR = 15 for magnetic flux density measurements with the target skull-to-brain conductivity ratio being 1/15, the relative error (RE) between the target and estimated conductivity was 0.0737 +/- 0.0746 in the single-variable simulations. In the three-variable simulations, the RE was 0.1676 +/- 0.0317. Effects of electrode position uncertainty were also assessed by computer simulations. The present promising results suggest the feasibility of estimating important conductivity values within the head from noninvasive magnetic flux density measurements.
Asymptotic normality of kernel estimates of a density function under association dependence
Institute of Scientific and Technical Information of China (English)
Lin, Zhengyan (林正炎)
2003-01-01
Let {X_n, n ≥ 1} be a strictly stationary sequence of random variables, which are either associated or negatively associated, and let f(·) be their common density. In this paper, the author shows a central limit theorem for a kernel estimate of f(·) under certain regularity conditions.
DEFF Research Database (Denmark)
Rosholm, A; Hyldstrup, L; Backsgaard, L
2002-01-01
A new automated radiogrammetric method to estimate bone mineral density (BMD) from a single radiograph of the hand and forearm is described. Five regions of interest in radius, ulna and the three middle metacarpal bones are identified and approximately 1800 geometrical measurements from these bon...
Crowding and visual acuity measured in adults using paediatric test letters, pictures and symbols.
Lalor, Sarah J H; Formankiewicz, Monika A; Waugh, Sarah J
2016-04-01
Crowding refers to the degradation of visual acuity for target optotypes with, versus without, surrounding features. Crowding is important clinically; however, the effect of target-flanker spacing on acuity for symbols and pictures, compared to letters, has not been investigated. Five adults with corrected-to-normal vision had visual acuity measured for modified single-target versions of the Kay Pictures, Lea Symbols, HOTV and Cambridge Crowding Cards tests. Single optotypes were presented in isolation and with surrounding features placed 0-5 stroke-widths away. Visual acuity measured with Kay Picture optotypes is 0.13-0.19 logMAR better than for other test optotypes and varies significantly across pictures. The magnitude of crowding is strongest when the surrounding features abut, or are placed 1 stroke-width away from, the target optotype. The slope of the psychometric function is steeper in the region just beyond maximum crowding. Crowding is stronger, and the psychometric function steeper, with the Cambridge Crowding Cards arrangement than when any single optotype is surrounded by a box. Estimates of crowding extent are less variable across tests when expressed in units of stroke-width rather than optotype-width. Crowding for single-target presentations of letters, symbols and pictures used in paediatric visual acuity tests can be maximised, and made more sensitive to change in visual acuity, by careful selection of optotype, by surrounding the target with similar flankers, and by using a closer target-flanker separation than half an optotype-width.
Topological Pressure and Coding Sequence Density Estimation in the Human Genome
Koslicki, David
2011-01-01
Inspired by concepts from ergodic theory, we give new insight into coding sequence (CDS) density estimation for the human genome. Our approach is based on the introduction and study of topological pressure: a numerical quantity assigned to any finite sequence based on an appropriate notion of `weighted information content'. For human DNA sequences, each codon is assigned a suitable weight, and using a window size of approximately 60,000 bp, we obtain a very strong positive correlation between CDS density and topological pressure. The weights are selected by an optimization procedure, and can be interpreted as quantitative data on the relative importance of different codons for the density estimation of coding sequences. This gives new insight into codon usage bias, which is an important subject where long-standing questions remain open. Inspired again by ergodic theory, we use the weightings on the codons to define a probability measure on finite sequences. We demonstrate that this measure is effective in disti...
Distributed Density Estimation Based on a Mixture of Factor Analyzers in a Sensor Network
Directory of Open Access Journals (Sweden)
Xin Wei
2015-08-01
Distributed density estimation in sensor networks has received much attention due to its broad applicability. When encountering high-dimensional observations, a mixture of factor analyzers (MFA) is taken to replace a mixture of Gaussians for describing the distributions of observations. In this paper, we study distributed density estimation based on a mixture of factor analyzers. Existing estimation algorithms for the MFA are for the centralized case and are not suitable for distributed processing in sensor networks. We present distributed density estimation algorithms for the MFA and its extension, the mixture of Student's t-factor analyzers (MtFA). We first define an objective function as the linear combination of local log-likelihoods. Then, we give the derivation of the distributed estimation algorithms for the MFA and the MtFA in detail. In these algorithms, the local sufficient statistics (LSS) are calculated first and diffused. Then, each node performs a linear combination of the LSS received from nodes in its neighborhood to obtain the combined sufficient statistics (CSS). Parameters of the MFA and the MtFA can be obtained using the CSS. Finally, we evaluate the performance of these algorithms by numerical simulations and an application example. Experimental results validate the promising performance of the proposed algorithms.
Application of Kernel Density Estimation in Lamb Wave-Based Damage Detection
Directory of Open Access Journals (Sweden)
Long Yu
2012-01-01
The present work concerns the estimation of the probability density function (p.d.f.) of measured data in Lamb wave-based damage detection. Although a number of studies have focused on consensus algorithms for combining the results of individual sensors, the p.d.f. of the measured data, which is the fundamental part of the probability-based method, was still chosen empirically in existing work. Based on an analysis of the noise-induced errors in measured data, it was found that the type of distribution is related to the level of noise. In the case of weak noise, the p.d.f. of measured data can be considered a normal distribution, and empirical methods give satisfactory estimates. However, in the case of strong noise, the p.d.f. is complex and does not belong to any common family of distribution functions; nonparametric methods are therefore needed. As the most popular nonparametric method, kernel density estimation was introduced. In order to demonstrate the performance of kernel density estimation methods, a numerical model was built to generate Lamb wave signals. Three levels of white Gaussian noise were intentionally added to the simulated signals. The estimation results showed that the nonparametric methods outperformed the empirical methods in terms of accuracy.
Importance of tree basic density in biomass estimation and associated uncertainties
DEFF Research Database (Denmark)
Njana, Marco Andrew; Meilby, Henrik; Eid, Tron
2016-01-01
Key message Aboveground and belowground tree basic densities varied between and within the three mangrove species. If appropriately determined and applied, basic density may be useful in estimation of tree biomass. Predictive accuracy of the common (i.e. multi-species) models including aboveground...... of sustainable forest management, conservation and enhancement of carbon stocks (REDD+) initiatives offer an opportunity for sustainable management of forests including mangroves. In carbon accounting for REDD+, it is required that carbon estimates prepared for monitoring reporting and verification schemes...... and examine uncertainties in estimation of tree biomass using indirect methods. Methods This study focused on three dominant mangrove species (Avicennia marina (Forssk.) Vierh, Sonneratia alba J. Smith and Rhizophora mucronata Lam.) in Tanzania. A total of 120 trees were destructively sampled for aboveground...
How odgcrnwi becomes crowding: stimulus-specific learning reduces crowding.
Huckauf, Anke; Nazir, Tatjana A
2007-08-16
Processes underlying crowding in visual letter recognition were examined by investigating effects of training. Experiment 1 revealed that training reduces crowding mainly for trained strings. This was corroborated in Experiment 2, where no training effects were obvious after 3 days of training when strings changed from trial to trial. Experiment 3 specified that after a short amount of training, learning effects remained specific to trained strings and also to the trained retinal eccentricity and the interletter spacing used in training. Transfer to other than trained conditions was observed only after further training. Experiment 4 showed that transfer occurred earlier when words were used as stimuli. These results thus demonstrate that part of crowding results from the absence of higher level representations of the stimulus. Such representations can be acquired through learning visual properties of the stimulus.
Estimation of dislocation density from precession electron diffraction data using the Nye tensor.
Leff, A C; Weinberger, C R; Taheri, M L
2015-06-01
The Nye tensor offers a means to estimate the geometrically necessary dislocation density of a crystalline sample based on measurements of the orientation changes within individual crystal grains. In this paper, the Nye tensor theory is applied to precession electron diffraction automated crystallographic orientation mapping (PED-ACOM) data acquired using a transmission electron microscope (TEM). The resulting dislocation density values are mapped in order to visualize the dislocation structures present in a quantitative manner. These density maps are compared with other related methods of approximating local strain dependencies in dislocation-based microstructural transitions from orientation data. The effect of acquisition parameters on density measurements is examined. By decreasing the step size and spot size during data acquisition, an increasing fraction of the dislocation content becomes accessible. Finally, the method described herein is applied to the measurement of dislocation emission during in situ annealing of Cu in TEM in order to demonstrate the utility of the technique for characterizing microstructural dynamics.
Quantifying error distributions in crowding.
Hanus, Deborah; Vul, Edward
2013-03-22
When multiple objects are in close proximity, observers have difficulty identifying them individually. Two classes of theories aim to account for this crowding phenomenon: spatial pooling and spatial substitution. Variations of these accounts predict different patterns of errors in crowded displays. Here we aim to characterize the kinds of errors that people make during crowding by comparing a number of error models across three experiments in which we manipulate flanker spacing, display eccentricity, and precueing duration. We find that both spatial intrusions and individual letter confusions play a considerable role in errors. Moreover, we find no evidence that a naïve pooling model that predicts errors based on a nonadditive combination of target and flankers explains errors better than an independent intrusion model (indeed, in our data, an independent intrusion model is slightly, but significantly, better). Finally, we find that manipulating trial difficulty in any way (spacing, eccentricity, or precueing) produces homogeneous changes in error distributions. Together, these results provide quantitative baselines for predictive models of crowding errors, suggest that pooling and spatial substitution models are difficult to tease apart, and imply that manipulations of crowding all influence a common mechanism that impacts subject performance.
Mammographic density and estimation of breast cancer risk in intermediate risk population.
Tesic, Vanja; Kolaric, Branko; Znaor, Ariana; Kuna, Sanja Kusacic; Brkljacic, Boris
2013-01-01
It is not clear to what extent mammographic density represents a risk factor for breast cancer among women at moderate risk for the disease. We conducted a population-based study to estimate the independent effect of breast density on breast cancer risk and to evaluate the potential of breast density as a marker of risk in an intermediate-risk population. From November 2006 to April 2009, data that included American College of Radiology Breast Imaging Reporting and Data System (BI-RADS) breast density categories and risk information were collected on 52,752 women aged 50-69 years without previously diagnosed breast cancer who underwent screening mammography examination. A total of 257 screen-detected breast cancers were identified. Logistic regression was used to assess the effect of breast density on breast carcinoma risk and to control for other risk factors. The risk increased with density, and the odds ratio for breast cancer among women with dense breasts (heterogeneously and extremely dense) was 1.9 (95% confidence interval, 1.3-2.8) compared with women with almost entirely fatty breasts, after adjustment for age, body mass index, age at menarche, age at menopause, age at first childbirth, number of live births, use of oral contraceptives, family history of breast cancer, prior breast procedures, and hormone replacement therapy use, all of which were significantly related to breast density; density decreased with number of live births. Our finding that mammographic density is an independent risk factor for breast cancer indicates the importance of breast density measurements for breast cancer risk assessment also in moderate-risk populations. © 2012 Wiley Periodicals, Inc.
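The adjusted odds ratio reported above comes from a fitted logistic regression; the unadjusted version of the same quantity can be computed directly from a 2x2 table, as this sketch shows (the counts in the example are made up, not the study's data):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with a 95% Wald confidence interval.

    a = exposed cases,   b = exposed controls
    c = unexposed cases, d = unexposed controls
    Equals exp(beta) for the exposure coefficient of an
    unadjusted logistic regression.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 20/80 cases/controls with dense breasts,
# 10/90 with fatty breasts
or_, lo, hi = odds_ratio(20, 80, 10, 90)
```

Adjusting for covariates, as the study does, requires fitting the full multivariate model rather than this table-based shortcut.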
Estimation of energy density of Li-S batteries with liquid and solid electrolytes
Li, Chunmei; Zhang, Heng; Otaegui, Laida; Singh, Gurpreet; Armand, Michel; Rodriguez-Martinez, Lide M.
2016-09-01
With the exponential growth of technology in mobile devices and the rapid expansion of electric vehicles into the market, it appears that the energy density of state-of-the-art Li-ion batteries (LIBs) cannot satisfy practical requirements. Sulfur has been one of the best cathode material choices due to its high charge storage (1675 mAh g⁻¹), natural abundance and easy accessibility. In this paper, calculations are performed for different cell design parameters, such as the active material loading, the amount/thickness of electrolyte and the sulfur utilization, to predict the energy density of Li-S cells based on liquid, polymeric and ceramic electrolytes. It is demonstrated that, with current technology, the Li-S battery is most likely to be competitive with LIBs in gravimetric energy density, but not in volumetric energy density. Furthermore, the cells with polymer and thin ceramic electrolytes show promising potential in terms of high gravimetric energy density, especially the cells with the polymer electrolyte. This estimation study of Li-S energy density can be used as guidance for controlling the key design parameters in order to obtain the desired energy density at cell level.
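The kind of cell-level calculation the paper performs can be illustrated as follows; the formula is the standard capacity x voltage / mass bookkeeping, and all parameter values in the example are hypothetical, not the paper's:

```python
def gravimetric_energy_density(sulfur_mass_g, utilization, avg_voltage_V,
                               inactive_mass_g, capacity_mAh_g=1675.0):
    """Cell-level gravimetric energy density in Wh/kg.

    capacity_mAh_g:  theoretical sulfur capacity (1675 mAh/g)
    utilization:     fraction of that capacity actually delivered
    inactive_mass_g: everything else in the cell (electrolyte,
                     lithium anode, current collectors, casing, ...)
    """
    energy_Wh = capacity_mAh_g * utilization * sulfur_mass_g * avg_voltage_V / 1000.0
    total_mass_kg = (sulfur_mass_g + inactive_mass_g) / 1000.0
    return energy_Wh / total_mass_kg

# Hypothetical cell: 1 g sulfur at 70% utilization, 2.1 V average
# discharge voltage, 3 g of inactive components
e = gravimetric_energy_density(1.0, 0.70, 2.1, 3.0)
```

The inactive-mass term is where the electrolyte amount/thickness studied in the paper enters: thick liquid-electrolyte reservoirs inflate it, thin polymer or ceramic layers shrink it.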
A method to estimate the neutral atmospheric density near the ionospheric main peak of Mars
Zou, Hong; Ye, Yu Guang; Wang, Jin Song; Nielsen, Erling; Cui, Jun; Wang, Xiao Dong
2016-04-01
A method to estimate the neutral atmospheric density near the ionospheric main peak of Mars is introduced in this study. The neutral densities at 130 km can be derived from the ionospheric and atmospheric measurements of the Radio Science experiment on board Mars Global Surveyor (MGS). The derived neutral densities cover a large longitude range in northern high latitudes from summer to late autumn during 3 Martian years, which fills the gap of the previous observations for the upper atmosphere of Mars. The simulations of the Laboratoire de Météorologie Dynamique Mars global circulation model can be corrected with a simple linear equation to fit the neutral densities derived from the first MGS/RS (Radio Science) data sets (EDS1). The corrected simulations with the same correction parameters as for EDS1 match the derived neutral densities from two other MGS/RS data sets (EDS2 and EDS3) very well. The derived neutral density from EDS3 shows a dust storm effect, which is in accord with the Mars Express (MEX) Spectroscopy for Investigation of Characteristics of the Atmosphere of Mars measurement. The neutral density derived from the MGS/RS measurements can be used to validate the Martian atmospheric models. The method presented in this study can be applied to other radio occultation measurements, such as the result of the Radio Science experiment on board MEX.
Ahn, Chul Kyun; Heo, Changyong; Jin, Heongmin; Kim, Jong Hyo
2017-03-01
Mammographic breast density is a well-established marker for breast cancer risk. However, accurate measurement of dense tissue is a difficult task due to faint contrast and significant variations in background fatty tissue. This study presents a novel method for automated mammographic density estimation based on a convolutional neural network (CNN). A total of 397 full-field digital mammograms were selected from Seoul National University Hospital. Among them, 297 mammograms were randomly selected as a training set and the remaining 100 mammograms were used as a test set. We designed a CNN architecture suitable to learn the imaging characteristics from a multitude of sub-images and classify them into dense and fatty tissues. To train the CNN, not only local statistics but also global statistics extracted from the image set were used. The image set was composed of the original mammogram and an eigen-image, which was able to capture the X-ray characteristics, notwithstanding the fact that CNNs are well known to effectively extract features from the original image. The 100 test images that were not used in training the CNN were used to validate the performance. The correlation coefficient between the breast density estimates by the CNN and those by the expert's manual measurement was 0.96. Our study demonstrated the feasibility of incorporating deep learning technology into radiology practice, especially for breast density estimation. The proposed method has the potential to be used as an automated and quantitative assessment tool for mammographic breast density in routine practice.
Theoretical mechanics: Crowd synchrony on the Millennium Bridge
Strogatz, Steven H.; Abrams, Daniel M.; McRobie, Allan; Eckhardt, Bruno; Ott, Edward
2005-11-01
Soon after the crowd streamed on to London's Millennium Bridge on the day it opened, the bridge started to sway from side to side: many pedestrians fell spontaneously into step with the bridge's vibrations, inadvertently amplifying them. Here we model this unexpected and now notorious phenomenon - which was not due to the bridge's innovative design as was first thought - by adapting ideas originally developed to describe the collective synchronization of biological oscillators such as neurons and fireflies. Our approach should help engineers to estimate the damping needed to stabilize other exceptionally crowded footbridges against synchronous lateral excitation by pedestrians.
Is Malnutrition Associated with Crowding in Permanent Dentition?
Directory of Open Access Journals (Sweden)
Erika B. A. F. Thomaz
2010-09-01
Evidence suggests that energy-protein malnutrition is associated with impaired growth and development of the facial bones. The objective of this study was to investigate the association between nutritional status and reduced space for dental eruption (crowding) in permanent dentition. A cross-sectional study with a probabilistic sampling design was used. We evaluated 2,060 students aged 12 to 15 years enrolled in schools in the northeast of Brazil. Crowding was defined according to the World Health Organization (WHO) as misalignment of teeth due to lack of space for them to erupt in the correct position. Nutritional status was evaluated by means of body mass index and height-for-age, using the WHO's reference curves. Parents and adolescents responded to a questionnaire about demographic, socioeconomic, biological and behavioral characteristics. The associations were estimated by odds ratios (OR) in multivariate logistic regression analysis (alpha = 0.05). Confounding and effect modification were taken into account. An association between low height-for-age (z-score < −1 SD) and crowding was only observed in adolescents with a prolonged history of mouth breathing (OR = 3.1). No association was observed between underweight and crowding. Malnutrition is related to crowding in permanent dentition among mouth-breathing adolescents. Policy actions aimed at reducing low height-for-age and unhealthy oral habits are strongly recommended. However, further studies are needed to increase the consistency of these findings and improve understanding of the subject.
Joint estimation of crown of thorns (Acanthaster planci) densities on the Great Barrier Reef
Directory of Open Access Journals (Sweden)
M. Aaron MacNeil
2016-08-01
Crown-of-thorns starfish (CoTS; Acanthaster spp.) are an outbreaking pest among many Indo-Pacific coral reefs that cause substantial ecological and economic damage. Despite ongoing CoTS research, there remain critical gaps in observing CoTS populations and accurately estimating their numbers, greatly limiting understanding of the causes and sources of CoTS outbreaks. Here we address two of these gaps by (1) estimating the detectability of adult CoTS on typical underwater visual count (UVC) surveys using covariates and (2) inter-calibrating multiple data sources to estimate CoTS densities within the Cairns sector of the Great Barrier Reef (GBR). We find that, on average, CoTS detectability is high at 0.82 [0.77, 0.87] (median highest posterior density (HPD) and [95% uncertainty intervals]), with CoTS disc width having the greatest influence on detection. Integrating this information with coincident surveys from alternative sampling programs, we estimate CoTS densities in the Cairns sector of the GBR averaged 44 [41, 48] adults per hectare in 2014.
Energy Technology Data Exchange (ETDEWEB)
Burke, TImothy P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Martin, William R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-11-19
Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed source shielding applications. However, little work was done to obtain reaction rates using KDEs. This paper introduces a new form of the MFP KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad-hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.
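A one-dimensional collision-density version of the idea, in which each Monte Carlo event scores at every tally point through a kernel instead of incrementing a single histogram bin, might look like the following. This is illustrative only; the paper's mean-free-path (MFP) kernels and transport physics are considerably more involved:

```python
import numpy as np

def kde_tally(event_x, grid_x, h):
    """Kernel density tally: every event contributes to every tally
    point within half-width h via an Epanechnikov kernel, instead of
    incrementing one histogram bin."""
    u = (grid_x[:, None] - event_x[None, :]) / h
    k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
    return k.sum(axis=1) / (len(event_x) * h)

rng = np.random.default_rng(1)
events = rng.uniform(0.0, 1.0, 2000)      # stand-in for collision sites
grid = np.linspace(-0.5, 1.5, 401)        # tally points
density = kde_tally(events, grid, h=0.1)
```

Because every event scores at many points, the statistical uncertainty at a tally point no longer depends on how finely the grid is resolved, which is the advantage over histogram tallies noted above.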
Lennox, Kristin P; Dahl, David B; Vannucci, Marina; Tsai, Jerry W
2009-06-01
Interest in predicting protein backbone conformational angles has prompted the development of modeling and inference procedures for bivariate angular distributions. We present a Bayesian approach to density estimation for bivariate angular data that uses a Dirichlet process mixture model and a bivariate von Mises distribution. We derive the necessary full conditional distributions to fit the model, as well as the details for sampling from the posterior predictive distribution. We show how our density estimation method makes it possible to improve current approaches for protein structure prediction by comparing the performance of the so-called "whole" and "half" position distributions. Current methods in the field are based on whole position distributions, as density estimation for the half positions requires techniques, such as ours, that can provide good estimates for small datasets. With our method we are able to demonstrate that half position data provides a better approximation for the distribution of conformational angles at a given sequence position, therefore providing increased efficiency and accuracy in structure prediction.
Density estimation in a wolverine population using spatial capture-recapture models
Royle, J. Andrew; Magoun, Audrey J.; Gardner, Beth; Valkenbury, Patrick; Lowell, Richard E.; McKelvey, Kevin
2011-01-01
Classical closed-population capture-recapture models do not accommodate the spatial information inherent in encounter history data obtained from camera-trapping studies. As a result, individual heterogeneity in encounter probability is induced, and it is not possible to estimate density objectively because trap arrays do not have a well-defined sample area. We applied newly developed capture-recapture models that accommodate the spatial attribute inherent in capture-recapture data to a population of wolverines (Gulo gulo) in Southeast Alaska in 2008. We used camera-trapping data collected from 37 cameras in a 2,140-km² area of forested and open habitats largely enclosed by ocean and glacial icefields. We detected 21 unique individuals 115 times. Wolverines exhibited a strong positive trap response, with an increased tendency to revisit previously visited traps. Under the trap-response model, we estimated wolverine density at 9.7 individuals/1,000 km² (95% Bayesian CI: 5.9-15.0). Our model provides a formal statistical framework for estimating density from wolverine camera-trapping studies that accounts for a behavioral response due to baited traps. Further, our model-based estimator does not have strict requirements about the spatial configuration of traps or length of trapping sessions, providing considerable operational flexibility in the development of field studies.
Joint estimation of crown of thorns (Acanthaster planci) densities on the Great Barrier Reef.
MacNeil, M Aaron; Mellin, Camille; Pratchett, Morgan S; Hoey, Jessica; Anthony, Kenneth R N; Cheal, Alistair J; Miller, Ian; Sweatman, Hugh; Cowan, Zara L; Taylor, Sascha; Moon, Steven; Fonnesbeck, Chris J
2016-01-01
Crown-of-thorns starfish (CoTS; Acanthaster spp.) are an outbreaking pest among many Indo-Pacific coral reefs that cause substantial ecological and economic damage. Despite ongoing CoTS research, there remain critical gaps in observing CoTS populations and accurately estimating their numbers, greatly limiting understanding of the causes and sources of CoTS outbreaks. Here we address two of these gaps by (1) estimating the detectability of adult CoTS on typical underwater visual count (UVC) surveys using covariates and (2) inter-calibrating multiple data sources to estimate CoTS densities within the Cairns sector of the Great Barrier Reef (GBR). We find that, on average, CoTS detectability is high at 0.82 [0.77, 0.87] (median highest posterior density (HPD) and [95% uncertainty intervals]), with CoTS disc width having the greatest influence on detection. Integrating this information with coincident surveys from alternative sampling programs, we estimate CoTS densities in the Cairns sector of the GBR averaged 44 [41, 48] adults per hectare in 2014.
Limit Distribution Theory for Maximum Likelihood Estimation of a Log-Concave Density.
Balabdaoui, Fadoua; Rufibach, Kaspar; Wellner, Jon A
2009-06-01
We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, i.e. a density of the form f_0 = exp(φ_0), where φ_0 is a concave function on R. Existence, form, characterizations and uniform rates of convergence of the MLE are given by Rufibach (2006) and Dümbgen and Rufibach (2007). The characterization of the log-concave MLE in terms of distribution functions is the same (up to sign) as the characterization of the least squares estimator of a convex density on [0, ∞) as studied by Groeneboom, Jongbloed and Wellner (2001b). We use this connection to show that the limiting distributions of the MLE and its derivative are, under comparable smoothness assumptions, the same (up to sign) as in the convex density estimation problem. In particular, changing the smoothness assumptions of Groeneboom, Jongbloed and Wellner (2001b) slightly by allowing some higher derivatives to vanish at the point of interest, we find that the pointwise limiting distributions depend on the second and third derivatives at 0 of H_k, the "lower invelope" of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of φ_0 = log f_0 at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode M(f_0) and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values.
Eskelson, Bianca N.I.; Hagar, Joan; Temesgen, Hailemariam
2012-01-01
Snags (standing dead trees) are an essential structural component of forests. Because wildlife use of snags depends on size and decay stage, snag density estimation without any information about snag quality attributes is of little value for wildlife management decision makers. Little work has been done to develop models that allow multivariate estimation of snag density by snag quality class. Using climate, topography, Landsat TM data, stand age and forest type collected for 2356 forested Forest Inventory and Analysis plots in western Washington and western Oregon, we evaluated two multivariate techniques for their abilities to estimate density of snags by three decay classes. The density of live trees and snags in three decay classes (D1: recently dead, little decay; D2: decay, without top, some branches and bark missing; D3: extensive decay, missing bark and most branches) with diameter at breast height (DBH) ≥ 12.7 cm was estimated using a nonparametric random forest nearest neighbor imputation technique (RF) and a parametric two-stage model (QPORD), for which the number of trees per hectare was estimated with a Quasipoisson model in the first stage and the probability of belonging to a tree status class (live, D1, D2, D3) was estimated with an ordinal regression model in the second stage. The presence of large snags with DBH ≥ 50 cm was predicted using a logistic regression and RF imputation. Because of the more homogeneous conditions on private forest lands, snag density by decay class was predicted with higher accuracies on private forest lands than on public lands, while presence of large snags was more accurately predicted on public lands, owing to the higher prevalence of large snags on public lands. RF outperformed the QPORD model in terms of percent accurate predictions, while QPORD provided smaller root mean square errors in predicting snag density by decay class. The logistic regression model achieved more accurate presence/absence classification.
Efficient Estimation of Dynamic Density Functions with Applications in Streaming Data
Qahtan, Abdulhakim
2016-05-11
Recent advances in computing technology allow for collecting vast amounts of data that arrive continuously in the form of streams. Mining data streams is challenged by the speed and volume of the arriving data. Furthermore, the underlying distribution of the data changes over time in unpredictable ways. To reduce the computational cost, data streams are often studied through a condensed representation, e.g., a probability density function (PDF). This thesis develops an online density estimator, called KDE-Track, for characterizing the dynamic density of data streams. KDE-Track estimates the PDF of the stream at a set of resampling points and uses interpolation to estimate the density at any given point. To reduce the interpolation error and computational complexity, we introduce adaptive resampling, where more/fewer resampling points are used in high/low curvature regions of the PDF. The PDF values at the resampling points are updated online to provide an up-to-date model of the data stream. Compared with other existing online density estimators, KDE-Track is often more accurate (smaller error values) and more computationally efficient (shorter running time). The anytime-available PDF estimated by KDE-Track can be applied to visualizing the dynamic density of data streams, outlier detection and change detection in data streams. The first application in this thesis is visualizing taxi traffic volume in New York City: KDE-Track allows the traffic flow to be visualized and monitored in real time without extra overhead and provides insight into pick-up demand that service providers can use to improve service availability. The second application is detecting outliers in data streams from sensor networks based on the estimated PDF; the method detects outliers accurately and outperforms baseline methods designed for detecting and cleaning outliers in sensor data.
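A minimal version of the resampling-point idea can be sketched as follows. The names are illustrative, the resampling points are fixed rather than adaptive, and there is no decay mechanism for drifting streams, so this falls well short of KDE-Track itself; it only shows how PDF values at a point set can be updated per sample and interpolated elsewhere.

```python
import numpy as np

class StreamingKDE:
    """Keep PDF values at fixed resampling points; interpolate elsewhere."""

    def __init__(self, points, bandwidth):
        self.points = np.asarray(points, float)
        self.h = float(bandwidth)
        self.pdf = np.zeros_like(self.points)
        self.n = 0

    def update(self, x):
        """Fold one arriving stream sample into the running estimate."""
        u = (self.points - x) / self.h
        k = np.exp(-0.5 * u**2) / (np.sqrt(2.0 * np.pi) * self.h)
        self.n += 1
        self.pdf += (k - self.pdf) / self.n       # running mean of kernels

    def density(self, x):
        """Linear interpolation between the resampling points."""
        return np.interp(x, self.points, self.pdf)

rng = np.random.default_rng(1)
est = StreamingKDE(np.linspace(-4.0, 4.0, 41), bandwidth=0.4)
for sample in rng.normal(size=5000):
    est.update(sample)
print(round(est.density(0.0), 3))                 # near the N(0,1) peak
```

Each update costs O(number of resampling points) regardless of how many samples have been seen, which is the property that makes the approach viable for streams.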
Recursive Density Estimation of NA Samples
Institute of Scientific and Technical Information of China (English)
张冬霞; 梁汉营
2008-01-01
Let {X_n, n ≥ 1} be a strictly stationary sequence of negatively associated (NA) random variables with marginal probability density function f(x). In this paper, we discuss the pointwise asymptotic normality of the recursive kernel density estimator of f(x).
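The recursive estimator under discussion can be written down directly. The sketch below uses a Gaussian kernel and a polynomially shrinking bandwidth (both illustrative choices) and updates the estimate one observation at a time, which is what makes it "recursive": no earlier data need be revisited.

```python
import numpy as np

def recursive_kde(samples, x, h0=1.0, gamma=0.2):
    """Recursive (Wolverton-Wagner style) kernel density estimate at x:

        f_n(x) = (1 - 1/n) f_{n-1}(x) + (1/n) K((x - X_n)/h_n) / h_n,

    with bandwidth h_n = h0 * n**(-gamma), so each new observation
    updates the running estimate in O(1) work.
    """
    f = 0.0
    for n, xn in enumerate(samples, start=1):
        hn = h0 * n ** (-gamma)
        k = np.exp(-0.5 * ((x - xn) / hn) ** 2) / np.sqrt(2.0 * np.pi)
        f += (k / hn - f) / n                     # one-step recursive update
    return f

rng = np.random.default_rng(2)
data = rng.normal(size=20_000)                    # stationary stand-in sample
print(round(recursive_kde(data, 0.0), 3))         # near the N(0,1) peak
```

The abstract's contribution concerns the asymptotic normality of this estimator when the X_n are negatively associated rather than independent; the recursion itself is unchanged.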
Institute of Scientific and Technical Information of China (English)
XUE Yun-feng; WANG Yu-jia; YANG Jie
2009-01-01
A new algorithm for linear instantaneous independent component analysis is proposed, based on maximizing a log-likelihood contrast function that can be transformed into a gradient equation. An iterative method is introduced to solve this equation efficiently. The unknown probability density functions, as well as their first and second derivatives in the gradient equation, are estimated by kernel density estimation. Computer simulations on artificially generated signals and gray-scale natural scene images confirm the efficiency and accuracy of the proposed algorithm.
Normative Mediation of Reactions to Crowding
Karlin, Robert A.; And Others
1976-01-01
This study manipulated norms governing interaction levels in crowded groups of women. Results indicated norms influenced reactions to crowding as predicted. Women reacted most positively when interaction levels were high and most negatively when interaction levels were low. (Author)
Institute of Scientific and Technical Information of China (English)
LI Ning; SHI Tielin
2007-01-01
Blind source separation and estimation of the number of sources usually require that the number of sensors be greater than or equal to the number of sources, a condition that is difficult to satisfy for complex systems. A new estimation method based on the power spectral density (PSD) is presented. When the relation between the number of sensors and the number of sources is unknown, a PSD matrix is first obtained from the ratios of the PSDs of the observation signals, and then a bound on the number of correlated sources with common frequencies can be estimated by comparing the column vectors of the PSD matrix. The effectiveness of the proposed method is verified by theoretical analysis and experiments, and the influence of noise on the estimation of the number of sources is simulated.
Cavuoti, Stefano; Brescia, Massimo; Vellucci, Civita; Tortora, Crescenzo; Longo, Giuseppe
2016-01-01
A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z's). A plethora of methods have been developed, based either on template-model fitting or on empirical explorations of the photometric parameter space. Machine-learning-based techniques are not explicitly dependent on physical priors and are able to produce accurate photo-z estimates within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z probability density function (PDF), because the analytical relation mapping the photometric parameters onto the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use...
Lombardo, Marco; Serrao, Sebastiano; Lombardo, Giuseppe
2014-01-01
To investigate the influence of various technical factors on the variation of cone packing density estimates in adaptive optics flood illuminated retinal images. Adaptive optics images of the photoreceptor mosaic were obtained in fifteen healthy subjects. The cone density and Voronoi diagrams were assessed in sampling windows of 320×320 µm, 160×160 µm and 64×64 µm at 1.5 degree temporal and superior eccentricity from the preferred locus of fixation (PRL). The technical factors that have been analyzed included the sampling window size, the corrected retinal magnification factor (RMFcorr), the conversion from radial to linear distance from the PRL, the displacement between the PRL and foveal center and the manual checking of cone identification algorithm. Bland-Altman analysis was used to assess the agreement between cone density estimated within the different sampling window conditions. The cone density declined with decreasing sampling area and data between areas of different size showed low agreement. A high agreement was found between sampling areas of the same size when comparing density calculated with or without using individual RMFcorr. The agreement between cone density measured at radial and linear distances from the PRL and between data referred to the PRL or the foveal center was moderate. The percentage of Voronoi tiles with hexagonal packing arrangement was comparable between sampling areas of different size. The boundary effect, presence of any retinal vessels, and the manual selection of cones missed by the automated identification algorithm were identified as the factors influencing variation of cone packing arrangements in Voronoi diagrams. The sampling window size is the main technical factor that influences variation of cone density. Clear identification of each cone in the image and the use of a large buffer zone are necessary to minimize factors influencing variation of Voronoi diagrams of the cone mosaic.
The Walking Behaviour of Pedestrian Social Groups and Its Impact on Crowd Dynamics
Moussaïd, Mehdi; Perozo, Niriaska; Garnier, Simon; Helbing, Dirk; Theraulaz, Guy
2010-01-01
Human crowd motion is mainly driven by self-organized processes based on local interactions among pedestrians. While most studies of crowd behaviour consider only interactions among isolated individuals, it turns out that up to 70% of people in a crowd are actually moving in groups, such as friends, couples, or families walking together. These groups constitute medium-scale aggregated structures and their impact on crowd dynamics is still largely unknown. In this work, we analyze the motion of approximately 1500 pedestrian groups under natural condition, and show that social interactions among group members generate typical group walking patterns that influence crowd dynamics. At low density, group members tend to walk side by side, forming a line perpendicular to the walking direction. As the density increases, however, the linear walking formation is bent forward, turning it into a V-like pattern. These spatial patterns can be well described by a model based on social communication between group members. We show that the V-like walking pattern facilitates social interactions within the group, but reduces the flow because of its “non-aerodynamic” shape. Therefore, when crowd density increases, the group organization results from a trade-off between walking faster and facilitating social exchange. These insights demonstrate that crowd dynamics is not only determined by physical constraints induced by other pedestrians and the environment, but also significantly by communicative, social interactions among individuals. PMID:20383280
Research on Crowd Movement and Density Estimation Techniques
Institute of Scientific and Technical Information of China (English)
王尔丹
2005-01-01
Crowd density and motion estimation are crucial for crowd safety and building design. This paper studies intelligent, automatic crowd estimation methods based on image and video processing. For crowd density estimation, a method based on pixel statistics is used for low-density crowd images, while a texture analysis method based on multi-scale analysis and fractals is used for higher-density crowd images, and a support vector machine classifies the crowd density level. For crowd motion estimation, block matching is used to estimate crowd velocity. Experiments on crowd images and video show that the proposed methods are more accurate and effective than previous ones.
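As a toy illustration of the density-level classification step, the sketch below feeds a multi-scale texture statistic (log gradient energy, a stand-in for the multi-scale fractal features described above) to a small linear SVM trained by sub-gradient descent on the hinge loss. The patches are synthetic noise, with stronger fine texture standing in for denser crowds; everything here is illustrative, not the paper's pipeline.

```python
import numpy as np

def features(patch, scales=(1, 2, 4)):
    """Log gradient energy at several subsampling scales (illustrative)."""
    out = []
    for s in scales:
        small = patch[::s, ::s].astype(float)
        gx, gy = np.gradient(small)
        out.append(np.log(np.mean(gx**2 + gy**2) + 1e-9))
    return out

def train_linear_svm(X, y, lam=0.01, epochs=300, lr=0.1):
    """Linear SVM via sub-gradient descent on the regularized hinge loss."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)                      # labels in {-1, +1}
    w, b, n = np.zeros(X.shape[1]), 0.0, len(y)
    for _ in range(epochs):
        viol = y * (X @ w + b) < 1                # margin violators
        w -= lr * (lam * w - (y[viol, None] * X[viol]).sum(0) / n)
        b -= lr * (-y[viol].sum() / n)
    return w, b

rng = np.random.default_rng(6)
def patch(dense):                                 # denser crowd -> finer texture
    return rng.normal(0.0, 1.0 if dense else 0.2, size=(32, 32))

X = [features(patch(lbl)) for lbl in [0] * 100 + [1] * 100]
y = [-1] * 100 + [1] * 100
w, b = train_linear_svm(X, y)
test = np.array([features(patch(1)) for _ in range(20)])
print((test @ w + b > 0).sum())                   # most patches labeled dense
```

A real system would replace the synthetic patches with image windows and the gradient-energy statistic with the fractal/multi-scale features, but the SVM decision step is the same.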
Somershoe, S.G.; Twedt, D.J.; Reid, B.
2006-01-01
We combined Breeding Bird Survey point count protocol and distance sampling to survey spring migrant and breeding birds in Vicksburg National Military Park on 33 days between March and June of 2003 and 2004. For 26 of 106 detected species, we used program DISTANCE to estimate detection probabilities and densities from 660 3-min point counts in which detections were recorded within four distance annuli. For most species, estimates of detection probability, and thereby density estimates, were improved through incorporation of the proportion of forest cover at point count locations as a covariate. Our results suggest Breeding Bird Surveys would benefit from the use of distance sampling and a quantitative characterization of habitat at point count locations. During spring migration, we estimated that the most common migrant species accounted for a population of 5000-9000 birds in Vicksburg National Military Park (636 ha). Species with average populations of 300 individuals during migration were: Blue-gray Gnatcatcher (Polioptila caerulea), Cedar Waxwing (Bombycilla cedrorum), White-eyed Vireo (Vireo griseus), Indigo Bunting (Passerina cyanea), and Ruby-crowned Kinglet (Regulus calendula). Of 56 species that bred in Vicksburg National Military Park, we estimated that the most common 18 species accounted for 8150 individuals. The six most abundant breeding species, Blue-gray Gnatcatcher, White-eyed Vireo, Summer Tanager (Piranga rubra), Northern Cardinal (Cardinalis cardinalis), Carolina Wren (Thryothorus ludovicianus), and Brown-headed Cowbird (Molothrus ater), accounted for 5800 individuals.
Nearest neighbor density ratio estimation for large-scale applications in astronomy
Kremer, J.; Gieseke, F.; Steenstrup Pedersen, K.; Igel, C.
2015-09-01
In astronomical applications of machine learning, the distribution of objects used for building a model is often different from the distribution of the objects the model is later applied to. This is known as sample selection bias, which is a major challenge for statistical inference as one can no longer assume that the labeled training data are representative. To address this issue, one can re-weight the labeled training patterns to match the distribution of unlabeled data that are available already in the training phase. There are many examples in practice where this strategy yielded good results, but estimating the weights reliably from a finite sample is challenging. We consider an efficient nearest neighbor density ratio estimator that can exploit large samples to increase the accuracy of the weight estimates. To solve the problem of choosing the right neighborhood size, we propose to use cross-validation on a model selection criterion that is unbiased under covariate shift. The resulting algorithm is our method of choice for density ratio estimation when the feature space dimensionality is small and sample sizes are large. The approach is simple and, because of the model selection, robust. We empirically find that it is on a par with established kernel-based methods on relatively small regression benchmark datasets. However, when applied to large-scale photometric redshift estimation, our approach outperforms the state-of-the-art.
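One simple variant of a nearest-neighbour density ratio estimator can be sketched as follows; this shows the general counting idea, not the paper's exact estimator or its cross-validated choice of neighbourhood size. The neighbourhood at each training point is fixed by its k-th nearest training neighbour, and the weight is the normalised count of test points inside it.

```python
import numpy as np

def nn_density_ratio(x_train, x_test, k=20):
    """Weights estimating p_test(x)/p_train(x) at each training point:
    take the radius to the k-th nearest training neighbour, count test
    points within that radius, and normalise by the sample sizes."""
    def pdist(a, b):
        return np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))

    d_train = pdist(x_train, x_train)
    radius = np.sort(d_train, axis=1)[:, k]       # column 0 is the self-distance
    counts = (pdist(x_train, x_test) <= radius[:, None]).sum(axis=1)
    return (counts / k) * (len(x_train) / len(x_test))

rng = np.random.default_rng(3)
x_train = rng.normal(0.0, 1.0, size=(2000, 1))    # biased labeled sample
x_test = rng.normal(0.5, 1.0, size=(2000, 1))     # target distribution
w = nn_density_ratio(x_train, x_test)
# training points where test data is denser should get larger weights
print(w[x_train[:, 0] > 1].mean() > w[x_train[:, 0] < -1].mean())
```

Training under covariate shift then weights each labeled example by w; the paper's contribution includes choosing k by cross-validation with a model selection criterion that remains unbiased under the shift.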
Pedotransfer functions for Irish soils - estimation of bulk density (ρb) per horizon type
Reidy, B.; Simo, I.; Sills, P.; Creamer, R. E.
2016-01-01
Soil bulk density is a key property in defining soil characteristics. It describes the packing structure of the soil and is also essential for the measurement of soil carbon stock and for nutrient assessment. In many older surveys this property was neglected, and in many modern surveys it is omitted because of laboratory and labour costs, or because the core method cannot be applied. To overcome these gaps, pedotransfer functions are applied that estimate bulk density from other known soil properties. Pedotransfer functions have been derived from large international data sets across many studies, each with its own inherent biases, and many ignore horizonation and variation with depth. Initially, pedotransfer functions from the literature were used to predict bulk densities for different horizon types against local known bulk density data sets. The best-performing functions were then recalibrated and validated again using the known data. The coefficient of determination of the predictions was 0.5 or greater in 12 of the 17 horizon types studied. These new equations allowed gap filling where bulk density data were missing from part or all of a soil profile. This in turn allowed the development of an indicative soil bulk density map for Ireland at 0-30 and 30-50 cm horizon depths. In general, the horizons with the largest known data sets had the best predictions using the recalibrated and validated pedotransfer functions.
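The recalibration step amounts to refitting a published functional form to local data. The sketch below assumes a simple log-linear form in organic carbon (an illustrative choice, not the paper's equations) and refits it per horizon by ordinary least squares.

```python
import numpy as np

def recalibrate(oc_percent, bulk_density):
    """Refit rho_b = a + b * ln(OC%) to a local data set; return the
    coefficients and the coefficient of determination (R^2)."""
    X = np.column_stack([np.ones_like(oc_percent), np.log(oc_percent)])
    coeffs, *_ = np.linalg.lstsq(X, bulk_density, rcond=None)
    resid = bulk_density - X @ coeffs
    r2 = 1.0 - (resid**2).sum() / ((bulk_density - bulk_density.mean())**2).sum()
    return coeffs, r2

# synthetic "local" horizon data: bulk density falls with organic carbon
rng = np.random.default_rng(4)
oc = rng.uniform(0.5, 8.0, size=200)                        # organic carbon, %
rho = 1.6 - 0.25 * np.log(oc) + rng.normal(0, 0.05, 200)    # g/cm^3
(a, b), r2 = recalibrate(oc, rho)
print(round(a, 2), round(b, 2))
```

Repeating such a fit per horizon type, then keeping only functions whose validation R² clears a threshold, is the shape of the gap-filling workflow the abstract describes.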
Haben, Stephen
2016-01-01
We present a model for generating probabilistic forecasts by combining kernel density estimation (KDE) and quantile regression techniques, as part of the probabilistic load forecasting track of the Global Energy Forecasting Competition 2014. The KDE method is initially implemented with a time-decay parameter; we later improve it by conditioning on the temperature or the period of the week to provide more accurate forecasts. We also develop a simple but effective quantile regression forecast. The novel aspects of our methodology are two-fold. First, we introduce symmetry into the time-decay parameter of the KDE-based forecast. Second, we combine three probabilistic forecasts with different weights for different periods of the month.
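The time-decayed KDE part of such a forecast can be sketched directly. This uses a Gaussian kernel, a single decay parameter, and no conditioning on temperature or period of the week, so it is only the starting point of the method described above; names and values are illustrative.

```python
import numpy as np

def kde_quantile_forecast(history, qs, h=1.0, decay=0.98):
    """Forecast quantiles from a kernel density estimate in which the
    observation of age t gets weight decay**t, so recent load dominates."""
    ages = np.arange(len(history))[::-1]          # 0 for the newest value
    w = decay ** ages
    w /= w.sum()
    grid = np.linspace(history.min() - 3 * h, history.max() + 3 * h, 500)
    u = (grid[:, None] - history[None, :]) / h
    pdf = (w * np.exp(-0.5 * u**2)).sum(axis=1) / (np.sqrt(2.0 * np.pi) * h)
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]                                # normalised CDF on the grid
    return np.interp(qs, cdf, grid)               # invert the CDF at qs

rng = np.random.default_rng(5)
load = rng.normal(50.0, 5.0, size=300)            # stand-in for hourly load
q10, q50, q90 = kde_quantile_forecast(load, [0.1, 0.5, 0.9])
print(q10 < q50 < q90)                            # quantiles come out ordered
```

Conditioning would amount to restricting (or re-weighting) `history` to observations from similar temperatures or the same period of the week before forming the density.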
Directory of Open Access Journals (Sweden)
José Fajardo
2012-12-01
This paper uses the approach of Liu et al. (2007) to estimate the option-implied risk-neutral density (RND), real-world density (RWD), and relative risk aversion from the Brazilian Real/US Dollar exchange rate distribution. Our empirical application uses a sample of exchange-traded Brazilian Real currency options from 1999 to 2011. Our estimated value of the relative risk aversion is around 2.7, which is in line with other articles on the Brazilian economy. Our out-of-sample results show that the RND has some ability to forecast the Brazilian Real exchange rate, but when we incorporate risk aversion, the out-of-sample performance improves substantially.
Comparing crowding in human and ideal observers
van den Berg, Ronald; Johnson, Addie; Anton, Angela Martinez; Schepers, Anne L.; Cornelissen, Frans W.
2012-01-01
A visual target is more difficult to recognize when it is surrounded by other, similar objects. This breakdown in object recognition is known as crowding. Despite a long history of experimental work, computational models of crowding are still sparse. Specifically, few studies have examined crowding
Emergency department crowding: Factors influencing flow
van der Linden, M.C.
2015-01-01
This thesis focuses on emergency department (ED) crowding. In the first part (ED crowding in the Netherlands) the current state of EDs regarding patients’ length of stay and ED managers’ experiences of crowding are described. Part two (input factors) contains three studies which describe the case lo
Crowd behaviour during high-stress evacuations in an immersive virtual environment
Kapadia, Mubbasir; Thrash, Tyler; Sumner, Robert W.; Gross, Markus; Helbing, Dirk; Hölscher, Christoph
2016-01-01
Understanding the collective dynamics of crowd movements during stressful emergency situations is central to reducing the risk of deadly crowd disasters. Yet, their systematic experimental study remains a challenging open problem due to ethical and methodological constraints. In this paper, we demonstrate the viability of shared three-dimensional virtual environments as an experimental platform for conducting crowd experiments with real people. In particular, we show that crowds of real human subjects moving and interacting in an immersive three-dimensional virtual environment exhibit typical patterns of real crowds as observed in real-life crowded situations. These include the manifestation of social conventions and the emergence of self-organized patterns during egress scenarios. High-stress evacuation experiments conducted in this virtual environment reveal movements characterized by mass herding and dangerous overcrowding as they occur in crowd disasters. We describe the behavioural mechanisms at play under such extreme conditions and identify critical zones where overcrowding may occur. Furthermore, we show that herding spontaneously emerges from a density effect without the need to assume an increase of the individual tendency to imitate peers. Our experiments reveal the promise of immersive virtual environments as an ethical, cost-efficient, yet accurate platform for exploring crowd behaviour in high-risk situations with real human subjects. PMID:27605166
The Financial Impact of Emergency Department Crowding
Directory of Open Access Journals (Sweden)
Foley, Mathew
2011-05-01
Objective: The economic benefits of reducing emergency department (ED) crowding are potentially substantial, as reduced crowding may decrease hospital length of stay. Hospital administrators and public officials may therefore be motivated to implement crowding protocols. We sought to identify a potential cost of ED crowding by evaluating the contribution of excess ED length of stay (LOS) to overall hospital length of stay. Methods: We performed a retrospective review of administrative data on adult patients from two urban hospitals (one county and one university) in Brooklyn, New York from 2006-2007. Data were provided by each facility. Extrapolating from prior research (Krochmal and Riley, 2005), we determined the increase in total hospital LOS due to extended ED lengths of stay, and applied cost and charge analyses for the two separate facilities. Results: We determined that 6,205 (5.0%) admitted adult patients at the county facility and 3,017 (3.4%) patients at the university facility were held in the ED for more than one day over a one-year period. From prior research, it has been estimated that each of these patients' total hospital length of stay was increased on average by 11.7% (0.61 days at the county facility, and 0.71 days at the university facility). The increased charges over one year at the county facility due to the extended ED LOS were therefore approximately $9.8 million, while the increased costs at the university facility were approximately $3.9 million. Conclusion: Based on extrapolations from Krochmal and Riley applied to two New York urban hospitals, the county hospital could potentially save $9.8 million in charges and the university hospital $3.9 million in costs per year if they eliminated ED boarding of admitted adult patients by improving movement to the inpatient setting. [West J Emerg Med. 2011;12(2):192-197.]
Stewart, Robert; White, Devin; Urban, Marie; Morton, April; Webster, Clayton; Stoyanov, Miroslav; Bright, Eddie; Bhaduri, Budhendra L.
2013-05-01
The Population Density Tables (PDT) project at Oak Ridge National Laboratory (www.ornl.gov) is developing population density estimates for specific human activities under normal patterns of life based largely on information available in open source. Currently, activity-based density estimates are based on simple summary data statistics such as range and mean. Researchers are interested in improving activity estimation and uncertainty quantification by adopting a Bayesian framework that considers both data and sociocultural knowledge. Under a Bayesian approach, knowledge about population density may be encoded through the process of expert elicitation. Due to the scale of the PDT effort which considers over 250 countries, spans 50 human activity categories, and includes numerous contributors, an elicitation tool is required that can be operationalized within an enterprise data collection and reporting system. Such a method would ideally require that the contributor have minimal statistical knowledge, require minimal input by a statistician or facilitator, consider human difficulties in expressing qualitative knowledge in a quantitative setting, and provide methods by which the contributor can appraise whether their understanding and associated uncertainty was well captured. This paper introduces an algorithm that transforms answers to simple, non-statistical questions into a bivariate Gaussian distribution as the prior for the Beta distribution. Based on geometric properties of the Beta distribution parameter feasibility space and the bivariate Gaussian distribution, an automated method for encoding is developed that responds to these challenging enterprise requirements. Though created within the context of population density, this approach may be applicable to a wide array of problem domains requiring informative priors for the Beta distribution.
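For contrast with the bivariate-Gaussian encoding the paper develops, the simplest possible elicitation rule is plain moment matching: map a contributor's best-guess proportion and a coarse confidence level to a pseudo-count and read off Beta parameters. Everything below (the question wording, the pseudo-count values) is illustrative, not the paper's algorithm.

```python
# Moment-matching stand-in for expert elicitation of a Beta prior.
def beta_from_answers(typical, confidence):
    """typical: the expert's best guess for the proportion, in (0, 1).
    confidence: 'low' | 'medium' | 'high', mapped to a pseudo-count n,
    so the prior behaves like n imaginary observations."""
    pseudo_counts = {"low": 4.0, "medium": 16.0, "high": 64.0}
    n = pseudo_counts[confidence]
    alpha = typical * n
    beta = (1.0 - typical) * n
    return alpha, beta

a, b = beta_from_answers(0.25, "medium")
print(a, b)               # 4.0 12.0 -> Beta(4, 12)
print(a / (a + b))        # prior mean 0.25, matching the expert's guess
```

The pseudo-count acts as the number of observations the expert's opinion is worth. The paper's approach instead captures the answers as a bivariate Gaussian over the feasible (alpha, beta) region, which lets the contributor appraise whether their uncertainty was well captured.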
On the rate of convergence of the maximum likelihood estimator of a k-monotone density
Institute of Scientific and Technical Information of China (English)
Gao, FuChang; Wellner, Jon A.
2009-01-01
Bounds for the bracketing entropy of the classes of bounded k-monotone functions on [0, A] are obtained under both the Hellinger distance and the Lp(Q) distance, where 1 ≤ p < ∞ and Q is a probability measure on [0, A]. The result is then applied to obtain the rate of convergence of the maximum likelihood estimator of a k-monotone density.
DEFF Research Database (Denmark)
Rosholm, A; Hyldstrup, L; Backsgaard, L
2002-01-01
A new automated radiogrammetric method to estimate bone mineral density (BMD) from a single radiograph of the hand and forearm is described. Five regions of interest in the radius, ulna and the three middle metacarpal bones are identified and approximately 1800 geometrical measurements from these bones ...-ray absorptiometry (r = 0.86, p ... Relative to this age-related loss, the reported short ... sites and a precision that potentially allows for relatively short observation intervals. Publication date: 2001.
2014-06-29
Centro de Geofísica, Universidade de Lisboa, Lisbon, Portugal. Award Number: N00014-11-1-0615. This project was a collaborative project between ... submitted or in prep) from the University of St Andrews (UStA) and Universidade de Lisboa (UL) research effort. The work has also generated multiple ... routines. Task 1.4. Use distance sampling software, Distance (Thomas et al. 2010), to estimate seasonal density, incorporating covariates affecting ...
Stochastic estimation of level density in nuclear shell-model calculations
Directory of Open Access Journals (Sweden)
Shimizu Noritaka
2016-01-01
Full Text Available A method for stochastically estimating the nuclear level density from nuclear shell-model calculations is introduced. In order to count the number of eigenvalues of the shell-model Hamiltonian matrix, we perform a contour integral of the matrix element of a resolvent. The shifted block Krylov subspace method enables its efficient computation. Utilizing this method, the contamination from center-of-mass motion is cleanly removed.
2015-09-30
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Effect of Broadband Nature of Marine Mammal Echolocation ... modeled for different marine mammal species and detectors and assess the magnitude of error on the estimated density due to various commonly used ... noise limited (von Benda-Beckmann et al. 2010). A three-hour segment, previously audited by human operators to ensure no marine mammals were present in ...
Use of spatial capture-recapture modeling and DNA data to estimate densities of elusive animals
Kery, Marc; Gardner, Beth; Stoeckle, Tabea; Weber, Darius; Royle, J. Andrew
2011-01-01
Assessment of abundance, survival, recruitment rates, and density (i.e., population assessment) is especially challenging for elusive species most in need of protection (e.g., rare carnivores). Individual identification methods, such as DNA sampling, provide ways of studying such species efficiently and noninvasively. Additionally, statistical methods that correct for undetected animals and account for locations where animals are captured are available to efficiently estimate density and other demographic parameters. We collected hair samples of European wildcat (Felis silvestris) from cheek-rub lure sticks, extracted DNA from the samples, and identified each animal's genotype. To estimate the density of wildcats, we used Bayesian inference in a spatial capture-recapture model. We used WinBUGS to fit a model that accounted for differences in detection probability among individuals and seasons and between two lure arrays. We detected 21 individual wildcats (including possible hybrids) 47 times. Wildcat density was estimated at 0.29/km2 (SE 0.06), and 95% of the activity of wildcats was estimated to occur within 1.83 km from their home-range center. Lures located systematically were associated with a greater number of detections than lures placed in a cell on the basis of expert opinion. Detection probability of individual cats was greatest in late March. Our model is a generalized linear mixed model; hence, it can be easily extended, for instance, to incorporate trap- and individual-level covariates. We believe that the combined use of noninvasive sampling techniques and spatial capture-recapture models will improve population assessments, especially for rare and elusive animals.
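Spatial capture-recapture models of this kind separate density from a detection function that decays with distance to an animal's activity centre. A minimal sketch, assuming the common half-normal detection function (the abstract does not state which form was fitted) and inverting the reported 1.83 km 95%-activity radius under a bivariate-normal home-range model:

```python
import math

def halfnormal_detection(d, p0, sigma):
    """Half-normal SCR detection function: encounter probability
    decays with distance d (km) from the activity centre."""
    return p0 * math.exp(-d**2 / (2.0 * sigma**2))

# For a bivariate-normal home range, 95% of activity falls within
# sigma * sqrt(chi2_{2, 0.95}) = sigma * 2.4477 of the centre, so the
# paper's 1.83 km radius implies a movement scale sigma of ~0.75 km.
sigma = 1.83 / math.sqrt(5.991)   # 5.991 = chi-square 0.95 quantile, 2 df
```

The baseline probability p0 would be estimated from the lure-stick encounter histories; it is a free parameter here.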
Kernel Density Feature Points Estimator for Content-Based Image Retrieval
Zuva, Tranos; Ojo, Sunday O; Ngwira, Seleman M
2012-01-01
Research is taking place to find effective algorithms for content-based image representation and description. A substantial number of algorithms are available that use visual features (color, shape, texture). The shape feature has attracted so much attention from researchers that many shape representation and description algorithms exist in the literature. These algorithms are usually not application-independent or robust, making them undesirable for generic shape description. This paper presents an object shape representation using a Kernel Density Feature Points Estimator (KDFPE). In this method, the density of feature points within defined rings around the centroid of the image is obtained. The KDFPE is then applied to the vector of the image. KDFPE is invariant to translation, scale and rotation. This method of image representation shows an improved retrieval rate when compared to the Density Histogram Feature Points (DHFP) method. Analytic analysis is done to justify our m...
Estimation of Plasma Density by Surface Plasmons for Surface-Wave Plasmas
Institute of Scientific and Technical Information of China (English)
CHEN Zhao-Quan; LIU Ming-Hai; LAN Chao-Hui; CHEN Wei; LUO Zhi-Qing; HU Xi-Wei
2008-01-01
An estimation method for plasma density based on surface-plasmon theory is proposed for surface-wave plasmas. The number of standing waves is obtained directly from the discharge image, and the propagation constant is calculated from the trim size of the apparatus; the plasma density is then determined to be 9.1 × 10^17 m^-3. The plasma density measured with a Langmuir probe is 8.1 × 10^17 m^-3, which is very close to the value predicted by surface-plasmon theory. A numerical simulation using the finite-difference time-domain (FDTD) method is also used to check the number of standing waves. All results are consistent between the theoretical analysis and the experimental measurements.
A new approach on seismic mortality estimations based on average population density
Zhu, Xiaoxin; Sun, Baiqing; Jin, Zhanyong
2016-12-01
This study examines a new methodology to predict the final seismic mortality from earthquakes in China. Most studies have established an association between mortality estimation and seismic intensity without considering population density. In China, however, such data are not always available, especially in the very urgent relief situation following a disaster, and population density varies greatly from region to region. This motivates the development of empirical models that use historical death data to analyze death tolls from earthquakes. The present paper employs the average population density to predict final death tolls using a case-based reasoning model from a realistic perspective. To validate the forecasting results, historical data from 18 large-scale earthquakes that occurred in China are used to estimate the seismic mortality of each case, and a typical earthquake case from the northwest of Sichuan Province is employed to demonstrate the estimation of the final death toll. The strength of this paper is that it provides scientific methods with overall forecast errors lower than 20%, and opens the door to conducting final death forecasts with a combined qualitative and quantitative approach. Limitations and future research are also discussed in the conclusion.
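Case-based reasoning of this kind can be sketched as nearest-case retrieval over historical earthquakes. All field names, distance weights and case figures below are illustrative assumptions, not the paper's actual case base or similarity measure:

```python
def predict_deaths(query, cases, k=3):
    """Toy case-based reasoning: retrieve the k most similar historical
    earthquakes (by magnitude and average population density) and
    average their death tolls."""
    def dist(c):
        # Ad-hoc similarity: density rescaled so both terms are comparable.
        return ((c["magnitude"] - query["magnitude"]) ** 2
                + ((c["density"] - query["density"]) / 100.0) ** 2)
    nearest = sorted(cases, key=dist)[:k]
    return sum(c["deaths"] for c in nearest) / k

# Illustrative case base (persons/km2, death tolls are made up).
cases = [
    {"magnitude": 7.0, "density": 200.0, "deaths": 1500},
    {"magnitude": 6.5, "density": 150.0, "deaths": 400},
    {"magnitude": 8.0, "density": 300.0, "deaths": 60000},
    {"magnitude": 6.0, "density": 50.0, "deaths": 20},
]
estimate = predict_deaths({"magnitude": 6.8, "density": 180.0}, cases, k=2)
```

For the query above, the two nearest cases are the first two, so the estimate is their average, 950 deaths.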
Wavelet-based density estimation for noise reduction in plasma simulations using particles
van yen, Romain Nguyen; del-Castillo-Negrete, Diego; Schneider, Kai; Farge, Marie; Chen, Guangye
2010-04-01
For given computational resources, the accuracy of plasma simulations using particles is mainly limited by the noise due to limited statistical sampling in the reconstruction of the particle distribution function. A method based on wavelet analysis is proposed and tested to reduce this noise. The method, known as wavelet-based density estimation (WBDE), was previously introduced in the statistical literature to estimate probability densities given a finite number of independent measurements. Its novel application to plasma simulations can be viewed as a natural extension of the finite size particles (FSP) approach, with the advantage of estimating more accurately distribution functions that have localized sharp features. The proposed method preserves the moments of the particle distribution function to a good level of accuracy, has no constraints on the dimensionality of the system, does not require an a priori selection of a global smoothing scale, and is able to adapt locally to the smoothness of the density based on the given discrete particle data. Moreover, the computational cost of the denoising stage is of the same order as one time step of a FSP simulation. The method is compared with a recently proposed proper orthogonal decomposition based method, and it is tested with three particle data sets involving different levels of collisionality and interaction with external and self-consistent fields.
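The WBDE idea (threshold the wavelet coefficients of a binned density so that sampling noise is removed while sharp localized features survive) can be sketched in one dimension. This toy version assumes a Haar basis and a hard threshold; the paper's wavelet family and threshold rule may differ:

```python
import numpy as np

def haar_denoise(counts, threshold):
    """Estimate a density from binned particle counts by hard-thresholding
    an orthonormal Haar wavelet transform of the histogram (a toy
    one-dimensional stand-in for WBDE). Length must be a power of two."""
    x = counts.astype(float)
    n = len(x)
    details = []
    while n > 1:                                   # forward Haar transform
        s = (x[0:n:2] + x[1:n:2]) / np.sqrt(2.0)   # smooth coefficients
        d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)   # detail coefficients
        details.append(d)
        x = s
        n //= 2
    # Kill small detail coefficients: they are mostly sampling noise.
    details = [np.where(np.abs(d) > threshold, d, 0.0) for d in details]
    for d in reversed(details):                    # inverse transform
        m = len(d)
        y = np.empty(2 * m)
        y[0::2] = (x[:m] + d) / np.sqrt(2.0)
        y[1::2] = (x[:m] - d) / np.sqrt(2.0)
        x = y
    return x
```

Because only detail coefficients are thresholded, the total mass (zeroth moment) of the histogram is preserved exactly, echoing the abstract's moment-preservation property.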
Use of prediction methods to estimate true density of active pharmaceutical ingredients.
Cao, Xiaoping; Leyva, Norma; Anderson, Stephen R; Hancock, Bruno C
2008-05-01
True density is a fundamental and important property of active pharmaceutical ingredients (APIs). Using prediction methods to estimate the API true density can be very beneficial in pharmaceutical research and development, especially when experimental measurements cannot be made due to lack of material or sample handling restrictions. In this paper, two empirical prediction methods, developed by Girolami and by Immirzi and Perini, were used to estimate the true density of APIs, and the estimation results were compared with values measured experimentally by helium pycnometry. The Girolami method is simple and can be used for both liquids and solids. For the tested APIs, the Girolami method had a maximum error of -12.7% and an average percent error of -3.0% with a 95% CI of (-3.8, -2.3%). The Immirzi and Perini method is more involved and is mainly used for solid crystals. In general, it gives better predictions than the Girolami method. For the tested APIs, the Immirzi and Perini method had a maximum error of 9.6% and an average percent error of 0.9% with a 95% CI of (0.3, 1.6%).
Energy Technology Data Exchange (ETDEWEB)
Balsa Terzic, Gabriele Bassi
2011-07-01
In this paper we discuss representations of charged-particle densities in particle-in-cell (PIC) simulations, analyze the sources and profiles of the intrinsic numerical noise, and present efficient methods for its removal. We devise two alternative estimation methods for the charged-particle distribution which represent a significant improvement over the Monte Carlo cosine expansion used in the 2D code of Bassi, designed to simulate coherent synchrotron radiation (CSR) in charged particle beams. The density is first binned onto a finite grid, after which two grid-based methods are employed to approximate the particle distribution: (i) a truncated fast cosine transform (TFCT); and (ii) a thresholded wavelet transform (TWT). We demonstrate that these alternative methods represent a substantial upgrade over the original Monte Carlo cosine expansion in terms of efficiency, while the TWT approximation also provides an appreciable improvement in accuracy, which comes from a judicious removal of the numerical noise enabled by the wavelet formulation. The TWT method is then integrated into Bassi's CSR code and benchmarked against the original version. We show that the new density estimation method provides superior performance in terms of efficiency and spatial resolution, thus enabling high-fidelity simulations of CSR effects, including the microbunching instability.
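The TFCT branch of the approach (bin the density onto a grid, expand in cosine modes, drop the high-order modes that mostly carry sampling noise) can be sketched as follows; the truncation rule and normalization here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def truncated_cosine_density(hist, keep):
    """Smooth a binned density by keeping only the first `keep`
    cosine-series (DCT-II) coefficients -- a toy version of the
    truncated fast cosine transform (TFCT) idea."""
    n = len(hist)
    j = np.arange(n)
    k = np.arange(n)[:, None]
    basis = np.cos(np.pi * k * (j + 0.5) / n)   # DCT-II basis, rows = modes
    coeffs = basis @ hist * (2.0 / n)
    coeffs[keep:] = 0.0                          # truncate high-order modes
    # Inverse series: f_j = c_0/2 + sum_{k>=1} c_k cos(pi k (j+0.5)/n)
    return coeffs[0] / 2.0 + basis[1:].T @ coeffs[1:]
```

Keeping all n coefficients reproduces the histogram exactly; an aggressive truncation returns a heavily smoothed profile, trading spatial resolution against noise.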
Examining the impact of the precision of address geocoding on estimated density of crime locations
Harada, Yutaka; Shimada, Takahito
2006-10-01
This study examines the impact of the precision of address geocoding on the estimated density of crime locations in a large urban area of Japan. The data consist of two separate sets of the same Penal Code offenses known to the police that occurred during the nine-month period of April 1, 2001 through December 31, 2001 in the central 23 wards of Tokyo. These two data sets are derived from the older and newer recording systems of the Tokyo Metropolitan Police Department (TMPD), which revised its crime reporting system in that year so that more precise location information than in previous years could be recorded. Each of these data sets was address-geocoded onto a large-scale digital map using our hierarchical address-geocoding schema, and we examined how such differences in the precision of address information, and the resulting differences in geocoded incident locations, affect the patterns in kernel density maps. An analysis using 11,096 pairs of incidents of residential burglary (each pair consists of the same incident geocoded using older and newer address information, respectively) indicates that kernel density estimation with a cell size of 25×25 m and a bandwidth of 500 m may work quite well in absorbing the poorer precision of locations geocoded from the older recording system, whereas in several areas where the older recording system yielded very poor precision, the inaccuracy of incident locations may produce artifactual and potentially misleading patterns in kernel density maps.
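A kernel density surface with the study's grid settings (25 m cells, 500 m bandwidth) can be sketched directly. The Gaussian kernel and the 2 km square study window below are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def kernel_density_grid(points, cell=25.0, bandwidth=500.0, size=2000.0):
    """Gaussian kernel density surface for point locations (in metres)
    on a square window, mirroring the study's 25 m cells and 500 m
    bandwidth. Edge correction is omitted for brevity."""
    edges = np.arange(0.0, size + cell, cell)
    centres = edges[:-1] + cell / 2.0
    gx, gy = np.meshgrid(centres, centres)
    dens = np.zeros_like(gx)
    for x, y in points:
        d2 = (gx - x) ** 2 + (gy - y) ** 2
        dens += np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Normalize so the surface integrates to ~1 over the plane.
    return dens / (2.0 * np.pi * bandwidth ** 2 * len(points))
```

With a 500 m bandwidth, a geocoding error of a few tens of metres barely moves the kernel, which is the intuition behind the paper's finding that the coarser older records are largely "absorbed" at this smoothing scale.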
Carroll, Raymond J.
2011-03-01
In many applications we can expect that, or are interested to know if, a density function or a regression curve satisfies some specific shape constraints. For example, when the explanatory variable, X, represents the value taken by a treatment or dosage, the conditional mean of the response, Y , is often anticipated to be a monotone function of X. Indeed, if this regression mean is not monotone (in the appropriate direction) then the medical or commercial value of the treatment is likely to be significantly curtailed, at least for values of X that lie beyond the point at which monotonicity fails. In the case of a density, common shape constraints include log-concavity and unimodality. If we can correctly guess the shape of a curve, then nonparametric estimators can be improved by taking this information into account. Addressing such problems requires a method for testing the hypothesis that the curve of interest satisfies a shape constraint, and, if the conclusion of the test is positive, a technique for estimating the curve subject to the constraint. Nonparametric methodology for solving these problems already exists, but only in cases where the covariates are observed precisely. However in many problems, data can only be observed with measurement errors, and the methods employed in the error-free case typically do not carry over to this error context. In this paper we develop a novel approach to hypothesis testing and function estimation under shape constraints, which is valid in the context of measurement errors. Our method is based on tilting an estimator of the density or the regression mean until it satisfies the shape constraint, and we take as our test statistic the distance through which it is tilted. Bootstrap methods are used to calibrate the test. The constrained curve estimators that we develop are also based on tilting, and in that context our work has points of contact with methodology in the error-free case.
Eye movements, search and crowding
Vlaskamp, B.N.S.
2006-01-01
If you fixate a single letter in a text, you will notice that it is impossible to identify a letter that is only a few letters away. This is caused by the letters that flank that target letter, and is an example of the 'crowding' phenomenon: items that are close enough to each other interfere with each other's identification.
Review on Vehicular Speed, Density Estimation and Classification Using Acoustic Signal
Directory of Open Access Journals (Sweden)
Prashant Borkar
2013-09-01
Full Text Available Traffic monitoring and parameter estimation, from urban to non-urban (battlefield) environments, is a fast-emerging field based on acoustic signals. We present a comprehensive review of the state of the art in acoustic-signal-based vehicular speed estimation, density estimation and classification, a critical analysis, and an outlook on future research directions. This field is of increasing relevance for intelligent transport systems (ITSs). In recent years video monitoring and surveillance systems have been widely used in traffic management, and traffic parameters can be obtained from such systems, but the installation, operational and maintenance costs associated with these approaches are relatively high compared to the use of acoustic signals, which have very low installation and maintenance costs. The classification process includes a sensing unit, class definition, feature extraction, classifier application and system evaluation. The acoustic classification system is part of a multi-sensor real-time environment for traffic surveillance and monitoring. The classification accuracy achieved by the various studied algorithms is very good for the 'Heavy Weight' vehicle class compared to the 'Light Weight' category, with a slight performance degradation as vehicle speed increases. Vehicular speed estimation corresponds to average speed and traffic density measurement, and can be used substantially for optimizing traffic signal timings.
Verdoolaege, G.; Von Hellermann, M. G.; Jaspers, R.; Ichir, M. M.; Van Oost, G.
2006-11-01
The validation of diagnostic data from a nuclear fusion experiment is an important issue. The concept of Integrated Data Analysis (IDA) allows the consistent estimation of plasma parameters from heterogeneous data sets. Here, the determination of the ion effective charge (Zeff) is considered. Several diagnostic methods exist for the determination of Zeff, but the results are in general not in agreement. In this work, the problem of Zeff estimation on the TEXTOR tokamak is approached from the perspective of IDA, in the framework of Bayesian probability theory. The ultimate goal is the estimation of a full Zeff profile that is consistent both with measured bremsstrahlung emissivities and with individual impurity spectral line intensities obtained from Charge Exchange Recombination Spectroscopy (CXRS). We present an overview of the various uncertainties that enter the calculation of a Zeff profile from bremsstrahlung data on the one hand, and line intensity data on the other hand. We discuss a simple linear and nonlinear Bayesian model permitting the estimation of a central value for Zeff and the electron density ne on TEXTOR from bremsstrahlung emissivity measurements in the visible, and carbon densities derived from CXRS. Both the central Zeff and ne are sampled using an MCMC algorithm. An outlook is given towards possible model improvements.
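The MCMC step mentioned above can be illustrated with a bare random-walk Metropolis sampler. The one-parameter posterior, the "measured" value 2.5 ± 0.3 and the tuning constants below are illustrative, not TEXTOR numbers or the paper's actual model:

```python
import math
import random

def metropolis(logpost, x0, step, n, seed=1):
    """Random-walk Metropolis sampler: the kind of MCMC used to sample
    the central Z_eff and n_e (toy single-parameter version)."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)        # symmetric proposal
        lpp = logpost(xp)
        if math.log(rng.random()) < lpp - lp:  # accept/reject
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Toy Gaussian posterior: Z_eff "measured" as 2.5 +/- 0.3 (illustrative).
post = lambda z: -0.5 * ((z - 2.5) / 0.3) ** 2
chain = metropolis(post, 2.0, 0.2, 20000)
```

Discarding an initial burn-in segment, the chain's mean and spread recover the posterior's centre and width; in the paper the same machinery runs over a joint (Zeff, ne) model with real emissivity and CXRS likelihoods.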
Estimation of dislocation density from precession electron diffraction data using the Nye tensor
Energy Technology Data Exchange (ETDEWEB)
Leff, A.C. [Department of Materials Science & Engineering, Drexel University, Philadelphia, PA (United States); Weinberger, C.R. [Department of Mechanical Engineering and Mechanics, Drexel University, Philadelphia, PA (United States); Taheri, M.L., E-mail: mtaheri@coe.drexel.edu [Department of Materials Science & Engineering, Drexel University, Philadelphia, PA (United States)
2015-06-15
The Nye tensor offers a means to estimate the geometrically necessary dislocation density of a crystalline sample based on measurements of the orientation changes within individual crystal grains. In this paper, the Nye tensor theory is applied to precession electron diffraction automated crystallographic orientation mapping (PED-ACOM) data acquired using a transmission electron microscope (TEM). The resulting dislocation density values are mapped in order to visualize the dislocation structures present in a quantitative manner. These density maps are compared with other related methods of approximating local strain dependencies in dislocation-based microstructural transitions from orientation data. The effect of acquisition parameters on density measurements is examined. By decreasing the step size and spot size during data acquisition, an increasing fraction of the dislocation content becomes accessible. Finally, the method described herein is applied to the measurement of dislocation emission during in situ annealing of Cu in TEM in order to demonstrate the utility of the technique for characterizing microstructural dynamics. - Highlights: • Developed a method of mapping GND density using orientation mapping data from TEM. • As acquisition length-scale is decreased, all dislocations are considered GNDs. • Dislocation emission and corresponding grain rotation quantified.
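In its simplest one-dimensional reduction, the Nye-tensor relation estimates the geometrically necessary dislocation density from a lattice-rotation gradient as rho ≈ Δθ / (b·Δx). A worked example, with an illustrative 0.5° misorientation over a 100 nm mapping step (roughly the length scale of a PED-ACOM scan) and the known Burgers vector of Cu:

```python
import math

# Simplified 1-D form of the Nye-tensor relation: the GND density
# needed to accommodate a lattice-rotation gradient d(theta)/dx is
# rho ~ |d(theta)/dx| / b.
b = 0.2556e-9                 # Burgers vector of Cu (a/sqrt(2)), metres
dtheta = math.radians(0.5)    # 0.5 degree misorientation (illustrative)
dx = 100e-9                   # over a 100 nm acquisition step (illustrative)
rho = dtheta / (b * dx)       # GND density, m^-2 (~3.4e14 here)
```

Note how rho scales inversely with the step size dx: shrinking the acquisition step makes ever more of the dislocation content "geometrically necessary", which is exactly the acquisition-parameter effect highlighted in the abstract.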
Totaro, N.; Guyader, J. L.
2012-06-01
The present article deals with an extension of the Statistical modal Energy distribution Analysis (SmEdA) method to estimate the kinetic and potential energy density in coupled subsystems. The SmEdA method uses the modal bases of the uncoupled subsystems and focuses on modal energies rather than the global energies of subsystems, as SEA (Statistical Energy Analysis) does. This permits extending SEA to subsystems with low modal overlap or to localized excitations, as it does not assume the existence of modal energy equipartition. We demonstrate that, using the modal energies of subsystems computed by SmEdA, it is possible to estimate the energy distribution within subsystems. This approach has the same advantages as standard SEA, as it uses very short calculations to analyze damping effects. The estimation of energy distribution from SmEdA is applied to an academic case and an industrial example.
Sadeh, Iftach; Lahav, Ofer
2015-01-01
We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister and Lahav (2004). Large photometric galaxy surveys are important for cosmological studies, and in particular for characterizing the nature of dark energy. The success of such surveys greatly depends on the ability to measure photo-zs based on limited spectral data. ANNz2 utilizes multiple machine learning methods, such as artificial neural networks, boosted decision/regression trees and k-nearest neighbours. The objective of the algorithm is to dynamically optimize the performance of the photo-z estimation and to properly derive the associated uncertainties. In addition to single-value solutions, the new code also generates full probability density functions (PDFs) in two different ways. Estimators are also incorporated to mitigate possible problems of spectroscopic training samples which are not representative or are incomplete. ANNz2 is also adapted to provide optimized solution...
Optimal diffusion MRI acquisition for fiber orientation density estimation: an analytic approach.
White, Nathan S; Dale, Anders M
2009-11-01
An important challenge in the design of diffusion MRI experiments is how to optimize statistical efficiency, i.e., the accuracy with which parameters can be estimated from the diffusion data in a given amount of imaging time. In model-based spherical deconvolution analysis, the quantity of interest is the fiber orientation density (FOD). Here, we demonstrate how the spherical harmonics (SH) can be used to form an explicit analytic expression for the efficiency of the minimum variance (maximally efficient) linear unbiased estimator of the FOD. Using this expression, we calculate optimal b-values for maximum FOD estimation efficiency with SH expansion orders of L = 2, 4, 6, and 8 to be approximately b = 1,500, 3,000, 4,600, and 6,200 s/mm(2), respectively. However, the arrangement of diffusion directions and scanner-specific hardware limitations also play a role in determining the realizable efficiency of the FOD estimator that can be achieved in practice. We show how some commonly used methods for selecting diffusion directions are sometimes inefficient, and propose a new method for selecting diffusion directions in MRI based on maximizing the statistical efficiency. We further demonstrate how scanner-specific hardware limitations generally lead to optimal b-values that are slightly lower than the ideal b-values. In summary, the analytic expression for the statistical efficiency of the unbiased FOD estimator provides important insight into the fundamental tradeoff between angular resolution, b-value, and FOD estimation accuracy.
THE USE OF MATHEMATICAL MODELS FOR ESTIMATING WOOD BASIC DENSITY OF Eucalyptus sp CLONES
Directory of Open Access Journals (Sweden)
Cláudio Roberto Thiersch
2006-09-01
Full Text Available This study aimed at identifying at what point on the stem, in the longitudinal and cardinal directions, the Pilodyn penetration depth should be measured for determining wood basic density, with forestry inventory in view. The database comprised 36 plots of 400 m2, around which 216 trees were scaled, covering two clones (hybrids of E. grandis and E. urophylla) at the ages of 3, 4, 5 and 6 years, from three different sites in eastern Brazil spanning the east and northeast of Espírito Santo state and the south of Bahia state. At each height at which a diameter was measured, the penetration depth of the Pilodyn (in mm) was also recorded. The average basic density of the scaled trees was determined from chips using the immersion method. The main conclusions were: the density equation as a function of the Pilodyn measurements, age, site, diameter at 1.3 m above ground and total height was more precise, exact and stable than the density equation as a function of Pilodyn, age, site and diameter; it was precise and exact for all ages and sites, independently of whether the Pilodyn measurements were taken on the south or north face or at the average position between them. The measurement with the Pilodyn can also be taken at the more ergonomic height of 1.3 m. The density estimation as a function of the Pilodyn measurements, age, site, average dominant-tree height and diameter at 1.3 m above ground was, for both clones, more precise when the Pilodyn measurement was taken on the north face. The average tree basic density must always be obtained from a specific equation for each clone, given that these equations differ statistically.
mBEEF-vdW: Robust fitting of error estimation density functionals
Lundgaard, Keld T.; Wellendorff, Jess; Voss, Johannes; Jacobsen, Karsten W.; Bligaard, Thomas
2016-06-01
We propose a general-purpose semilocal/nonlocal exchange-correlation functional approximation, named mBEEF-vdW. The exchange is a meta generalized gradient approximation, and the correlation is a semilocal and nonlocal mixture, with the Rutgers-Chalmers approximation for van der Waals (vdW) forces. The functional is fitted within the Bayesian error estimation functional (BEEF) framework [J. Wellendorff et al., Phys. Rev. B 85, 235149 (2012), 10.1103/PhysRevB.85.235149; J. Wellendorff et al., J. Chem. Phys. 140, 144107 (2014), 10.1063/1.4870397]. We improve the previously used fitting procedures by introducing a robust MM-estimator based loss function, reducing the sensitivity to outliers in the datasets. To more reliably determine the optimal model complexity, we furthermore introduce a generalization of the bootstrap 0.632 estimator with hierarchical bootstrap sampling and geometric mean estimator over the training datasets. Using this estimator, we show that the robust loss function leads to a 10 % improvement in the estimated prediction error over the previously used least-squares loss function. The mBEEF-vdW functional is benchmarked against popular density functional approximations over a wide range of datasets relevant for heterogeneous catalysis, including datasets that were not used for its training. Overall, we find that mBEEF-vdW has a higher general accuracy than competing popular functionals, and it is one of the best performing functionals on chemisorption systems, surface energies, lattice constants, and dispersion. We also show the potential-energy curve of graphene on the nickel(111) surface, where mBEEF-vdW matches the experimental binding length. mBEEF-vdW is currently available in gpaw and other density functional theory codes through Libxc, version 3.0.0.
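The starting point of the paper's generalized estimator, the classic bootstrap 0.632 rule e = 0.368·err_train + 0.632·err_oob, can be sketched as follows; the hierarchical sampling over datasets and the geometric-mean refinement introduced by the paper are omitted:

```python
import random

def bootstrap_632(xs, ys, fit, err, n_boot=200, seed=0):
    """Classic 0.632 bootstrap prediction-error estimate: blend the
    optimistic in-sample error with the pessimistic out-of-bag error."""
    rng = random.Random(seed)
    n = len(xs)
    model = fit(xs, ys)
    train_err = sum(err(model, x, y) for x, y in zip(xs, ys)) / n
    oob_errs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]      # resample with replacement
        oob = [i for i in range(n) if i not in set(idx)]  # left-out points
        if not oob:
            continue
        m = fit([xs[i] for i in idx], [ys[i] for i in idx])
        oob_errs.append(sum(err(m, xs[i], ys[i]) for i in oob) / len(oob))
    oob_err = sum(oob_errs) / len(oob_errs)
    return 0.368 * train_err + 0.632 * oob_err

# Toy model: predict every y by the training mean, squared-error loss.
xs = list(range(5))
ys = [1.0, 2.0, 3.0, 4.0, 5.0]
est = bootstrap_632(xs, ys, lambda X, Y: sum(Y) / len(Y),
                    lambda m, x, y: (y - m) ** 2)
```

The blend corrects the downward bias of the in-sample error: on average each bootstrap sample leaves out about 36.8% of the points, which is where the 0.632 weight comes from.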
Chestnut, Tara E.; Anderson, Chauncey; Popa, Radu; Blaustein, Andrew R.; Voytek, Mary; Olson, Deanna H.; Kirshtein, Julie
2014-01-01
Biodiversity losses are occurring worldwide due to a combination of stressors. For example, by one estimate, 40% of amphibian species are vulnerable to extinction, and disease is one threat to amphibian populations. The emerging infectious disease chytridiomycosis, caused by the aquatic fungus Batrachochytrium dendrobatidis (Bd), is a contributor to amphibian declines worldwide. Bd research has focused on the dynamics of the pathogen in its amphibian hosts, with little emphasis on investigating the dynamics of free-living Bd. Therefore, we investigated patterns of Bd occupancy and density in amphibian habitats using occupancy models, powerful tools for estimating site occupancy and detection probability. Occupancy models have been used to investigate diseases where the focus was on pathogen occurrence in the host. We applied occupancy models to investigate free-living Bd in North American surface waters to determine Bd seasonality, relationships between Bd site occupancy and habitat attributes, and probability of detection from water samples as a function of the number of samples, sample volume, and water quality. We also report on the temporal patterns of Bd density from a 4-year case study of a Bd-positive wetland. We provide evidence that Bd occurs in the environment year-round. Bd exhibited temporal and spatial heterogeneity in density, but did not exhibit seasonality in occupancy. Bd was detected in all months, typically at less than 100 zoospores L−1. The highest density observed was ∼3 million zoospores L−1. We detected Bd in 47% of sites sampled, but estimated that Bd occupied 61% of sites, highlighting the importance of accounting for imperfect detection. When Bd was present, there was a 95% chance of detecting it with four samples of 600 ml of water or five samples of 60 mL. Our findings provide important baseline information to advance the study of Bd disease ecology, and advance our understanding of amphibian exposure to free-living Bd in aquatic
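The sampling result above (a 95% chance of detection with four 600 mL samples) follows the standard independent-samples relation; a minimal sketch, assuming a constant per-sample detection probability when Bd is present (the 0.53 value in the usage note is back-calculated for illustration, not reported by the authors):

```python
import math

def cumulative_detection_prob(p_single, n_samples):
    """P(at least one detection) = 1 - (1 - p)^n for n independent samples."""
    return 1.0 - (1.0 - p_single) ** n_samples

def samples_needed(p_single, target=0.95):
    """Smallest n whose cumulative detection probability reaches target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_single))
```

With a per-sample detection probability of roughly 0.53 per 600 mL sample, four samples give approximately the 95% cumulative chance reported in the abstract.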
A long-term evaluation of biopsy darts and DNA to estimate cougar density
Beausoleil, Richard A.; Clark, Joseph D.; Maletzke, Benjamin T.
2016-01-01
Accurately estimating cougar (Puma concolor) density is usually based on long-term research consisting of intensive capture and Global Positioning System collaring efforts and may cost hundreds of thousands of dollars annually. Because wildlife agency budgets rarely accommodate this approach, most infer cougar density from published literature, rely on short-term studies, or use hunter harvest data as a surrogate in their jurisdictions; all of which may limit accuracy and increase risk of management actions. In an effort to develop a more cost-effective long-term strategy, we evaluated a research approach using citizen scientists with trained hounds to tree cougars and collect tissue samples with biopsy darts. We then used the DNA to individually identify cougars and employed spatially explicit capture-recapture models to estimate cougar densities. Overall, 240 tissue samples were collected in northeastern Washington, USA, producing 166 genotypes (including recaptures and excluding dependent kittens) of 133 different cougars (8-25/yr) from 2003 to 2011. Mark-recapture analyses revealed a mean density of 2.2 cougars/100 km2 (95% CI = 1.1-4.3) and stable to decreasing population trends (β = -0.048, 95% CI = -0.106 to 0.011) over the 9 years of study, with an average annual harvest rate of 14% (range = 7-21%). The average annual cost per year for field sampling and genotyping was US$11,265 ($422.24/sample or $610.73/successfully genotyped sample). Our results demonstrated that long-term biopsy sampling using citizen scientists can increase capture success and provide reliable cougar-density information at a reasonable cost.
Ahn, Yongjun; Yeo, Hwasoo
2015-01-01
The charging infrastructure location problem is becoming more significant due to the extensive adoption of electric vehicles. Efficient charging station planning can solve deeply rooted problems, such as driving-range anxiety and the stagnation of new electric vehicle consumers. In the initial stage of introducing electric vehicles, the allocation of charging stations is difficult to determine due to the uncertainty of candidate sites and unidentified charging demands, which are determined by diverse variables. This paper introduces the Estimating the Required Density of EV Charging (ERDEC) stations model, which is an analytical approach to estimating the optimal density of charging stations for certain urban areas, which are subsequently aggregated to city level planning. The optimal charging station's density is derived to minimize the total cost. A numerical study is conducted to obtain the correlations among the various parameters in the proposed model, such as regional parameters, technological parameters and coefficient factors. To investigate the effect of technological advances, the corresponding changes in the optimal density and total cost are also examined by various combinations of technological parameters. Daejeon city in South Korea is selected for the case study to examine the applicability of the model to real-world problems. With real taxi trajectory data, the optimal density map of charging stations is generated. These results can provide the optimal number of chargers for driving without driving-range anxiety. In the initial planning phase of installing charging infrastructure, the proposed model can be applied to a relatively extensive area to encourage the usage of electric vehicles, especially areas that lack information, such as exact candidate sites for charging stations and other data related with electric vehicles. The methods and results of this paper can serve as a planning guideline to facilitate the extensive adoption of electric
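The cost-minimizing density idea can be illustrated with a toy trade-off (not the authors' actual ERDEC formulation): installation cost grows linearly with station density while average access cost falls as density rises, giving a closed-form optimum.

```python
def optimal_station_density(install_cost, access_cost_coeff):
    """Minimize f(d) = install_cost * d + access_cost_coeff / sqrt(d).

    Setting f'(d) = install_cost - access_cost_coeff / (2 * d**1.5) = 0
    gives d* = (access_cost_coeff / (2 * install_cost)) ** (2/3).
    Both cost coefficients are illustrative placeholders.
    """
    return (access_cost_coeff / (2.0 * install_cost)) ** (2.0 / 3.0)

def total_cost(d, install_cost, access_cost_coeff):
    """Total cost per unit area at station density d."""
    return install_cost * d + access_cost_coeff / d ** 0.5
```

The 1/sqrt(d) access term reflects that the expected distance to the nearest of d stations per unit area scales like d to the power -1/2; the paper's model additionally folds in regional and technological parameters.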
Directory of Open Access Journals (Sweden)
Paulo Ricardo Gherardi Hein
2009-06-01
Full Text Available Wood basic density is indicative of several other wood properties and is considered as a key feature for many industrial applications. Near infrared spectroscopy (NIRS) is a fast, efficient technique that is capable of estimating that property. However, it should be improved in order to complement the often time-consuming and costly conventional method. Research on wood technological properties using near infrared spectroscopy has shown promising results. Thus the aim of this study was to evaluate the efficiency of near infrared spectroscopy for estimating wood basic density in both Eucalyptus urophylla and Eucalyptus grandis. The coefficients of determination of the predictive models for cross validation ranged between 0.74 and 0.86 and the ratio performance of deviation (RPD) ranged between 1.9 and 2.7. The application of a spectral filter, detection and removal of outlier samples, and selection of variables (wavelengths) improved the adjustment of calibrations, thereby reducing the standard error of calibration (SEC) and cross validation (SECV) as well as increasing the coefficient of determination (R²) and the RPD value. The technique of near infrared spectroscopy can therefore be used for predicting wood basic density in Eucalyptus urophylla and Eucalyptus grandis.
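The RPD statistic quoted above has a simple definition; a sketch, using the sample standard deviation of the reference values and the root-mean-square cross-validation error (conventions for the SECV denominator vary slightly between chemometrics texts):

```python
import statistics

def rpd(reference, predicted):
    """Ratio of performance to deviation: SD(reference) / SECV,
    with SECV taken here simply as the RMSE of the predictions."""
    sd = statistics.stdev(reference)
    secv = (sum((r - p) ** 2 for r, p in zip(reference, predicted))
            / len(reference)) ** 0.5
    return sd / secv
```

An RPD of 1.9 to 2.7, as reported, means the calibration's prediction error is roughly 2 to 3 times smaller than the natural spread of the property, which is usually read as adequate for screening.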
On the method of logarithmic cumulants for parametric probability density function estimation.
Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane
2013-10-01
Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
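MoLC replaces ordinary moments with cumulants of ln X; the first two empirical log-cumulants, which MoLC equates to their analytical counterparts to solve for the distribution parameters, can be sketched as:

```python
import math
import statistics

def log_cumulants(samples):
    """First two empirical log-cumulants of a positive-valued sample:
    k1 = mean of ln(x), k2 = (population) variance of ln(x).
    MoLC matches these to the analytical k1, k2 (and k3 for
    three-parameter families) of the candidate distribution."""
    logs = [math.log(s) for s in samples]
    return statistics.fmean(logs), statistics.pvariance(logs)
```

For heavy-tailed families such as the generalized gamma or K distributions, log-cumulants exist and are stable where ordinary higher moments may not be, which is the practical appeal of the method.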
Boersen, Mark R.; Clark, Joseph D.; King, Tim L.
2003-01-01
The Recovery Plan for the federally threatened Louisiana black bear (Ursus americanus luteolus) mandates that remnant populations be estimated and monitored. In 1999 we obtained genetic material with barbed-wire hair traps to estimate bear population size and genetic diversity at the 329-km2 Tensas River Tract, Louisiana. We constructed and monitored 122 hair traps, which produced 1,939 hair samples. Of those, we randomly selected 116 subsamples for genetic analysis and used up to 12 microsatellite DNA markers to obtain multilocus genotypes for 58 individuals. We used Program CAPTURE to compute estimates of population size using multiple mark-recapture models. The area of study was almost entirely circumscribed by agricultural land, thus the population was geographically closed. Also, study-area boundaries were biologically discrete, enabling us to accurately estimate population density. Using model Chao Mh to account for possible effects of individual heterogeneity in capture probabilities, we estimated the population size to be 119 (SE=29.4) bears, or 0.36 bears/km2. We were forced to examine a substantial number of loci to differentiate between some individuals because of low genetic variation. Despite the probable introduction of genes from Minnesota bears in the 1960s, the isolated population at Tensas exhibited characteristics consistent with inbreeding and genetic drift. Consequently, the effective population size at Tensas may be as few as 32, which warrants continued monitoring or possibly genetic augmentation.
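The model Mh estimate used above is commonly computed with Chao's lower-bound estimator built from capture frequencies; a minimal sketch (Program CAPTURE's full implementation also handles higher frequency classes and variance estimation):

```python
def chao_abundance(capture_counts):
    """Chao's heterogeneity-robust lower bound on population size:
    N_hat = S_obs + f1^2 / (2 * f2), where S_obs is the number of
    distinct individuals caught, f1 the number caught exactly once,
    and f2 the number caught exactly twice."""
    s_obs = len(capture_counts)
    f1 = sum(1 for c in capture_counts if c == 1)
    f2 = sum(1 for c in capture_counts if c == 2)
    if f2 == 0:
        # Bias-corrected fallback when no individual is caught twice.
        return s_obs + f1 * (f1 - 1) / 2.0
    return s_obs + f1 * f1 / (2.0 * f2)
```

Intuitively, many singletons relative to doubletons signal that many individuals were never caught at all, pushing the estimate above the observed count.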
Soil Organic Carbon Density in Hebei Province, China: Estimates and Uncertainty
Institute of Scientific and Technical Information of China (English)
ZHAO Yong-Cun; SHI Xue-Zheng; YU Dong-Sheng; T. F. PAGELLA; SUN Wei-Xia; XU Xiang-Hua
2005-01-01
In order to improve the precision of soil organic carbon (SOC) estimates, the sources of uncertainty in soil organic carbon density (SOCD) estimates and SOC stocks were examined using 363 soil profiles in Hebei Province, China, with three methods: soil profile statistics (SPS), GIS-based soil type (GST), and kriging interpolation (KI). The GST method, utilizing both pedological professional knowledge and GIS technology, was considered the most accurate of the three estimations, with the SOCD estimates for SPS 10% lower and for KI 10% higher. The SOCD range for GST was 84% wider than that for KI, as the smoothing effect of KI narrowed the SOCD range. Nevertheless, the coefficient of variation for SOCD with KI (41.7%) was less than with GST and SPS. In the comparison of the lower SPS estimates against GST, the major source of uncertainty was the conflicting areas in the proportional relations. Meanwhile, the smaller number of soil profiles and the inherent smoothing effect were the sources of uncertainty for KI. Moreover, for local detailed variations of SOCD, GST was more advantageous than KI in reflecting the distribution pattern.
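The GIS-based soil type (GST) aggregation amounts to an area-weighted sum of per-type mean densities; schematically (the soil-type names below are illustrative, not from the paper):

```python
def soc_stock(mean_socd_by_type, area_by_type):
    """Total SOC stock = sum over soil types of
    (mean SOC density of the type) * (mapped area of the type).
    Units must agree, e.g. kg C/m2 times m2 gives kg C."""
    return sum(mean_socd_by_type[t] * area_by_type[t]
               for t in mean_socd_by_type)
```

The SPS method instead averages profile densities directly, and KI interpolates them spatially, which is why the three approaches disagree where soil-map polygons and profile locations conflict.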
The Effects of Surfactants on the Estimation of Bacterial Density in Petroleum Samples
Luna, Aderval Severino; da Costa, Antonio Carlos Augusto; Gonçalves, Márcia Monteiro Machado; de Almeida, Kelly Yaeko Miyashiro
The effect of the surfactants polyoxyethylene monostearate (Tween 60), polyoxyethylene monooleate (Tween 80), cetyl trimethyl ammonium bromide (CTAB), and sodium dodecyl sulfate (SDS) on the estimation of bacterial density (sulfate-reducing bacteria [SRB] and general anaerobic bacteria [GAnB]) was examined in petroleum samples. Three different compositions of oil and water were selected to be representative of the real samples: the first contained a high content of oil, the second a medium content, and the last a low content. The most probable number (MPN) was used to estimate the bacterial density. The results showed that the addition of surfactants did not improve the SRB quantification for the high or medium oil content in the petroleum samples. On the other hand, Tween 60 and Tween 80 promoted a significant increase in the GAnB quantification at 0.01% or 0.03% m/v concentrations, respectively. CTAB increased the SRB and GAnB estimation for the sample with a low oil content at 0.00005% and 0.0001% m/v, respectively.
Large-sample study of the kernel density estimators under multiplicative censoring
Asgharian, Masoud; Fakoor, Vahid; 10.1214/11-AOS954
2012-01-01
The multiplicative censoring model introduced in Vardi [Biometrika 76 (1989) 751--761] is an incomplete data problem whereby two independent samples from the lifetime distribution $G$, $\\mathcal{X}_m=(X_1,...,X_m)$ and $\\mathcal{Z}_n=(Z_1,...,Z_n)$, are observed subject to a form of coarsening. Specifically, sample $\\mathcal{X}_m$ is fully observed while $\\mathcal{Y}_n=(Y_1,...,Y_n)$ is observed instead of $\\mathcal{Z}_n$, where $Y_i=U_iZ_i$ and $(U_1,...,U_n)$ is an independent sample from the standard uniform distribution. Vardi [Biometrika 76 (1989) 751--761] showed that this model unifies several important statistical problems, such as the deconvolution of an exponential random variable, estimation under a decreasing density constraint and an estimation problem in renewal processes. In this paper, we establish the large-sample properties of kernel density estimators under the multiplicative censoring model. We first construct a strong approximation for the process $\\sqrt{k}(\\hat{G}-G)$, where $\\hat{G}$ is...
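For orientation, the complete-data Gaussian kernel density estimator whose large-sample behavior the paper extends can be sketched as follows; the censored-data version replaces the empirical sample with quantities derived from Vardi's estimator of G, so this is background, not the paper's estimator.

```python
import math

def gaussian_kde(samples, bandwidth):
    """Return f_hat(x) = (1 / (n * h)) * sum_i K((x - X_i) / h)
    with a standard normal kernel K."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))

    def f_hat(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)

    return f_hat
```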
Analytical Modelling of the Spread of Disease in Confined and Crowded Spaces
Goscé, Lara; Johansson, Anders
2013-01-01
Since 1927, models describing the spread of disease have mostly been of the SIR-compartmental type, based on the assumption that populations are homogeneous and well-mixed. The aim of this work is to analyse the implications that arise by taking crowd behaviour explicitly into account. Starting with a microscopic model of pedestrian movement in confined spaces, we show how both the rate of infection as well as the walking speed will depend on the local crowd density around an infected individual. The combined effect is that the rate of infection at a population scale will have an analytically tractable non-linear dependency on crowd density. As an illustrative and simple example, we will model the spread of Influenza in a simple corridor with uni-directional crowd flow and compare our new model with a state-of-the-art model, which will highlight the regime in which current models do not produce credible results.
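The population-scale consequence, an infection rate that depends nonlinearly on crowd density, can be sketched with a density-modulated SIR step; the quadratic beta below is purely illustrative, not the dependency derived in the paper, and the parameter values are placeholders.

```python
def sir_step(s, i, r, density, dt=0.1, gamma=0.2, c=0.05):
    """One forward-Euler step of an SIR model whose transmission rate
    grows nonlinearly with local crowd density (illustrative
    beta = c * density**2). s, i, r are population fractions."""
    beta = c * density ** 2
    new_inf = beta * s * i * dt   # S -> I transitions this step
    new_rec = gamma * i * dt      # I -> R transitions this step
    return s - new_inf, i + new_inf - new_rec, r + new_rec
```

Because beta rises faster than linearly with density here, doubling the crowd density more than doubles the per-step infection pressure, which is the qualitative effect the authors analyze for corridor flows.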
Banasiak, J.
2016-09-01
There has been a hierarchy of models of crowd behaviour. One can consider the crowd at the so-called microscopic level, as a collection of individuals, and derive its description in the form of a (large) system of ordinary differential equations describing the position and velocity of each individual, in parallel to Newton's description of matter, see e.g. [10]. Another possibility is to describe the crowd, in analogy to fluid dynamics, by providing its density and velocity at a given point, see e.g. [11,12]. At the same time, it is recognized that a crowd is a 'living, social' system that is prone to exhibit rare, not easily predictable, behaviour in response to stress induced by the perception of danger, or by the action of specific agents, see e.g. [1,2]. This high probability of the occurrence of events that are far from average makes crowd behaviour similar to processes with a fat-tailed distribution of events. Such unlikely events have been metaphorically termed black swans in [14], or Lévy flights in [13]. While microscopic and macroscopic models can capture many features of crowd dynamics, including obstacles, see [3,8], such models are described by differential equations that are inherently local in space. At the same time, black-swan events are often caused by non-local interactions such as self-organization, learning or adherence to some averaged group behaviour. It is known that such interactions are well described by mean-field models best represented by integro-differential equations, such as the Boltzmann equation of rarefied gas theory. This has made it plausible to introduce crowd models at the intermediate (meso) scale, describing the crowd by the one-particle distribution function that gives the density of individuals at any particular state; that is, at a given point in the domain and moving with a specific velocity.
A pdf-Free Change Detection Test Based on Density Difference Estimation.
Bu, Li; Alippi, Cesare; Zhao, Dongbin
2016-11-16
The ability to detect online changes in stationarity or time variance in a data stream is a hot research topic with striking implications. In this paper, we propose a novel probability density function-free change detection test, which is based on the least squares density-difference estimation method and operates online on multidimensional inputs. The test does not require any assumption about the underlying data distribution, and is able to operate immediately after having been configured by adopting a reservoir sampling mechanism. Thresholds requested to detect a change are automatically derived once a false positive rate is set by the application designer. Comprehensive experiments validate the effectiveness in detection of the proposed method both in terms of detection promptness and accuracy.
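The core quantity behind the test is an estimate of the squared L2 distance between the pre- and post-change densities. As a crude stand-in for the least-squares density-difference (LSDD) estimator, which fits the difference directly with Gaussian basis functions, a histogram version conveys the idea:

```python
def density_difference_l2(sample_a, sample_b, bins=20, lo=0.0, hi=1.0):
    """Approximate the squared L2 distance between the densities of two
    samples using normalized histograms on [lo, hi]. Inputs are assumed
    to lie in [lo, hi]. A crude stand-in for the LSDD estimator."""
    width = (hi - lo) / bins

    def hist_density(sample):
        counts = [0] * bins
        for x in sample:
            k = min(int((x - lo) / width), bins - 1)
            counts[k] += 1
        # Normalize so the histogram integrates to 1.
        return [c / (len(sample) * width) for c in counts]

    pa, pb = hist_density(sample_a), hist_density(sample_b)
    return sum((a - b) ** 2 for a, b in zip(pa, pb)) * width
```

A change detector would compare this statistic between a reference window and the current window against a threshold calibrated to the desired false-positive rate, which is the role the reservoir-sampling mechanism plays in the paper.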
Institute of Scientific and Technical Information of China (English)
无
2006-01-01
A method to estimate the probability density function (PDF) of shear strength parameters was proposed. The second Chebyshev orthogonal polynomial (SCOP) combined with sample moments (the origin moments) was used to approximate the PDF of the parameters. A χ2 test was adopted to verify the availability of the method. It is distribution-free because no classical theoretical distributions are assumed in advance, and the inference result provides a universal form of probability density curves. The six most commonly used theoretical distributions, namely the normal, lognormal, extreme value I, gamma, beta and Weibull distributions, were used to verify the SCOP method. An example from observed data of the cohesion c of a silt clay was presented for illustrative purposes. The results show that the acceptance levels in SCOP are all smaller than those in the classical finite comparative method and that the SCOP function is more accurate and effective in the reliability analysis of geotechnical engineering.
Directory of Open Access Journals (Sweden)
Stefano Anile
2012-07-01
Full Text Available The wildcat is an elusive species that is threatened with extinction in many areas of its European distribution. In Sicily the wildcat lives in a wide range of habitats; this study was done on Mount Etna. A previous camera-trap monitoring was conducted in 2006 (pilot study) and 2007 (first estimation of wildcat population size using camera trapping with capture-recapture analyses) in the same study area. In 2009, digital camera traps were used in pairs at each station with the aim of obtaining photographs of the wildcat. Experience and data collected from the previous studies were used to develop a protocol to estimate the density of the wildcat population using capture-recapture analyses and the coat-colour and markings system to recognize individuals. Two trap-lines adjacent to each other were run in two consecutive data-collection periods. Camera traps worked together for 1080 trap-days and we obtained 42 pictures of wildcats from 32 photographic capture events, from which 10 individuals (excluding four kittens) were identified. The capture history of each individual was constructed and the software CAPTURE was used to generate an estimate of the population density (0.22 to 0.44 wildcats/100 ha) for our study area, using two different approaches for the calculation of the effective area sampled. The wildcat population density on Mount Etna is higher than densities found throughout Europe, and is favoured by the habitat structure, prey availability, Mediterranean climate and the protection status provided by the park.
Axonal and dendritic density field estimation from incomplete single-slice neuronal reconstructions
Directory of Open Access Journals (Sweden)
Jaap van Pelt
2014-06-01
Full Text Available Neuronal information processing in cortical networks critically depends on the organization of synaptic connectivity. Synaptic connections can form when axons and dendrites come in close proximity of each other. The spatial innervation of neuronal arborizations can be described by their axonal and dendritic density fields. Recently we showed that potential locations of synapses between neurons can be estimated from their overlapping axonal and dendritic density fields. However, deriving density fields from single-slice neuronal reconstructions is hampered by incompleteness because of cut branches. Here, we describe a method for recovering the lost axonal and dendritic mass. This so-called completion method is based on an estimation of the mass inside the slice and an extrapolation to the space outside the slice, assuming axial symmetry in the mass distribution. We validated the method using a set of neurons generated with our NETMORPH simulator. The model-generated neurons were artificially sliced and subsequently recovered by the completion method. Depending on slice thickness and arbor extent, branches that have lost their outside parents (orphan branches) may occur inside the slice. No longer connected to the contiguous structure of the sliced neuron, orphan branches result in an underestimation of neurite mass. For 300 μm thick slices, however, the validation showed a full recovery of dendritic and an almost full recovery of axonal mass. The completion method was applied to three experimental data sets of reconstructed rat cortical L2/3 pyramidal neurons. The results showed that in 300 μm thick slices intracortical axons lost about 50% and dendrites about 16% of their mass. The completion method can be applied to single-slice reconstructions as long as axial symmetry can be assumed in the mass distribution. This opens up the possibility of using incomplete neuronal reconstructions from open-access databases to determine population mean
"Prospecting Asteroids: Indirect technique to estimate overall density and inner composition"
Such, Pamela
2016-07-01
Spectroscopic studies of asteroids make it possible to obtain some information on their composition from the surface, but say little about the innermost material, porosity and density of the object. In addition, spectroscopic observations are affected by the effects of "space weathering" produced by the bombardment of charged particles, which for certain materials changes their chemical structure, albedo and other physical properties, partly altering the chances of identification. Data such as the mass, size and density of asteroids are essential when proposing space missions, in order to determine the best candidates for space exploration, and it is of great importance to be able to determine them a priori, remotely from Earth. For many years the masses of the largest asteroids have been determined by studying the gravitational effects they have on smaller asteroids that approach them (see Davis and Bender, 1977; Schubart and Matson, 1979; Scholl et al. 1987; Hoffman, 1989b, among others), but estimating the masses of the smallest objects is limited to the effects that occur in extremely close encounters with other asteroids of similar size. This paper presents the results of a search for approaches of pairs of asteroids that come within less than 0.0004 AU (50,000 km) of each other, in order to study their masses through the astrometric method and, in the future, to estimate their densities and internal composition. References: Davis, D. R., and D. F. Bender. 1977. Asteroid mass determinations: search for further encounter opportunities. Bull. Am. Astron. Soc. 9, 502-503. Hoffman, M. 1989b. Asteroid mass determination: Present situation and perspectives. In Asteroids II (R. P. Binzel, T. Gehrels, and M. S. Matthews, Eds.), pp. 228-239. Univ. Arizona Press, Tucson. Scholl, H., L. D. Schmadel and S. Roser 1987. The mass of the asteroid (10) Hygiea derived from observations of (829) Academia. Astron. Astrophys. 179, 311-316. Schubart, J. and D. L. Matson 1979. Masses and
Measuring and Modelling Crowd Flows - Fusing Stationary and Tracking Data
Treiber, Martin
2016-01-01
The two main data categories of vehicular traffic flow, stationary detector data and floating-car data, are also available for many marathons and other mass-sports events: Loop detectors and other stationary data sources find their counterpart in the RFID tags of the athletes recording the split times at several stations during the race. Additionally, more and more athletes use smart-phone apps generating track data points that are the equivalent of floating-car data. We present a methodology to detect congestions and estimate the location of jam fronts, the delay times, and the spatio-temporal speed and density distribution of the athletes' crowd flow by fusing these two data sources based on a first-order macroscopic model with triangular fundamental diagram. The method can be used in real-time or for analyzing past events. Using synthetic "ground truth" data generated by simulations with the Intelligent-Driver Model, we show that, in a real-time application, the proposed algorithm is robust and effective w...
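A triangular fundamental diagram, as used in the fusion model above, is piecewise linear in density: flow rises at the free speed up to a critical density, then falls linearly to zero at the jam density. A sketch with illustrative parameter values (the paper calibrates such parameters to the event data):

```python
def triangular_flow(density, v_free=2.5, rho_max=5.0, rho_crit=1.5):
    """Flow Q(rho) for a triangular fundamental diagram.
    Free branch: Q = v_free * rho for rho <= rho_crit.
    Congested branch: linear decrease from the capacity
    Q_max = v_free * rho_crit down to 0 at rho_max.
    Units are illustrative, e.g. persons/(m*s) vs persons/m^2."""
    if density <= rho_crit:
        return v_free * density
    q_max = v_free * rho_crit
    return q_max * (rho_max - density) / (rho_max - rho_crit)
```

In a first-order (LWR-type) model, jam fronts move at the slope of the chord connecting the upstream and downstream states on this diagram, which is what lets split-time and track data be fused into front positions and delays.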
Real-time crowd safety and comfort management from CCTV images
Baqui, Muhammad; Löhner, Rainald
2017-05-01
High density pedestrian flows are a common occurrence. Pedestrian safety and comfort in high density flows can present serious challenges to organizers, businesses and safety personnel. Obtaining pedestrian density and velocity directly from Closed Circuit Television (CCTV) would significantly improve real-time crowd management. A study of high density crowd monitoring from video and its real-time application viability is presented. The video data is captured from CCTV. Both cross correlation based and optical flow based approaches are studied. Results are presented in the form of fundamental diagrams, velocity vectors and speed contours of the flow field.
Chenang Beach and its Crowding Capacity: A Malaysian Perspective
Directory of Open Access Journals (Sweden)
Mohamad Diana
2014-01-01
Full Text Available This working paper focuses on enjoyment factors, specifically: the number of beach users, the perceived maximum number of beach users accepted, the perceived maximum number of beach users that affects the tourism experience, and the perceived maximum number of beach users that affects the beach quality. At a deeper extent, the evaluation is categorized by number of visits, visit motivations, and Chenang Island's push and pull factors. Relationships between variables were assessed using a two-phase evaluation framework where, interestingly, only one demographic factor works with all the studied independent variables. It is also learned that the density of an area (the number of people seen) is considered an accepted crowding factor, as opposed to this working paper's scope (experienced crowding). A unique relationship was observed between crowding level, visit satisfaction level and the overall evaluation of Chenang beach quality. This working paper further supports the previous literature on the significance of beach carrying-capacity management, and it is learned that the idea of a crowding standard is interlinked with 'gender', 'time spent' and 'number of boaters'. From the findings, this working paper envisages the preference polar exchange, which should be of interest to tourism-related personnel. It is within this working paper's interest to highlight the pressing need to enhance the image of Chenang Beach. This is to ensure that Chenang Beach, as a field, maintains its importance and popularity.
Efficient 3D movement-based kernel density estimator and application to wildlife ecology
Tracey-PR, Jeff; Sheppard, James K.; Lockwood, Glenn K.; Chourasia, Amit; Tatineni, Mahidhar; Fisher, Robert N.; Sinkovits, Robert S.
2014-01-01
We describe an efficient implementation of a 3D movement-based kernel density estimator for determining animal space use from discrete GPS measurements. This new method provides more accurate results, particularly for species that make large excursions in the vertical dimension. The downside of this approach is that it is much more computationally expensive than simpler, lower-dimensional models. Through a combination of code restructuring, parallelization and performance optimization, we were able to reduce the time to solution by up to a factor of 1,000, thereby greatly improving the applicability of the method.
Directory of Open Access Journals (Sweden)
George Miliaresis
2009-04-01
Full Text Available The U.S. National Elevation Dataset and the NLCD 2001 landcover data were used to test the correlation between SRTM elevation values and the height of evergreen forest vegetation in the Klamath Mountains of California. Vegetation height estimates (SRTM-NED) are valid for only two of the eight geographic directions (N, NE, E, SE, S, SW, W, NW), due to NED and SRTM grid data misregistration. Penetration depths of the SRTM radar were found to correlate linearly with tree percent canopy density.
Cetacean Density Estimation from Novel Acoustic Datasets by Acoustic Propagation Modeling
2013-09-30
whales off Kona, Hawai’i, is based on the works of Zimmer et al. (2008), Marques et al. (2009), and Küsel et al. (2011). The density estimator formula...given by Marques et al. (2009) is applied here for the case of one (k=1) sensor, yielding the following formulation: D̂ = n(1 − ĉ)/(k π w² P̂ T r̂) ...2124 manually labeled false killer whale clicks, calculated in 1 kHz band intervals from 0 to 90 kHz.
Pettersen, Klas H; Hagen, Espen; Einevoll, Gaute T
2008-06-01
This model study investigates the validity of methods used to interpret linear (laminar) multielectrode recordings. In computer experiments, extracellular potentials from a synaptically activated population of about 1,000 pyramidal neurons are calculated using biologically realistic compartmental neuron models combined with electrostatic forward modeling. The somas of the pyramidal neurons are located in a 0.4 mm high and wide columnar cylinder, mimicking a stimulus-evoked layer-5 population in a neocortical column. Current-source density (CSD) analysis of the low-frequency part (<500 Hz; the local field potential, LFP) of the potentials is found to give good estimates of the true underlying CSD. The high-frequency part (>750 Hz) of the potentials (multi-unit activity, MUA) is found to scale approximately as the population firing rate to the power 3/4 and to give excellent estimates of the underlying population firing rate for trial-averaged data. The MUA signal is found to decay much more sharply outside the columnar populations than the LFP.
Bayesian semiparametric power spectral density estimation in gravitational wave data analysis
Edwards, Matthew C; Christensen, Nelson
2015-01-01
The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with non-stationary data by breaking longer data streams into smaller and locally stationary components.
Institute of Scientific and Technical Information of China (English)
孙燕; 李秋菊; 李剑峰
2011-01-01
With a high-density population and its great fluidity, urban public venues are prone to crushing and trampling accidents in emergencies, resulting in heavy casualties and negative social impacts. Building on the FIST model, this paper proposes four parameters to characterize the crowd massing risk in public venues: crowd density (D), personnel characteristics (C), the mutual interaction between persons (I), and the environment (E) of the massing crowd. A corresponding technical analysis is then carried out for each parameter. First, crowd monitoring systems are used to estimate the crowd density on the spot. Second, the mutual interaction between persons is characterized by crowd pressure values obtained through an on-site pressure measurement network. Third, the influence of individual differences on the crowd as a whole is neglected. Fourth, the influence factors that lead to accidents in public venues are combined into a comprehensive disturbance intensity, and a mathematical model is established to characterize its magnitude. On this basis, the DICE model describing crowd massing risk is established. Thresholds for crowd density, crowd pressure and the overall crowd massing risk, together with the corresponding judgment criteria, are also given, moving real-time quantification and management of crowd massing risk toward practical application.
Liu, Huaie; Feng, Guohua; Zeng, Weilin; Li, Xiaomei; Bai, Yao; Deng, Shuang; Ruan, Yonghua; Morris, James; Li, Siman; Yang, Zhaoqing; Cui, Liwang
2016-04-01
The conventional method of estimating parasite densities employs an assumption of 8000 white blood cells (WBCs)/μl. However, due to leucopenia in malaria patients, this number appears to overestimate parasite densities. In this study, we assessed the accuracy of parasite density estimated using this assumed WBC count in eastern Myanmar, where Plasmodium vivax has become increasingly prevalent. From 256 patients with uncomplicated P. vivax malaria, we estimated parasite density and counted WBCs by using an automated blood cell counter. It was found that WBC counts were not significantly different between patients of different gender, axillary temperature, and body mass index levels, whereas they were significantly different between age groups of patients and the time points of measurement. The median parasite densities calculated with the actual WBC counts (1903/μl) and the assumed WBC count of 8000/μl (2570/μl) were significantly different. We demonstrated that using the assumed WBC count of 8000 cells/μl to estimate parasite densities of P. vivax malaria patients in this area would lead to an overestimation. For P. vivax patients aged five years and older, an assumed WBC count of 5500/μl best estimated parasite densities. This study provides more realistic assumed WBC counts for estimating parasite densities in P. vivax patients from low-endemicity areas of Southeast Asia.
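The arithmetic behind these estimates is simple: parasites are counted against a fixed number of WBCs on the smear and scaled to a per-microlitre density by the assumed (or measured) WBC concentration. A minimal sketch (function name hypothetical):

```python
def parasite_density(parasites_counted, wbcs_counted, wbc_per_ul):
    """Parasites per microlitre of blood, scaling the microscopy count by an
    assumed (or measured) white-blood-cell concentration."""
    return parasites_counted * wbc_per_ul / wbcs_counted

# 50 parasites counted against 200 WBCs:
d_assumed  = parasite_density(50, 200, 8000)  # conventional assumption
d_adjusted = parasite_density(50, 200, 5500)  # study's suggested value
```

For 50 parasites per 200 WBCs, the conventional 8,000 WBC/μl assumption yields 2,000 parasites/μl, whereas the 5,500 WBC/μl figure recommended by the study yields 1,375/μl, illustrating the reported overestimation.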
Directory of Open Access Journals (Sweden)
Kaihan Fakhar
Full Text Available OBJECTIVE: We aimed in this investigation to study deep brain stimulation (DBS) battery drain with special attention directed toward patient symptoms prior to and following battery replacement. BACKGROUND: Previously our group developed web-based calculators and smart phone applications to estimate DBS battery life (http://mdc.mbi.ufl.edu/surgery/dbs-battery-estimator). METHODS: A cohort of 320 patients undergoing DBS battery replacement from 2002-2012 were included in an IRB approved study. Statistical analysis was performed using SPSS 20.0 (IBM, Armonk, NY). RESULTS: The mean charge density for treatment of Parkinson's disease was 7.2 µC/cm²/phase (SD = 3.82), for dystonia was 17.5 µC/cm²/phase (SD = 8.53), for essential tremor was 8.3 µC/cm²/phase (SD = 4.85), and for OCD was 18.0 µC/cm²/phase (SD = 4.35). There was a significant relationship between charge density and battery life (r = -.59, p<.001), as well as total power and battery life (r = -.64, p<.001). The UF estimator (r = .67, p<.001) and the Medtronic helpline (r = .74, p<.001) predictions of battery life were significantly positively associated with actual battery life. Battery status indicators on Soletra and Kinetra were poor predictors of battery life. In 38 cases, the symptoms improved following a battery change, suggesting that the neurostimulator was likely responsible for symptom worsening. For these cases, both the UF estimator and the Medtronic helpline were significantly correlated with battery life (r = .65 and r = .70, respectively; both p<.001). CONCLUSIONS: Battery estimations, charge density, total power and clinical symptoms were important factors. The observation of clinical worsening that was rescued following neurostimulator replacement reinforces the notion that changes in clinical symptoms can be associated with battery drain.
Constrained Kalman Filtering Via Density Function Truncation for Turbofan Engine Health Estimation
Simon, Dan; Simon, Donald L.
2006-01-01
Kalman filters are often used to estimate the state variables of a dynamic system. However, in the application of Kalman filters some known signal information is often either ignored or dealt with heuristically. For instance, state variable constraints (which may be based on physical considerations) are often neglected because they do not fit easily into the structure of the Kalman filter. This paper develops an analytic method of incorporating state variable inequality constraints in the Kalman filter. The resultant filter truncates the PDF (probability density function) of the Kalman filter estimate at the known constraints and then computes the constrained filter estimate as the mean of the truncated PDF. The incorporation of state variable constraints increases the computational effort of the filter but significantly improves its estimation accuracy. The improvement is demonstrated via simulation results obtained from a turbofan engine model. The turbofan engine model contains 3 state variables, 11 measurements, and 10 component health parameters. It is also shown that the truncated Kalman filter may be a more accurate way of incorporating inequality constraints than other constrained filters (e.g., the projection approach to constrained filtering).
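The central operation, truncating the Gaussian PDF of the estimate at the known constraints and taking the mean of what remains, can be sketched for a scalar state (a simplified illustration, not the paper's turbofan implementation):

```python
import math

def phi(x):
    """Standard normal PDF."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncated_mean(mu, sigma, lo, hi):
    """Mean of N(mu, sigma^2) truncated to [lo, hi]: the constrained estimate
    obtained by cutting the filter's PDF at the known bounds."""
    a = (lo - mu) / sigma
    b = (hi - mu) / sigma
    return mu + sigma * (phi(a) - phi(b)) / (Phi(b) - Phi(a))
```

For example, an unconstrained estimate N(0, 1) with the physical constraint x ≥ 0 moves to the mean of the half-normal, about 0.798, rather than being naively clipped to 0.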
Budka, Marcin; Gabrys, Bogdan
2013-01-01
Estimation of the generalization ability of a classification or regression model is an important issue, as it indicates the expected performance on previously unseen data and is also used for model selection. Currently used generalization error estimation procedures, such as cross-validation (CV) or bootstrap, are stochastic and, thus, require multiple repetitions in order to produce reliable results, which can be computationally expensive, if not prohibitive. The correntropy-inspired density-preserving sampling (DPS) procedure proposed in this paper eliminates the need for repeating the error estimation procedure by dividing the available data into subsets that are guaranteed to be representative of the input dataset. This allows the production of low-variance error estimates with an accuracy comparable to 10 times repeated CV at a fraction of the computations required by CV. This method can also be used for model ranking and selection. This paper derives the DPS procedure and investigates its usability and performance using a set of public benchmark datasets and standard classifiers.
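A deliberately simplified sketch of the density-preserving idea (not the published correntropy-based DPS algorithm): rank points by a local-density proxy and deal them round-robin into folds, so each fold spans the full density range of the input data.

```python
def density_preserving_folds(points, k=2):
    """Split 1-D data into k subsets that each span the density range:
    rank points by a local-density proxy (nearest-neighbour distance),
    then deal them round-robin into folds."""
    def nn_dist(i):
        return min(abs(points[i] - points[j])
                   for j in range(len(points)) if j != i)
    order = sorted(range(len(points)), key=nn_dist)
    folds = [[] for _ in range(k)]
    for rank, idx in enumerate(order):
        folds[rank % k].append(points[idx])
    return folds
```

Because every fold receives points from dense and sparse regions alike, a single train/test split behaves more like an average over many random splits, which is the property DPS exploits to avoid repeated cross-validation.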
Robinson, K. L.; Luo, J. Y.; Guigand, C.; Sponaugle, S.; Cowen, R. K.
2016-02-01
'Big biological data' sets are becoming common in oceanography with the advent of sampling technologies that can generate high-frequency observations for multiple data streams simultaneously. Identifying and implementing robust and efficient approaches to manage and analyze these big biological data (BBD) sets has become a primary challenge facing many biological oceanographers and marine ecologists alike. Using a large plankton imagery dataset generated by the "Deep Focus Plankton Imager-2" (formerly the ISIIS-2) system as an example, we present two 'crowd-sourcing' approaches applied to the problem of efficiently classifying tens of millions of images of individual plankters. The first approach uses 'crowd sourcing' in its typical format by asking members of the general public to identify groups of plankton via a web interface hosted by Zooniverse. The second approach engaged members of the data science community via a partnership with Kaggle and Booz Allen Hamilton, two data science industry leaders. We discuss how academic-industry partnerships were established, the questions we sought to answer via crowd-sourcing, as well as the successes, the pitfalls, and the surprising outcomes that were generated by each approach.
The Wegner Estimate and the Integrated Density of States for some Random Operators
Indian Academy of Sciences (India)
J M Combes; P D Hislop; Frédéric Klopp; Shu Nakamura
2002-02-01
The integrated density of states (IDS) for random operators is an important function describing many physical characteristics of a random system. Properties of the IDS are derived from the Wegner estimate that describes the influence of finite-volume perturbations on a background system. In this paper, we present a simple proof of the Wegner estimate applicable to a wide variety of random perturbations of deterministic background operators. The proof yields the correct volume dependence of the upper bound. This implies the local Hölder continuity of the integrated density of states at energies in the unperturbed spectral gap. The proof depends on the $L^p$-theory of the spectral shift function (SSF), for $p \geq 1$, applicable to pairs of self-adjoint operators whose difference is in the trace ideal $\mathcal{I}_p$, for $0 < p \leq 1$. We present this and other results on the SSF due to other authors. Under an additional condition on the single-site potential, local Hölder continuity is proved at all energies. Finally, we present extensions of this work to random potentials with non-sign-definite single-site potentials.
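In a common form (a sketch; hypotheses and constants vary with the model), the Wegner estimate bounds the expected number of eigenvalues of the finite-volume restriction $H_\omega^\Lambda$ in an energy interval $I$:

$$\mathbb{E}\big[\operatorname{Tr}\,\chi_I(H_\omega^\Lambda)\big] \leq C_W\,|I|^\alpha\,|\Lambda|,$$

so that dividing by $|\Lambda|$ and taking the infinite-volume limit gives $N(E+\epsilon) - N(E-\epsilon) \leq C_W (2\epsilon)^\alpha$, i.e. local Hölder continuity of the IDS $N(E)$ with exponent $\alpha$; the correct linear dependence on $|\Lambda|$ is exactly what makes this limit finite, and $\alpha = 1$ corresponds to local Lipschitz continuity.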
Power spectral density of velocity fluctuations estimated from phase Doppler data
Directory of Open Access Journals (Sweden)
Jicha Miroslav
2012-04-01
Full Text Available Laser Doppler Anemometry (LDA) and its modifications, such as Phase Doppler Particle Anemometry (P/DPA), are point-wise methods for optical nonintrusive measurement of particle velocity with a high data rate. Conversion of the LDA velocity data from the temporal to the frequency domain – calculation of the power spectral density (PSD) of velocity fluctuations – is a nontrivial task due to nonequidistant data sampling in time. We briefly discuss possibilities for the PSD estimation and specify limitations caused by seeding density and other factors of the flow and LDA setup. Selected results of LDA measurements are compared with corresponding Hot Wire Anemometry (HWA) data in the frequency domain. The slot correlation (SC) method implemented in the software program Kern by Nobach (2006) is used for the PSD estimation. The influence of several input parameters on the resulting PSDs is described, and the optimum setup of the software for our data of particle-laden air flow in a realistic human airway model is documented. The typical character of the flow is described using PSD plots of velocity fluctuations, with comments on specific properties of the flow. Some recommendations for improving future experiments to acquire better PSD results are given.
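The slot-correlation idea can be sketched as follows (a toy version, not Nobach's Kern implementation): products of sample pairs are sorted into lag 'slots' by their time separation, averaged into an autocorrelation estimate, and cosine-transformed into a PSD, which is what makes the method applicable to nonequidistant samples.

```python
import math

def slot_correlation(times, values, slot_width, n_slots):
    """Average products of sample pairs into lag slots, giving an
    autocorrelation estimate for nonequidistantly sampled data."""
    sums = [0.0] * n_slots
    counts = [0] * n_slots
    n = len(times)
    for i in range(n):
        for j in range(n):
            k = int(abs(times[j] - times[i]) / slot_width)
            if k < n_slots:
                sums[k] += values[i] * values[j]
                counts[k] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

def psd_from_acf(acf, slot_width, freqs):
    """Cosine-transform the slotted autocorrelation into a one-sided PSD."""
    return [2.0 * slot_width * sum(r * math.cos(2.0 * math.pi * f * k * slot_width)
                                   for k, r in enumerate(acf))
            for f in freqs]
```

Practical implementations add refinements (local normalization, variable slot windows) that this sketch omits.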
Error estimates for density-functional theory predictions of surface energy and work function
De Waele, Sam; Lejaeghere, Kurt; Sluydts, Michael; Cottenier, Stefaan
2016-12-01
Density-functional theory (DFT) predictions of materials properties are becoming ever more widespread. With increased use comes the demand for estimates of the accuracy of DFT results. In view of the importance of reliable surface properties, this work calculates surface energies and work functions for a large and diverse test set of crystalline solids. They are compared to experimental values by performing a linear regression, which results in a measure of the predictable and material-specific error of the theoretical result. Two of the most prevalent functionals, the local density approximation (LDA) and the Perdew-Burke-Ernzerhof parametrization of the generalized gradient approximation (PBE-GGA), are evaluated and compared. Both LDA and PBE-GGA are found to yield accurate work functions with error bars below 0.3 eV, rivaling the experimental precision. LDA also provides satisfactory estimates for the surface energy with error bars smaller than 10%, but PBE-GGA significantly underestimates the surface energy for materials with a large correlation energy.
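The error-estimation recipe, regressing experimental values on DFT predictions, absorbing the predictable part of the error into slope and intercept, and reading the residual scatter as the remaining error bar, reduces to ordinary least squares. A minimal stdlib sketch (names hypothetical):

```python
def linear_fit(x, y):
    """Ordinary least squares y ~ a*x + b, returning slope, intercept and the
    residual standard deviation (the error bar left after the predictable
    part of the theory-experiment discrepancy is regressed away)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    residuals = [yi - (a * xi + b) for xi, yi in zip(x, y)]
    s = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    return a, b, s
```

Here `x` would hold DFT-predicted values and `y` the experimental references; `s` then plays the role of the quoted error bars (e.g. the 0.3 eV figure for work functions).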
Density and Biomass Estimates by Removal for an Amazonian Crocodilian, Paleosuchus palpebrosus.
Directory of Open Access Journals (Sweden)
Zilca Campos
Full Text Available Direct counts of crocodilians are rarely feasible and it is difficult to meet the assumptions of mark-recapture methods for most species in most habitats. Catch-out experiments are also usually not logistically or morally justifiable because it would be necessary to destroy the habitat in order to be confident that most individuals had been captured. We took advantage of the draining and filling of a large area of flooded forest during the building of the Santo Antônio dam on the Madeira River to obtain accurate estimates of the density and biomass of Paleosuchus palpebrosus. The density, 28.4 non-hatchling individuals per km², is one of the highest reported for any crocodilian, except for species that are temporarily concentrated in small areas during dry-season drought. The biomass estimate of 63.15 kg·km⁻² is higher than that for most or even all mammalian carnivores in tropical forest. P. palpebrosus may be one of the world's most abundant crocodilians.
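The reported figures are simple ratios of counts and masses to the drained area; the sketch below uses a hypothetical 10 km² area chosen only to be consistent with the published densities (which together imply a mean non-hatchling mass of roughly 2.2 kg):

```python
def density_per_km2(n_individuals, area_km2):
    """Individuals per square kilometre from a catch-out count."""
    return n_individuals / area_km2

def biomass_per_km2(total_mass_kg, area_km2):
    """Total captured mass per square kilometre."""
    return total_mass_kg / area_km2

# Hypothetical area and counts consistent with the reported figures:
density = density_per_km2(284, 10.0)      # 28.4 individuals/km^2
biomass = biomass_per_km2(631.5, 10.0)    # 63.15 kg/km^2
mean_mass = biomass / density             # ~2.2 kg per non-hatchling
```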
Experimental study on small group behavior and crowd dynamics in a tall office building evacuation
Ma, Yaping; Li, Lihua; Zhang, Hui; Chen, Tao
2017-05-01
It is well known that a large percentage of occupants in a building are evacuated together with their friends, families, and officemates, especially in China. Small-group behaviors are therefore critical for crowd movement. This paper studies crowd dynamics under different social relations and the impacts of small groups on crowd dynamics in emergency evacuation. Three experiments were conducted in an 11-storey office building. In the first two experiments, all participants were classmates and knew each other well; they evacuated as individuals or in pairs. In the third experiment, social relations among the participants were complex: participants consisted of 8 families, 6 couples and several individuals. Space-time features, speed characteristics and density-speed relations for each experiment are analyzed and compared. The results show that small-group behaviors can have positive impacts on crowd dynamics when evacuees know each other and are cooperative, a conclusion also confirmed by four verification experiments. In the third experiment, the speeds of evacuees were lowest: small groups formed automatically in the presence of intimate social relations, slowed down the average speed of the crowd, and disturbed the crowd flow. In this case small groups had negative impacts on the movement of the crowd, because the evacuees did not know each other and competed with one another. Characteristics of different types of small groups are also investigated. The experimental data can provide foundational parameters for evacuation model development and are helpful for building designers.
Directory of Open Access Journals (Sweden)
Lingling Jiang
2016-01-01
Full Text Available A multiband and a single-band semianalytical model were developed to predict algae cell density distribution. The models were based on cell density (N)-dependent parameterizations of the spectral backscattering coefficients, bb(λ), obtained from in situ measurements. There was a strong relationship between bb(λ) and N, with a minimum regression coefficient of 0.97 at 488 nm and a maximum value of 0.98 at other bands. The cell density calculated by the multiband inversion model was similar to the field measurements of the coastal waters (the average relative error was only 8.9%), but it could not accurately discern the red tide from mixed pixels, and this led to overestimation of the area affected by the red tide. While the single-band inversion model is less precise than the former model in high-chlorophyll water, it could eliminate the impact of the suspended sediments and make more accurate estimates of the red tide area. We concluded that the two models both have advantages and disadvantages; these methods lay the foundation for developing a remote sensing forecasting system for red tides.
Directory of Open Access Journals (Sweden)
Shanshan Yang
Full Text Available Detection of dysphonia is useful for monitoring the progression of phonatory impairment for patients with Parkinson's disease (PD), and also helps assess the disease severity. This paper describes statistical pattern analysis methods to study different vocal measurements of sustained phonations. The feature dimension reduction procedure was implemented by using the sequential forward selection (SFS) and kernel principal component analysis (KPCA) methods. Four selected vocal measures were projected by the KPCA onto the bivariate feature space, in which the class-conditional feature densities can be approximated with the nonparametric kernel density estimation technique. In the vocal pattern classification experiments, Fisher's linear discriminant analysis (FLDA) was applied to perform the linear classification of voice records for healthy control subjects and PD patients, and the maximum a posteriori (MAP) decision rule and support vector machine (SVM) with radial basis function kernels were employed for the nonlinear classification tasks. Based on the KPCA-mapped feature densities, the MAP classifier successfully distinguished 91.8% of the voice records, with a sensitivity rate of 0.986, a specificity rate of 0.708, and an area value of 0.94 under the receiver operating characteristic (ROC) curve. The diagnostic performance provided by the MAP classifier was superior to those of the FLDA and SVM classifiers. In addition, the classification results indicated that gender is insensitive to dysphonia detection, and the sustained phonations of PD patients with minimal functional disability are more difficult to identify correctly.
Fiora, Alessandro; Cescatti, Alessandro
2006-09-01
Daily and seasonal patterns in radial distribution of sap flux density were monitored in six trees differing in social position in a mixed coniferous stand dominated by silver fir (Abies alba Miller) and Norway spruce (Picea abies (L.) Karst) in the Alps of northeastern Italy. Radial distribution of sap flux was measured with arrays of 1-cm-long Granier probes. The radial profiles were either Gaussian or decreased monotonically toward the tree center, and seemed to be related to social position and crown distribution of the trees. The ratio between sap flux estimated with the most external sensor and the mean flux, weighted with the corresponding annulus areas, was used as a correction factor (CF) to express diurnal and seasonal radial variation in sap flow. During sunny days, the diurnal radial profile of sap flux changed with time and accumulated photosynthetic active radiation (PAR), with an increasing contribution of sap flux in the inner sapwood during the day. Seasonally, the contribution of sap flux in the inner xylem increased with daily cumulative PAR and the variation of CF was proportional to the tree diameter, ranging from 29% for suppressed trees up to 300% for dominant trees. Two models were developed, relating CF with PAR and tree diameter at breast height (DBH), to correct daily and seasonal estimates of whole-tree and stand sap flow obtained by assuming uniform sap flux density over the sapwood. If the variability in the radial profile of sap flux density was not accounted for, total stand transpiration would be overestimated by 32% during sunny days and 40% for the entire season.
Luo, Shezhou; Chen, Jing M; Wang, Cheng; Xi, Xiaohuan; Zeng, Hongcheng; Peng, Dailiang; Li, Dong
2016-05-30
Vegetation leaf area index (LAI), height, and aboveground biomass are key biophysical parameters. Corn is an important and globally distributed crop, and reliable estimations of these parameters are essential for corn yield forecasting, health monitoring and ecosystem modeling. Light Detection and Ranging (LiDAR) is considered an effective technology for estimating vegetation biophysical parameters. However, the estimation accuracies of these parameters are affected by multiple factors. In this study, we first estimated corn LAI, height and biomass (R2 = 0.80, 0.874 and 0.838, respectively) using the original LiDAR data (7.32 points/m2), and the results showed that LiDAR data could accurately estimate these biophysical parameters. Second, comprehensive research was conducted on the effects of LiDAR point density, sampling size and height threshold on the estimation accuracy of LAI, height and biomass. Our findings indicated that LiDAR point density had an important effect on the estimation accuracy for vegetation biophysical parameters, however, high point density did not always produce highly accurate estimates, and reduced point density could deliver reasonable estimation results. Furthermore, the results showed that sampling size and height threshold were additional key factors that affect the estimation accuracy of biophysical parameters. Therefore, the optimal sampling size and the height threshold should be determined to improve the estimation accuracy of biophysical parameters. Our results also implied that a higher LiDAR point density, larger sampling size and height threshold were required to obtain accurate corn LAI estimation when compared with height and biomass estimations. In general, our results provide valuable guidance for LiDAR data acquisition and estimation of vegetation biophysical parameters using LiDAR data.
Release of crowding by pattern completion.
Manassi, Mauro; Hermens, Frouke; Francis, Gregory; Herzog, Michael H
2015-01-01
In crowding, target perception deteriorates in the presence of flanking elements. Crowding is classically explained by low-level mechanisms such as pooling or feature substitution. However, we have previously shown that perceptual grouping between the target and flankers, rather than low-level mechanisms, determines crowding. There are many grouping cues that can determine crowding, such as low- and high-level feature similarity, low- and high-level pattern regularity, and good Gestalt. Here we show that pattern completion, another grouping cue that is important for crowding in foveal vision, is also important in peripheral vision. We also describe computer simulations that show how pattern completion, and crowding in general, can be partly explained by recurrent processing.
Effects of macromolecular crowding on genetic networks.
Morelli, Marco J; Allen, Rosalind J; Wolde, Pieter Rein ten
2011-12-21
The intracellular environment is crowded with proteins, DNA, and other macromolecules. Under physiological conditions, macromolecular crowding can alter both molecular diffusion and the equilibria of bimolecular reactions and therefore is likely to have a significant effect on the function of biochemical networks. We propose a simple way to model the effects of macromolecular crowding on biochemical networks via an appropriate scaling of bimolecular association and dissociation rates. We use this approach, in combination with kinetic Monte Carlo simulations, to analyze the effects of crowding on a constitutively expressed gene, a repressed gene, and a model for the bacteriophage λ genetic switch, in the presence and absence of nonspecific binding of transcription factors to genomic DNA. Our results show that the effects of crowding are mainly caused by the shift of association-dissociation equilibria rather than the slowing down of protein diffusion, and that macromolecular crowding can have relevant and counterintuitive effects on biochemical network performance.
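The rate-scaling idea can be illustrated on a single bimolecular binding equilibrium (toy numbers, not the study's parameters): scaling the association rate up and the dissociation rate down raises K = kon/koff, which shifts the equilibrium toward the bound state, the dominant effect the abstract describes.

```python
def bound_fraction(k_on, k_off, free_ligand):
    """Equilibrium occupancy [AB]/([B] + [AB]) for A + B <-> AB,
    with the free-ligand concentration held fixed."""
    K = k_on / k_off
    return K * free_ligand / (1.0 + K * free_ligand)

# Crowding modelled as a scaling of the rates: here it doubles association
# and halves dissociation, so K rises 4-fold (illustrative values only).
dilute  = bound_fraction(1.0, 1.0, 1.0)
crowded = bound_fraction(2.0, 0.5, 1.0)
```

With these toy values the occupancy moves from 0.5 to 0.8 purely through the equilibrium shift, with no change in diffusion.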
Höing, Andrea; Quinten, Marcel C; Indrawati, Yohana Maria; Cheyne, Susan M; Waltert, Matthias
2013-02-01
Estimating population densities of key species is crucial for many conservation programs. Density estimates provide baseline data and enable monitoring of population size. Several different survey methods are available, and the choice of method depends on the species and study aims. Few studies have compared the accuracy and efficiency of different survey methods for large mammals, particularly for primates. Here we compare estimates of density and abundance of Kloss' gibbons (Hylobates klossii) using two of the most common survey methods: line transect distance sampling and triangulation. Line transect surveys (survey effort: 155.5 km) produced a total of 101 auditory and visual encounters and a density estimate of 5.5 gibbon clusters (groups or subgroups of primate social units)/km². Triangulation conducted from 12 listening posts during the same period revealed a similar density estimate of 5.0 clusters/km². Coefficients of variation of cluster density estimates were slightly higher from triangulation (0.24) than from line transects (0.17), resulting in somewhat lower precision for detecting changes in cluster density; depending on the study aims, the triangulation method also may be appropriate.
Lemke, Dorothea; Mattauch, Volkmar; Heidinger, Oliver; Pebesma, Edzer; Hense, Hans-Werner
2015-03-31
Monitoring spatial disease risk (e.g. identifying risk areas) is of great relevance in public health research, especially in cancer epidemiology. A common strategy uses case-control studies and estimates a spatial relative risk function (sRRF) via kernel density estimation (KDE). This study was set up to evaluate the sRRF estimation methods, comparing fixed with adaptive bandwidth-based KDE, and how they were able to detect 'risk areas' with case data from a population-based cancer registry. The sRRF were estimated within a defined area, using locational information on incident cancer cases and on a spatial sample of controls, drawn from a high-resolution population grid recognized as underestimating the resident population in urban centers. The spatial extensions of these areas with underestimated resident population were quantified with population reference data and used in this study as 'true risk areas'. Sensitivity and specificity analyses were conducted by spatial overlay of the 'true risk areas' and the significant (α=.05) p-contour lines obtained from the sRRF. We observed that the fixed bandwidth-based sRRF was distinguished by a conservative behavior in identifying these urban 'risk areas', that is, a reduced sensitivity but increased specificity due to oversmoothing as compared to the adaptive risk estimator. In contrast, the latter appeared more competitive through variance stabilization, resulting in a higher sensitivity, while the specificity was equal as compared to the fixed risk estimator. Halving the originally determined bandwidths led to a simultaneous improvement of sensitivity and specificity of the adaptive sRRF, while the specificity was reduced for the fixed estimator. The fixed risk estimator contrasts with an oversmoothing tendency in urban areas, while overestimating the risk in rural areas. The use of an adaptive bandwidth regime attenuated this pattern, but led in general to a higher false positive rate, because, in our study design
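The sRRF itself is a ratio of two kernel density estimates, cases over controls, evaluated across the study region. A fixed-bandwidth 2-D sketch (toy data; real analyses add edge correction, bandwidth selection and the significance contours discussed above):

```python
import math

def kde2(points, x, y, h):
    """Fixed-bandwidth 2-D Gaussian kernel density estimate at (x, y)."""
    s = sum(math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2.0 * h * h))
            for px, py in points)
    return s / (len(points) * 2.0 * math.pi * h * h)

def relative_risk(cases, controls, x, y, h):
    """Spatial relative risk: ratio of case density to control density."""
    return kde2(cases, x, y, h) / kde2(controls, x, y, h)
```

An adaptive-bandwidth variant would replace the constant `h` with a per-point bandwidth that shrinks where data are dense, which is the distinction the study evaluates.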
Cavuoti, S.; Amaro, V.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.
2017-02-01
A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z). A plethora of methods has been developed, based either on template model fitting or on empirical explorations of the photometric parameter space. Machine-learning-based techniques are not explicitly dependent on the physical priors and are able to produce accurate photo-z estimations within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z probability density function (PDF), due to the fact that the analytical relation mapping the photometric parameters on to the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use of the MLPQNA neural network (Multi Layer Perceptron with Quasi Newton learning rule), with the possibility to easily replace the specific machine-learning model chosen to predict photo-z. We present a summary of results on SDSS-DR9 galaxy data, used also to perform a direct comparison with PDFs obtained by the LE PHARE spectral energy distribution template fitting. We show that METAPHOR is capable of estimating the precision and reliability of photometric redshifts obtained with three different self-adaptive techniques, i.e. MLPQNA, Random Forest and the standard K-Nearest Neighbors models.
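In the spirit of the perturbation idea behind such empirical PDF methods (the sketch below is an assumption, not the published METAPHOR workflow; `photoz_pdf` and its parameters are hypothetical), an empirical photo-z PDF can be built by jittering the input photometry, re-running a trained point-prediction model, and histogramming the outputs:

```python
import random

def photoz_pdf(predict, mags, n_draws=500, noise=0.05, bin_width=0.02, seed=1):
    """Empirical photo-z PDF: perturb the magnitudes with Gaussian noise,
    collect the regressor's predictions and normalize the histogram."""
    rng = random.Random(seed)
    hist = {}
    for _ in range(n_draws):
        jittered = [m + rng.gauss(0.0, noise) for m in mags]
        b = int(predict(jittered) / bin_width)
        hist[b] = hist.get(b, 0) + 1
    return {b * bin_width: c / n_draws for b, c in sorted(hist.items())}
```

Any trained regressor can stand in for `predict`; the normalized histogram then reflects how photometric noise propagates into redshift uncertainty.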
Bayes and empirical Bayes estimators of abundance and density from spatial capture-recapture data.
Dorazio, Robert M
2013-01-01
In capture-recapture and mark-resight surveys, movements of individuals both within and between sampling periods can alter the susceptibility of individuals to detection over the region of sampling. In these circumstances spatially explicit capture-recapture (SECR) models, which incorporate the observed locations of individuals, allow population density and abundance to be estimated while accounting for differences in detectability of individuals. In this paper I propose two Bayesian SECR models, one for the analysis of recaptures observed in trapping arrays and another for the analysis of recaptures observed in area searches. In formulating these models I used distinct submodels to specify the distribution of individual home-range centers and the observable recaptures associated with these individuals. This separation of ecological and observational processes allowed me to derive a formal connection between Bayes and empirical Bayes estimators of population abundance that has not been established previously. I showed that this connection applies to every Poisson point-process model of SECR data and provides theoretical support for a previously proposed estimator of abundance based on recaptures in trapping arrays. To illustrate results of both classical and Bayesian methods of analysis, I compared Bayes and empirical Bayes estimates of abundance and density using recaptures from simulated and real populations of animals. Real populations included two iconic datasets: recaptures of tigers detected in camera-trap surveys and recaptures of lizards detected in area-search surveys. In the datasets I analyzed, classical and Bayesian methods provided similar - and often identical - inferences, which is not surprising given the sample sizes and the noninformative priors used in the analyses.
Seismic Hazard Analysis Using the Adaptive Kernel Density Estimation Technique for Chennai City
Ramanna, C. K.; Dodagoudar, G. R.
2012-01-01
The conventional method of probabilistic seismic hazard analysis (PSHA), the Cornell-McGuire approach, requires identification of homogeneous source zones as the first step. This criterion brings along many issues and, hence, several alternative methods of hazard estimation have come up in the last few years. Alternatives such as zoneless (zone-free) methods and modelling of the earth's crust using numerical methods with finite element analysis have been proposed. Delineating a homogeneous source zone in regions of distributed and/or diffused seismicity is a rather difficult task. In this study, the zone-free method using the adaptive kernel technique for hazard estimation is explored for regions having distributed and diffused seismicity. Chennai city lies in such a region of low to moderate seismicity and has therefore been used as a case study. The adaptive kernel technique is statistically superior to the fixed kernel technique primarily because the bandwidth of the kernel is varied spatially depending on the clustering or sparseness of the epicentres. Although the fixed kernel technique has proven to work well in general density estimation cases, it fails to perform in the case of multimodal and long-tail distributions. In such situations, the adaptive kernel technique serves the purpose and is more relevant in earthquake engineering, as the activity rate probability density surface is multimodal in nature. The peak ground acceleration (PGA) obtained from all three approaches (i.e., the Cornell-McGuire approach, fixed kernel and adaptive kernel techniques) for 10% probability of exceedance in 50 years is around 0.087 g. The uniform hazard spectra (UHS) are also provided for different structural periods.
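The "10% probability of exceedance in 50 years" criterion maps to a return period under the usual Poisson hazard assumption; a short worked example (our code, not the authors'):

```python
import math

def return_period(p_exceed, years):
    """Under a Poisson hazard model, P(exceed in t years) = 1 - exp(-t/T),
    so the mean return period is T = -t / ln(1 - p)."""
    return -years / math.log(1.0 - p_exceed)

T = return_period(0.10, 50)    # the 10%-in-50-years design criterion
annual_rate = 1.0 / T          # mean annual rate of exceedance
```

This gives the familiar ~475-year return period to which the quoted PGA of 0.087 g corresponds.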
How does spatial study design influence density estimates from spatial capture-recapture models?
Directory of Open Access Journals (Sweden)
Rahel Sollmann
When estimating population density from data collected on non-invasive detector arrays, recently developed spatial capture-recapture (SCR) models present an advance over non-spatial models by accounting for individual movement. While these models should be more robust to changes in trapping designs, they have not been well tested. Here we investigate how the spatial arrangement and size of the trapping array influence parameter estimates for SCR models. We analysed black bear data collected with 123 hair snares with an SCR model accounting for differences in detection and movement between sexes and across the trapping occasions. To see how the size of the trap array and trap dispersion influence parameter estimates, we repeated the analysis for data from subsets of traps: 50% chosen at random, 50% in the centre of the array and 20% in the south of the array. Additionally, we simulated and analysed data under a suite of trap designs and home range sizes. In the black bear study, we found that results were similar across trap arrays, except when only 20% of the array was used. Black bear density was approximately 10 individuals per 100 km². Our simulation study showed that SCR models performed well as long as the extent of the trap array was similar to or larger than the extent of individual movement during the study period, and movement was at least half the distance between traps. SCR models performed well across a range of spatial trap setups and animal movements. Contrary to non-spatial capture-recapture models, they do not require the trapping grid to cover an area several times the average home range of the studied species. This renders SCR models more appropriate for the study of wide-ranging mammals and more flexible for designing studies targeting multiple species.
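A toy simulation of the SCR data-generating process studied here, assuming a half-normal detection function; all constants (array geometry, movement scale sigma, baseline detection g0) are invented for illustration and are not the study's values:

```python
import numpy as np

rng = np.random.default_rng(2)
# state space 20 km x 20 km, 50 activity centres
centres = rng.uniform(0, 20, size=(50, 2))
sigma, g0 = 1.5, 0.3        # half-normal detection: p = g0 * exp(-d^2 / (2 sigma^2))

# trap grid: 8x8 traps spaced 2 km, covering the central 14 km x 14 km
tx = np.linspace(3, 17, 8)
traps = np.array([(x, y) for x in tx for y in tx])

occasions = 5
d2 = ((centres[:, None, :] - traps[None, :, :]) ** 2).sum(-1)   # (50, 64)
p = g0 * np.exp(-d2 / (2 * sigma**2))
captures = rng.random((occasions, 50, 64)) < p                  # detection histories

n_detected = captures.any(axis=(0, 2)).sum()    # individuals seen at least once
```

Shrinking the grid extent below sigma, or spacing traps much wider than sigma, degrades the information in `captures`, which is the design effect the paper quantifies.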
Shimizu, Noritaka; Futamura, Yasunori; Sakurai, Tetsuya; Mizusaki, Takahiro; Otsuka, Takaharu
2016-01-01
We introduce a novel method to obtain level densities in large-scale shell-model calculations. Our method is a stochastic estimation of eigenvalue count based on a shifted Krylov-subspace method, which enables us to obtain level densities of huge Hamiltonian matrices. This framework leads to a successful description of both low-lying spectroscopy and the experimentally observed equilibration of $J^\pi = 2^+$ and $2^-$ states in $^{58}$Ni in a unified manner.
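The core idea, estimating an eigenvalue count as the stochastic trace of a spectral projector, can be sketched as follows. This is a hedged toy: for clarity the projector is applied via dense diagonalisation of a small random matrix, whereas the paper's method uses shifted Krylov-subspace solvers precisely to avoid forming or diagonalising the huge Hamiltonian:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
A = rng.normal(size=(n, n))
A = (A + A.T) / 2                       # small dense symmetric "Hamiltonian"

def stochastic_count(A, lo, hi, probes=60):
    """Hutchinson estimate of tr(P) = #eigenvalues in [lo, hi], where P is
    the spectral projector onto that interval. Here P z is computed by full
    diagonalisation; large-scale codes instead evaluate P z with filtered /
    shifted-Krylov linear solves, never forming P explicitly."""
    w, V = np.linalg.eigh(A)
    mask = (w >= lo) & (w <= hi)
    est = 0.0
    for _ in range(probes):
        z = rng.choice([-1.0, 1.0], size=A.shape[0])   # Rademacher probe
        est += z @ (V @ (mask * (V.T @ z)))
    return est / probes

w = np.linalg.eigvalsh(A)
exact = int(((w >= -5) & (w <= 5)).sum())
approx = stochastic_count(A, -5, 5)
```

Binning such counts over energy intervals yields the level density without computing individual eigenvalues.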
International perspectives on emergency department crowding.
Pines, Jesse M; Hilton, Joshua A; Weber, Ellen J; Alkemade, Annechien J; Al Shabanah, Hasan; Anderson, Philip D; Bernhard, Michael; Bertini, Alessio; Gries, André; Ferrandiz, Santiago; Kumar, Vijaya Arun; Harjola, Veli-Pekka; Hogan, Barbara; Madsen, Bo; Mason, Suzanne; Ohlén, Gunnar; Rainer, Timothy; Rathlev, Niels; Revue, Eric; Richardson, Drew; Sattarian, Mehdi; Schull, Michael J
2011-12-01
The maturation of emergency medicine (EM) as a specialty has coincided with dramatic increases in emergency department (ED) visit rates, both in the United States and around the world. ED crowding has become a public health problem where periodic supply and demand mismatches in ED and hospital resources cause long waiting times and delays in critical treatments. ED crowding has been associated with several negative clinical outcomes, including higher complication rates and mortality. This article describes emergency care systems and the extent of crowding across 15 countries outside of the United States: Australia, Canada, Denmark, Finland, France, Germany, Hong Kong, India, Iran, Italy, The Netherlands, Saudi Arabia, Catalonia (Spain), Sweden, and the United Kingdom. The authors are local emergency care leaders with knowledge of emergency care in their particular countries. Where available, data are provided about visit patterns in each country; however, for many of these countries, no national data are available on ED visits rates or crowding. For most of the countries included, there is both objective evidence of increases in ED visit rates and ED crowding and also subjective assessments of trends toward higher crowding in the ED. ED crowding appears to be worsening in many countries despite the presence of universal health coverage. Scandinavian countries with robust systems to manage acute care outside the ED do not report crowding is a major problem. The main cause for crowding identified by many authors is the boarding of admitted patients, similar to the United States. Many hospitals in these countries have implemented operational interventions to mitigate crowding in the ED, and some countries have imposed strict limits on ED length of stay (LOS), while others have no clear plan to mitigate crowding. An understanding of the causes and potential solutions implemented in these countries can provide a lens into how to mitigate ED crowding in the United States
GPU Generation of Large Varied Animated Crowds
Isaac Rudomin; Benjamín Hernández; Oriam de Gyves; Leonel Toledo; Ivan Rivalcoba; Sergio Ruiz
2013-01-01
We discuss several steps in the process of simulating and visualizing large and varied crowds in real time for consumer-level computers and graphics cards (GPUs). Animating varied crowds using a diversity of models and animations (assets) is complex and costly. One has to use models that are expensive if bought, take a long time to model, and consume too much memory and computing resources. We discuss methods for simulating, generating, animating and rendering crowds of varied aspect and a d...
Akune, Tadahiro; Sakamoto, Nobuyoshi
2009-03-01
In a multifilamentary wire, proximity currents between filaments show a close resemblance to the inter-grain current in a high-Tc superconductor. The critical current densities of the proximity-induced superconducting matrix Jcm can be estimated from the measured twist-pitch dependence of magnetization and have been shown to follow the well-known scaling law of the pinning strength. The grained Bean model is applied to the multifilamentary wire to obtain Jcm, where the filaments are immersed in the proximity-induced superconducting matrix. Differences in the superconducting characteristics of the filament and the matrix, together with the filament content factor, produce a variety of deformations of the AC susceptibility curves. The computed AC susceptibility curves of multifilamentary wires using the grained Bean model compare favorably with the experimental results. The values of Jcm estimated from the susceptibilities using the grained Bean model are comparable to those estimated from the measured twist-pitch dependence of magnetization. The applicability of the grained Bean model to the multifilamentary wire is discussed in detail.
Fu, Libi; Song, Weiguo; Lo, Siuming
2017-01-01
Emergencies at mass events involve a variety of factors and processes. An important factor is the transmission of information on danger, which influences nonlinear crowd dynamics during crowd dispersion. Because this process involves considerable uncertainty, a method is needed to investigate this influence. In this paper, a novel fuzzy-theory-based method is presented to study crowd dynamics under the influence of information transmission. Fuzzy functions and rules are designed for the ambiguous description of human states. Reasonable inference is employed to decide the output values of decision making, such as pedestrian movement speeds and directions. Through simulation of four-way pedestrian situations, good crowd dispersion is achieved. Simulation results under different conditions demonstrate that information transmission cannot always induce successful crowd dispersion in all situations: this depends on whether decision strategies in response to information on danger are unified and effective, especially in dense crowds. Results also suggest that crowd dispersion is helped by an increase in drift strength at low density, and by an increase in the percentage of pedestrians who, at high density, choose as the drift direction one of the furthest unoccupied Von Neumann neighbors from the dangerous source. Compared with previous work, our comprehensive study deepens the understanding of nonlinear crowd dynamics under the effect of information on danger.
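A minimal sketch of the fuzzy-inference step described here: triangular memberships over local crowd density and weighted-average defuzzification to a walking speed. The membership shapes and output speeds are our assumptions, not the paper's calibrated rules:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def walking_speed(density):
    """Fuzzy rules: low density -> fast (1.4 m/s), medium -> 0.8 m/s,
    high -> 0.3 m/s; defuzzified as the weighted average of rule outputs."""
    mu = [tri(density, -1.0, 0.0, 2.5),   # 'low'    (persons / m^2)
          tri(density, 1.0, 2.5, 4.0),    # 'medium'
          tri(density, 2.5, 5.0, 7.0)]    # 'high'
    out = [1.4, 0.8, 0.3]
    return sum(m * o for m, o in zip(mu, out)) / sum(mu)

v_sparse = walking_speed(0.5)
v_mid = walking_speed(3.0)
v_dense = walking_speed(4.5)
```

In the paper's model the same machinery also drives direction choices in response to danger information; this fragment shows only the speed rule.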
Macromolecular crowding: Macromolecules friend or foe.
Mittal, Shruti; Chowhan, Rimpy Kaur; Singh, Laishram Rajendrakumar
2015-09-01
The cellular interior is known to be densely crowded due to the presence of soluble and insoluble macromolecules, which altogether occupy ~40% of the total cellular volume. This results in altered biological properties of macromolecules. Macromolecular crowding is observed to have both positive and negative effects on protein folding, structure, stability and function, and significant data have been accumulated on both aspects. However, most review articles so far have focused on the positive aspects of macromolecular crowding, and not much attention has been paid to its deleterious effects on macromolecules. For a complete picture of the effect of macromolecular crowding on proteins and enzymes, it is important to examine both aspects of crowding and determine its precise role under physiological conditions. To fill this gap, this review article focuses on the deleterious influence of crowding on macromolecules. Macromolecular crowding is not always beneficial; it also has several deleterious effects on various macromolecular properties. Taken together, the properties of biological macromolecules in vivo appear to be finely regulated by the nature and level of intracellular crowdedness so that they perform their biological functions appropriately. The information provided here gives an understanding of the role played by the nature and level of cellular crowdedness in intensifying and/or alleviating the burden of various proteopathies. Copyright © 2015 Elsevier B.V. All rights reserved.
On the generality of crowding: visual crowding in size, saturation, and hue compared to orientation.
van den Berg, Ronald; Roerdink, Jos B T M; Cornelissen, Frans W
2007-07-17
Perception of peripherally viewed shapes is impaired when surrounded by similar shapes. This phenomenon is commonly referred to as "crowding". Although studied extensively for perception of characters (mainly letters) and, to a lesser extent, for orientation, little is known about whether and how crowding affects perception of other features. Nevertheless, current crowding models suggest that the effect should be rather general and thus not restricted to letters and orientation. Here, we report on a series of experiments investigating crowding in the following elementary feature dimensions: size, hue, and saturation. Crowding effects in these dimensions were benchmarked against those in the orientation domain. Our primary finding is that all features studied show clear signs of crowding. First, identification thresholds increase with decreasing mask spacing. Second, for all tested features, critical spacing appears to be roughly half the viewing eccentricity and independent of stimulus size, a property previously proposed as the hallmark of crowding. Interestingly, although critical spacings are highly comparable, crowding magnitude differs across features: Size crowding is almost as strong as orientation crowding, whereas the effect is much weaker for saturation and hue. We suggest that future theories and models of crowding should be able to accommodate these differences in crowding effects.
Agent Based Modeling and Simulation of Pedestrian Crowds In Panic Situations
Alrashed, Mohammed
2016-11-01
The increasing occurrence of panic stampedes during mass events has motivated studying the impact of panic on crowd dynamics and the simulation of pedestrian flows in panic situations. The lack of understanding of panic stampedes still causes hundreds of fatalities each year, and methodical studies of panic behavior capable of predicting such crowd dynamics remain scarce. Despite the tremendous efforts of crowd control and massive numbers of safekeeping forces, thousands of fatalities and twice as many injuries are caused by crowd stampedes worldwide every year. Pedestrian crowd dynamics are generally predictable in high-density crowds, where pedestrians cannot move freely and self-propelling interactions between pedestrians arise. Although every pedestrian has personal preferences, the motion dynamics in such crowds can be modeled with social forces. These forces are representations of internal preferences and objectives to perform certain actions or movements. The corresponding forces can be controlled for each individual to represent a variety of behaviors associated with panic situations, such as escaping danger, clustering, and pushing. In this thesis, we use an agent-based model of pedestrian behavior in panic situations to predict the collective human behavior in such crowd dynamics. The proposed simulations suggest a practical way to alleviate fatalities and minimize the evacuation time in panic situations. Moreover, we introduce contagious panic and pushing behavior, resulting in a more realistic crowd dynamics model. The proposed methodology describes the intensity and spread of panic for each individual as a function of the distances between pedestrians.
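A hedged, minimal social-force-style update consistent with the description above: a driving term relaxing each pedestrian toward an exit, plus exponential pairwise repulsion. All constants are illustrative, not the thesis's calibrated model (which additionally includes panic contagion and pushing):

```python
import numpy as np

rng = np.random.default_rng(4)
n_ped = 30
pos = rng.uniform(0, 10, size=(n_ped, 2))      # pedestrians in a 10 m x 10 m room
vel = np.zeros((n_ped, 2))
exit_pt = np.array([10.0, 5.0])
v0, tau, A, B, dt = 1.5, 0.5, 2.0, 0.3, 0.05   # desired speed, relaxation, repulsion

initial_mean = np.linalg.norm(exit_pt - pos, axis=1).mean()
for _ in range(200):                            # 10 s of simulated time
    to_exit = exit_pt - pos
    desired = v0 * to_exit / np.linalg.norm(to_exit, axis=1, keepdims=True)
    drive = (desired - vel) / tau               # relax toward desired velocity
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=-1) + np.eye(n_ped)   # dummy self-distance
    repulse = ((A * np.exp(-dist / B) / dist)[:, :, None] * diff).sum(axis=1)
    vel += (drive + repulse) * dt
    pos += vel * dt

mean_dist = np.linalg.norm(exit_pt - pos, axis=1).mean()
```

A panic parameter could scale `v0` or the repulsion strength per individual, which is roughly how panic intensity enters models of this family.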
CrowdPhase: crowdsourcing the phase problem.
Jorda, Julien; Sawaya, Michael R; Yeates, Todd O
2014-06-01
The human mind innately excels at some complex tasks that are difficult to solve using computers alone. For complex problems amenable to parallelization, strategies can be developed to exploit human intelligence in a collective form: such approaches are sometimes referred to as `crowdsourcing'. Here, a first attempt at a crowdsourced approach for low-resolution ab initio phasing in macromolecular crystallography is proposed. A collaborative online game named CrowdPhase was designed, which relies on a human-powered genetic algorithm, where players control the selection mechanism during the evolutionary process. The algorithm starts from a population of `individuals', each with a random genetic makeup, in this case a map prepared from a random set of phases, and tries to cause the population to evolve towards individuals with better phases based on Darwinian survival of the fittest. Players apply their pattern-recognition capabilities to evaluate the electron-density maps generated from these sets of phases and to select the fittest individuals. A user-friendly interface, a training stage and a competitive scoring system foster a network of well trained players who can guide the genetic algorithm towards better solutions from generation to generation via gameplay. CrowdPhase was applied to two synthetic low-resolution phasing puzzles and it was shown that players could successfully obtain phase sets in the 30° phase error range and corresponding molecular envelopes showing agreement with the low-resolution models. The successful preliminary studies suggest that with further development the crowdsourcing approach could fill a gap in current crystallographic methods by making it possible to extract meaningful information in cases where limited resolution might otherwise prevent initial phasing.
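The human-powered genetic algorithm can be sketched with an ordinary fitness function standing in for player judgement. Everything here (population size, mutation rate, uniform crossover, the synthetic phase target) is an illustrative assumption, not CrowdPhase's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(5)
target = rng.uniform(0, 360, 64)       # "true" phases (degrees) of 64 reflections

def fitness(ind):
    """Negative mean circular phase error vs the target; higher is better.
    In CrowdPhase this judgement is made by players comparing density maps."""
    err = np.abs((ind - target + 180.0) % 360.0 - 180.0)
    return -err.mean()

pop = rng.uniform(0, 360, size=(40, 64))        # random initial phase sets
init_best = max(fitness(p) for p in pop)
for _ in range(60):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]     # keep the fittest half
    kids = []
    for _ in range(20):
        a, b = parents[rng.integers(20)], parents[rng.integers(20)]
        child = np.where(rng.random(64) < 0.5, a, b)     # uniform crossover
        mut = rng.random(64) < 0.05                      # 5% mutation
        child = np.where(mut, rng.uniform(0, 360, 64), child)
        kids.append(child)
    pop = np.vstack([parents, kids])

best = max(fitness(p) for p in pop)
```

Because the fittest half is carried over each generation (elitism), the best score never decreases; CrowdPhase replaces the `fitness` call with human map selections.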
Pascual-Marqui, R D; Gonzalez-Andino, S L; Valdes-Sosa, P A; Biscay-Lirio, R
1988-12-01
A method for the spatial analysis of EEG and EP data, based on the spherical harmonic Fourier expansion (SHE) of scalp potential measurements, is described. This model provides efficient and accurate formulas for: (1) the computation of the surface Laplacian and (2) the interpolation of electrical potentials, current source densities, test statistics and other derived variables. Physiologically based simulation experiments show that the SHE method gives better estimates of the surface Laplacian than the commonly used finite difference method. Cross-validation studies for the objective comparison of different interpolation methods demonstrate the superiority of the SHE over the commonly used methods based on the weighted (inverse distance) average of the nearest three and four neighbor values.
Current-source density analysis of slow brain potentials during time estimation.
Gibbons, Henning; Rammsayer, Thomas H
2004-11-01
Two event-related potential studies were conducted to investigate differential brain correlates of temporal processing of intervals below and above 3-4 s. In the first experiment, 24 participants were presented with auditorily marked target durations of 2, 4, and 6 s that had to be reproduced. Timing accuracy was similar for all three target durations. As revealed by current-source density analysis, slow-wave components during both presentation and reproduction were independent of target duration. Experiment 2 examined potential modulating effects of type of interval (filled and empty) and presentation mode (randomized and blocked presentation of target durations). Behavioral and slow-wave findings were consistent with those of Experiment 1. Thus, the present findings support the notion of a general timing mechanism irrespective of interval duration as proposed by scalar timing theory and pacemaker-counter models of time estimation.
Optimal estimation of free energies and stationary densities from multiple biased simulations
Wu, Hao
2013-01-01
When studying high-dimensional dynamical systems such as macromolecules, quantum systems and polymers, a prime concern is the identification of the most probable states and their stationary probabilities or free energies. Often, these systems have metastable regions or phases, which makes it impossible to estimate the stationary probabilities by direct simulation. Efficient sampling methods such as umbrella sampling, metadynamics and conformational flooding have been developed that perform a number of simulations in which the system's potential is biased so as to accelerate the rare barrier-crossing events. A joint free energy profile or stationary density can then be obtained from these biased simulations with the weighted histogram analysis method (WHAM). This approach (a) requires a few essential order parameters to be defined in which the histogram is set up, and (b) assumes that each simulation is in global equilibrium. Both assumptions make the investigation of high-dimensional systems with previously unknown energy landscape ...
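The WHAM estimator mentioned above can be sketched end-to-end on a 1-D double well sampled through harmonic umbrella windows. This is a hedged toy (our potential, spring constant, and bin grid), illustrating the self-consistent WHAM iteration rather than the paper's generalised estimator:

```python
import numpy as np

rng = np.random.default_rng(6)
kT = 1.0
centers = np.linspace(-2, 2, 9)            # umbrella window centres
kspring = 10.0
nsamp = 2000

# Metropolis sampling of each biased window; unbiased potential U(x) = x^4 - 2x^2
def sample_window(c):
    x, out = c, []
    for _ in range(nsamp):
        prop = x + rng.normal(0, 0.3)
        dU = (prop**4 - 2*prop**2 + 0.5*kspring*(prop - c)**2) \
           - (x**4 - 2*x**2 + 0.5*kspring*(x - c)**2)
        if rng.random() < np.exp(-dU / kT):
            x = prop
        out.append(x)
    return np.array(out)

data = [sample_window(c) for c in centers]

edges = np.linspace(-2.5, 2.5, 51)
mids = 0.5 * (edges[:-1] + edges[1:])
hist = np.array([np.histogram(d, bins=edges)[0] for d in data])   # (9, 50)
bias = np.exp(-0.5 * kspring * (mids[None, :] - centers[:, None])**2 / kT)

# WHAM self-consistency: p_i = sum_k n_ki / sum_k N_k f_k c_ki,
#                        f_k = 1 / sum_i c_ki p_i
f = np.ones(len(centers))
for _ in range(500):
    p = hist.sum(axis=0) / (nsamp * (f[:, None] * bias).sum(axis=0))
    p /= p.sum()
    f = 1.0 / (bias * p[None, :]).sum(axis=1)

free_energy = -kT * np.log(p + 1e-12)
free_energy -= free_energy.min()
```

The recovered profile should show minima near x = ±1 and a barrier of roughly 1 kT at x = 0, matching the analytic double well.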
Daniell method for power spectral density estimation in atomic force microscopy
Energy Technology Data Exchange (ETDEWEB)
Labuda, Aleksander [Asylum Research an Oxford Instruments Company, Santa Barbara, California 93117 (United States)
2016-03-15
An alternative method for power spectral density (PSD) estimation—the Daniell method—is revisited and compared to the most prevalent method used in the field of atomic force microscopy for quantifying cantilever thermal motion—the Bartlett method. Both methods are shown to underestimate the Q factor of a simple harmonic oscillator (SHO) by a predictable, and therefore correctable, amount in the absence of spurious deterministic noise sources. However, the Bartlett method is much more prone to spectral leakage, which can obscure the thermal spectrum in the presence of deterministic noise. By significantly reducing spectral leakage, the Daniell method leads to a more accurate representation of the true PSD and enables clear identification and rejection of deterministic noise peaks. This benefit is especially valuable for the development of automated PSD fitting algorithms for robust and accurate estimation of SHO parameters from a thermal spectrum.
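The two estimators can be contrasted directly in NumPy: Daniell smooths a single full-length periodogram across neighbouring frequency bins, while Bartlett averages periodograms of short non-overlapping segments. A hedged sketch on a synthetic tone-plus-noise signal (no SHO fitting, and the parameters are ours):

```python
import numpy as np

rng = np.random.default_rng(7)
fs, n = 1000.0, 8192
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 50 * t) + rng.normal(0, 1.0, n)   # 50 Hz tone + white noise

def periodogram(sig, fs):
    X = np.fft.rfft(sig)
    psd = (np.abs(X) ** 2) / (fs * len(sig))
    psd[1:-1] *= 2                          # one-sided: fold negative frequencies
    return np.fft.rfftfreq(len(sig), 1 / fs), psd

def daniell(sig, fs, m=8):
    """Daniell estimate: one full-length periodogram smoothed with a
    (2m+1)-point moving average across neighbouring frequency bins."""
    f, p = periodogram(sig, fs)
    kernel = np.ones(2 * m + 1) / (2 * m + 1)
    return f, np.convolve(p, kernel, mode="same")

def bartlett(sig, fs, nseg=16):
    """Bartlett estimate: average periodograms of non-overlapping segments."""
    segs = sig[: len(sig) // nseg * nseg].reshape(nseg, -1)
    psds = [periodogram(s, fs)[1] for s in segs]
    return np.fft.rfftfreq(segs.shape[1], 1 / fs), np.mean(psds, axis=0)

f_d, p_d = daniell(x, fs)
f_b, p_b = bartlett(x, fs)
peak_d = f_d[np.argmax(p_d)]
peak_b = f_b[np.argmax(p_b)]
```

Both variance-reduction strategies trade frequency resolution for a smoother estimate; Daniell keeps the full-length FFT, which is what limits its leakage relative to Bartlett's short segments.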
Proactive Uniform Data Replication by Density Estimation in Apollonian P2P Networks
Bonnel, Nicolas; Ménier, Gildas; Marteau, Pierre-François
We propose a data replication scheme on a random apollonian P2P overlay that benefits from its small-world and scale-free properties. The proposed algorithm features a replica density estimation and a space-filling mechanism designed to avoid redundant messages. Not only does it provide uniform replication of the data stored in the network, it also improves on classical flooding approaches by removing any redundancy. This last property is obtained at the cost of maintaining a random apollonian overlay. Thanks to the small-world and scale-free properties of the random apollonian P2P overlay, the search efficiency of the space-filling tree algorithm we propose is comparable to that of the classical flooding algorithm on a random network.
Dos Santos, Alessio Moreira; Mitja, Danielle; Delaître, Eric; Demagistri, Laurent; de Souza Miranda, Izildinha; Libourel, Thérèse; Petit, Michel
2017-05-15
High spatial resolution images as well as image processing and object detection algorithms are recent technologies that aid the study of biodiversity and of commercial plantations of forest species. This paper seeks to contribute knowledge regarding the use of these technologies by studying randomly dispersed native palm trees. Here, we analyze the automatic detection of large circular crown (LCC) palm trees using a high spatial resolution panchromatic GeoEye image (0.50 m) taken over the area of a community of small agricultural farms in the Brazilian Amazon. We also propose auxiliary methods to estimate the density of the LCC palm Attalea speciosa (babassu) based on the detection results. We used the "Compt-palm" algorithm, based on the detection of palm tree shadows in open areas via mathematical morphology techniques, and the spatial information was validated using field methods (i.e. structural census and georeferencing). The algorithm recognized individuals in life stages 5 and 6, and the extraction percentage, branching factor and quality percentage factors were used to evaluate its performance. A principal components analysis showed that the structure of the studied species differs from other species. Approximately 96% of the babassu individuals in stage 6 were detected. These individuals had significantly smaller stipes than the undetected ones. In turn, 60% of the stage 5 babassu individuals were detected, showing a significantly different total height and number of leaves from the undetected ones. Our calculations regarding resource availability indicate that 6870 ha contained 25,015 adult babassu palm trees, with an annual potential productivity of 27.4 t of almond oil. The detection of LCC palm trees and the implementation of auxiliary field methods to estimate babassu density is an important first step in monitoring, over a large scale, a resource that is extremely important to the Brazilian economy and to thousands of families.
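A NumPy-only sketch of the shadow-detection idea: threshold dark pixels, clean the mask with a morphological opening, and count connected blobs. The image, threshold, and 3x3 structuring element are invented for illustration and are far simpler than the Compt-palm algorithm:

```python
import numpy as np

rng = np.random.default_rng(8)
img = rng.uniform(0.6, 1.0, size=(60, 60))     # bright open pasture
for (r, c) in [(10, 12), (30, 40), (48, 20)]:  # three dark palm-shadow blobs
    img[r:r+4, c:c+4] = 0.1

def erode(mask):
    """3x3 binary erosion (zero border), hardcoded for 60x60 masks."""
    p = np.pad(mask, 1)
    return np.min([p[i:i+60, j:j+60] for i in range(3) for j in range(3)], axis=0)

def dilate(mask):
    p = np.pad(mask, 1)
    return np.max([p[i:i+60, j:j+60] for i in range(3) for j in range(3)], axis=0)

shadows = img < 0.3                  # dark-pixel threshold
opened = dilate(erode(shadows))      # opening removes speckle smaller than 3x3

def count_blobs(mask):
    """Count connected components by flood-filling with dilations."""
    mask = mask.copy()
    count = 0
    while mask.any():
        seed = np.zeros_like(mask)
        r, c = np.argwhere(mask)[0]
        seed[r, c] = True
        while True:
            grown = dilate(seed) & mask
            if (grown == seed).all():
                break
            seed = grown
        mask &= ~seed
        count += 1
    return count

n_palms = count_blobs(opened)
```

Each retained blob stands for one detected LCC palm; multiplying such counts per hectare is the kind of density estimate the auxiliary field methods are meant to validate.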
Directory of Open Access Journals (Sweden)
David Novelli
Exposure to crowding is said to be aversive, yet people also seek out and enjoy crowded situations. We surveyed participants at two crowd events to test the prediction of self-categorization theory that variable emotional responses to crowding are a function of social identification with the crowd. In data collected from participants who attended a crowded outdoor music event (n = 48), identification with the crowd predicted feeling less crowded; and there was an indirect effect of identification with the crowd on positive emotion through feeling less crowded. Identification with the crowd also moderated the relation between feeling less crowded and positive emotion. In data collected at a demonstration march (n = 112), identification with the crowd predicted central (most dense) location in the crowd; and there was an indirect effect of identification with the crowd on positive emotion through central location in the crowd. Positive emotion in the crowd also increased over the duration of the crowd event. These findings are in line with the predictions of self-categorization theory. They are inconsistent with approaches that suggest that crowding is inherently aversive; and they cannot easily be explained through the concept of 'personal space'.
Crowding, reading, and developmental dyslexia.
Martelli, Marialuisa; Di Filippo, Gloria; Spinelli, Donatella; Zoccolotti, Pierluigi
2009-04-17
We tested the hypothesis that crowding effects are responsible for the reading slowness characteristic of developmental dyslexia. A total of twenty-nine Italian dyslexics and thirty-three age-matched controls participated in various parts of the study. In Experiment 1, we measured contrast thresholds for identifying letters and words as a function of stimulus duration. Thresholds were higher in dyslexics than controls for words (at a limited time exposure) but not for single letters. Adding noise to the stimuli produced comparable effects in dyslexics and controls. At the long time exposure thresholds were comparable in the two groups. In Experiment 2, we measured the spacing between a target letter and two flankers at a fixed level of performance as a function of eccentricity and size. With eccentricity, the critical spacing (CS) scaled in the control group with 0.62 proportionality (a value of b close to Bouma's law, 0.50) and with a greater proportionality (0.95) in the dyslexic group. CS was independent of size in both groups. In Experiment 3, we examined the critical print size (CPS), that is, the increase in reading rate up to a critical character size (S. T. Chung, J. S. Mansfield, & G. E. Legge, 1998). CPS of dyslexic children was greater than that of controls. Individual maximal reading speed was predicted by individual bs (from Experiment 2). The maximal reading rate achieved by dyslexics at CPS (and also for larger print sizes) was below the values observed in controls. We conclude that word analysis in dyslexics is slowed because of greater crowding effects, which limit letter identification in multi-letter arrays across the visual field. We propose that the peripheral reading of normal readers might constitute a model for dyslexic reading. The periphery model accounts for 60% of dyslexics' slowness. After compensating for crowding, the dyslexics' reading rate remains slower than that of proficient readers. This failure is discussed in terms of a
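The reported proportionality constants give a concrete sense of the scaling CS = b × eccentricity (b values from Experiment 2; the example eccentricity is our choice):

```python
# Critical spacing scales linearly with eccentricity: CS = b * ecc (Bouma scaling)
b_controls, b_dyslexics = 0.62, 0.95   # proportionality constants from Exp. 2
ecc = 10.0                             # eccentricity in degrees of visual angle
cs_controls = b_controls * ecc         # spacing below which letters crowd (controls)
cs_dyslexics = b_dyslexics * ecc       # crowding extends much further in dyslexics
```

At 10 deg eccentricity, flankers must be over 3 deg farther away for dyslexic readers than for controls before letters escape crowding, which is the mechanism the authors link to slow reading.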
Crowd Simulation and Its Applications: Recent Advances
Institute of Scientific and Technical Information of China (English)
徐明亮; 蒋浩; 金小刚; 邓志刚
2014-01-01
This article surveys the state-of-the-art crowd simulation techniques and their selected applications, with its focus on our recent research advances in this rapidly growing research field. We first give a categorized overview on the mainstream methodologies of crowd simulation. Then, we describe our recent research advances on crowd evacuation, pedestrian crowds, crowd formation, traffic simulation, and swarm simulation. Finally, we offer our viewpoints on open crowd simulation research challenges and point out potential future directions in this field.
Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina
2016-09-01
Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with the Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, small differences in breast percent density (1.04% and 3.84%) were observed between dose protocols. Conclusion Density estimates from digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation may be feasible. (©) RSNA, 2016 Online supplemental material is available for this article.
Kamousi, Baharan; Amini, Ali Nasiri; He, Bin
2007-06-01
The goal of the present study is to employ source imaging methods such as cortical current density estimation for the classification of left- and right-hand motor imagery tasks, which may be used for brain-computer interface (BCI) applications. The scalp-recorded EEG was first preprocessed by surface Laplacian filtering, time-frequency filtering, noise normalization and independent component analysis. Then the cortical imaging technique was used to solve the EEG inverse problem. Cortical current density distributions of left and right trials were classified from each other by exploiting the concept of Von Neumann entropy. The proposed method was tested on three human subjects (180 trials each), and a maximum accuracy of 91.5% and an average accuracy of 88% were obtained. The present results confirm the hypothesis that source analysis methods may improve accuracy for classification of motor imagery tasks. These promising results enhance our ability to perform source analysis from single-trial EEG data recorded on the scalp, and may have applications to improved BCI systems.
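The Von Neumann entropy used as a classification feature above is, for a density matrix with eigenvalues λᵢ, S = −Σ λᵢ ln λᵢ. A minimal sketch (the helper name and the example eigenvalues are illustrative assumptions, not from the paper):

```python
import math

def von_neumann_entropy(eigenvalues):
    """S = -sum(lam * ln(lam)) over the nonzero eigenvalues of a density matrix."""
    return -sum(lam * math.log(lam) for lam in eigenvalues if lam > 0)

# Maximally mixed 2x2 density matrix: eigenvalues 0.5, 0.5 -> S = ln 2
print(von_neumann_entropy([0.5, 0.5]))  # ~0.693
# Pure state: eigenvalues 1, 0 -> S = 0
print(von_neumann_entropy([1.0, 0.0]))  # 0.0
```

Lower entropy indicates a more concentrated current density distribution, which is the kind of statistic a classifier can separate.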
Comparison of volatility function technique for risk-neutral densities estimation
Bahaludin, Hafizah; Abdullah, Mimi Hafizah
2017-08-01
Volatility function techniques using interpolation play an important role in extracting the risk-neutral density (RND) of options. The aim of this study is to compare the performance of two interpolation approaches, namely smoothing splines and fourth-order polynomials, in extracting the RND. The implied volatility of options with respect to strike prices/delta is interpolated to obtain a well-behaved density. The statistical analysis and forecast accuracy are tested using moments of the distribution. The difference between the first moment of the distribution and the price of the underlying asset at maturity is used as an input to analyze forecast accuracy. RNDs are extracted from Dow Jones Industrial Average (DJIA) index options with a one-month constant maturity for the period from January 2011 until December 2015. The empirical results suggest that estimating the RND with a fourth-order polynomial is more appropriate than with a smoothing spline, as the fourth-order polynomial gives the lowest mean square error (MSE). The results can help market participants capture market expectations of future developments of the underlying asset.
Energy Technology Data Exchange (ETDEWEB)
Lal, Ratan, E-mail: rlal_npl_3543@yahoo.i [Superconductivity Division, National Physical Laboratory, Council of Scientific and Industrial Research, Dr. K.S. Krishnan Road, New Delhi 110012 (India)
2010-02-15
The critical current density J_c of some superconducting samples, calculated on the basis of Bean's model, shows negative curvature at low magnetic field with a downward bending near H = 0. To avoid this problem, Kim's expression for the critical current density, J_c = k/(H_0 + H), in which J_c has positive curvature for all H, has been employed by connecting the positive constants k and H_0 with the features of the hysteresis loop of a superconductor. A relation between the full penetration field H_p and the magnetic field H_min, at which the magnetization is minimum, is obtained from Kim's theory. Taking the value of J_c at H = H_p according to the actual loop width, as in Bean's theory, and at H = 0 according to an enhanced loop width due to the local internal field, values of k and H_0 are obtained in terms of the magnetization values M^+(−H_min), M^−(H_min), M^+(H_p) and M^−(H_p). The resulting method of estimating J_c from the hysteresis loop turns out to be as simple as Bean's method.
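Given Kim's form J_c(H) = k/(H_0 + H), the two constants follow from any two known values of J_c, e.g. at H = 0 and at H = H_p. A minimal numerical sketch (function names and sample values are illustrative assumptions; the paper obtains J_c(0) and J_c(H_p) from the magnetization loop widths, which is not reproduced here):

```python
def kim_parameters(jc_zero, jc_hp, h_p):
    """Solve jc_zero = k / h0 and jc_hp = k / (h0 + h_p) for (k, h0)."""
    h0 = h_p * jc_hp / (jc_zero - jc_hp)
    k = jc_zero * h0
    return k, h0

def jc_kim(h, k, h0):
    """Kim's critical current density; positive curvature for all H >= 0."""
    return k / (h0 + h)

# Illustrative values only (arbitrary units)
k, h0 = kim_parameters(jc_zero=2.0e5, jc_hp=1.0e5, h_p=0.5)
print(jc_kim(0.0, k, h0))  # recovers jc_zero
print(jc_kim(0.5, k, h0))  # recovers jc_hp
```

Because J_c(H) is a ratio of positive constants, its second derivative 2k/(H_0 + H)^3 is positive everywhere, which is exactly the curvature property the abstract contrasts with Bean's model.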
Lussana, C.
2013-04-01
The presented work focuses on the investigation of gridded daily minimum (TN) and maximum (TX) temperature probability density functions (PDFs), with the intent of both characterising a region and detecting extreme values. The empirical PDF estimation procedure uses the most recent years of gridded temperature analysis fields available at ARPA Lombardia, in Northern Italy. The spatial interpolation is based on an implementation of Optimal Interpolation using observations from a dense surface network of automated weather stations. An effort has been made to identify both the time period and the spatial areas with a stable data density, since otherwise the elaboration could be influenced by the unsettled station distribution. The PDF used in this study is based on the Gaussian distribution; nevertheless, it is designed to have an asymmetrical (skewed) shape in order to enable distinction between warming and cooling events. Once the occurrence of extreme events is properly defined, it is possible to deliver local-scale information to users in a concise way, such as: TX extremely cold/hot or TN extremely cold/hot.
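An asymmetric Gaussian of the kind described can be sketched as a split normal, with different widths on the two sides of the mode. This is a generic illustration under assumed parameterisation, not the paper's exact PDF:

```python
import math

def split_normal_pdf(x, mode, sigma_left, sigma_right):
    """Two Gaussian halves with different widths, normalised to integrate to 1."""
    norm = math.sqrt(2.0 / math.pi) / (sigma_left + sigma_right)
    sigma = sigma_left if x <= mode else sigma_right
    return norm * math.exp(-((x - mode) ** 2) / (2.0 * sigma ** 2))

# With equal widths this reduces to the ordinary normal density
print(split_normal_pdf(0.0, 0.0, 1.0, 1.0))  # ~0.3989 = 1/sqrt(2*pi)
```

A wider right tail (sigma_right > sigma_left) assigns more probability to anomalously warm values, which is what allows warming and cooling extremes to be distinguished.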
Jacob Strunk; Hailemariam Temesgen; Hans-Erik Andersen; James P. Flewelling; Lisa Madsen
2012-01-01
Using lidar in an area-based model-assisted approach to forest inventory has the potential to increase estimation precision for some forest inventory variables. This study documents the bias and precision of a model-assisted (regression estimation) approach to forest inventory with lidar-derived auxiliary variables relative to lidar pulse density and the number of...
Directory of Open Access Journals (Sweden)
Yang Zu
2015-07-01
Full Text Available This paper studies the asymptotic normality of the kernel deconvolution estimator when the noise distribution is logarithmic chi-square; both independent and identically distributed observations and strong mixing observations are considered. The dependent case of the result is applied to obtain the pointwise asymptotic distribution of the deconvolution volatility density estimator in discrete-time stochastic volatility models.
Probability density function and estimation for error of digitized map coordinates in GIS
Institute of Scientific and Technical Information of China (English)
童小华; 刘大杰
2004-01-01
Traditionally, it is widely accepted that measurement error obeys the normal distribution. However, in this paper a new idea is proposed: the error in digitized data, a major derived data source in GIS, does not obey the normal distribution but rather the p-norm distribution with a determinate parameter. Assuming that the error is random and has the same statistical properties, the probability density functions of the normal distribution, Laplace distribution and p-norm distribution are derived based on the arithmetic-mean axiom, median axiom and p-median axiom, which shows that the normal distribution is only one of these distributions and not the only possibility. Based on this idea, distribution fitness tests such as the skewness and kurtosis coefficient test, the Pearson chi-square (χ²) test and the Kolmogorov test are conducted for digitized data. The results show that the error in map digitization obeys the p-norm distribution with a parameter close to 1.60. A least p-norm estimation and the least squares estimation of digitized data are further analyzed, showing that the least p-norm adjustment is better than the least squares adjustment for digitized data processing in GIS.
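The p-norm (generalized normal) density with shape parameter p can be written f(x) = p / (2σΓ(1/p)) · exp(−(|x−μ|/σ)^p); p = 2 recovers the Gaussian and p = 1 the Laplace case. A minimal sketch using the fitted shape p ≈ 1.60 (the scale value below is an illustrative assumption):

```python
import math

def pnorm_pdf(x, mu, sigma, p):
    """Generalized normal (p-norm) density with shape p and scale sigma."""
    coeff = p / (2.0 * sigma * math.gamma(1.0 / p))
    return coeff * math.exp(-(abs(x - mu) / sigma) ** p)

# Sanity check: p = 2 with sigma = sqrt(2) is the standard normal density
print(pnorm_pdf(0.0, 0.0, math.sqrt(2.0), 2.0))  # ~0.3989
# The paper's fitted shape for digitized-map error, illustrative scale
print(pnorm_pdf(0.0, 0.0, 1.0, 1.60))
```

Shapes between 1 and 2, such as the fitted 1.60, give a density with heavier tails than the Gaussian but lighter than the Laplace, consistent with the fitness tests described.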
Estimating basic wood density and its uncertainty for Pinus densiflora in the Republic of Korea
Directory of Open Access Journals (Sweden)
Jung Kee Pyo
2012-05-01
Full Text Available According to the Intergovernmental Panel on Climate Change (IPCC) guidelines, uncertainty assessment is an important aspect of a greenhouse gas inventory, and effort should be made to incorporate it into the reporting. The goal of this study was to estimate basic wood density (BWD) and its uncertainty for Pinus densiflora (Siebold & Zucc.) in Korea. In this study, P. densiflora forests throughout the country were divided into two regional variants: the Gangwon region variant, distributed in the northeastern part of the country, and the central region variant. A total of 36 representative sampling plots were selected in both regions to collect sample trees for destructive sampling. The trees were selected considering the distributions of tree age and diameter at breast height. Hypothesis testing was carried out to test BWD differences between two age groups, i.e., age ≥ 20 and < 20, and differences between the two regions. The test suggested that there was no statistically significant difference between the two age classes. On the other hand, it suggested strong evidence of a statistically significant difference between regions. The BWD and its uncertainty were 0.418 g/cm3 and 11.9% for the Gangwon region, whereas they were 0.471 g/cm3 and 3.8% for the central region. As a result, the estimated BWD for P. densiflora was more precise than the value provided by the IPCC guidelines.
Spatial Variation in Tree Density and Estimated Aboveground Carbon Stocks in Southern Africa
Directory of Open Access Journals (Sweden)
Lulseged Tamene
2016-03-01
Full Text Available Variability in woody plant species, vegetation assemblages and anthropogenic activities derails efforts to establish common approaches for estimating biomass and carbon stocks in Africa. In order to suggest management options, it is important to understand the vegetation dynamics and the major drivers governing the observed conditions. This study uses data from 29 sentinel landscapes (4640 plots) across southern Africa. We used the T-square distance method to sample trees. Allometric models were used to estimate aboveground tree biomass, from which the aboveground biomass carbon stock (AGBCS) was derived for each site. Results show an average tree density of 502 trees·ha−1, with semi-arid areas having the highest (682 trees·ha−1) and arid regions the lowest (393 trees·ha−1). The overall AGBCS was 56.4 Mg·ha−1. However, significant site-to-site variability existed across the region. Over 60-fold differences were noted between the lowest AGBCS (2.2 Mg·ha−1, in the Musungwa plains of Zambia) and the highest (138.1 Mg·ha−1, in the scrublands of Kenilworth in Zimbabwe). Semi-arid and humid sites had higher carbon stocks than sites in sub-humid and arid regions. Anthropogenic activities also influenced the observed carbon stocks. Repeated measurements would reveal future trends in tree cover and carbon stocks across different systems.
Institute of Scientific and Technical Information of China (English)
YAN Hao; WANG Hu; WANG Yong-hui; ZHANG Yu-mei
2013-01-01
Background The classification of Alzheimer's disease (AD) from magnetic resonance imaging (MRI) has been challenged by a lack of effective and reliable biomarkers due to inter-subject variability. This article presents a classification method for AD based on kernel density estimation (KDE) of local features. Methods First, a large number of local features were extracted from stable image blobs to represent various anatomical patterns as potential biomarkers. Based on distinctive descriptors and locations, the local features were robustly clustered to identify correspondences of the same underlying patterns. Then, KDE was used to estimate distribution parameters of the correspondences by weighting contributions according to their distances. Thus, biomarkers could be reliably quantified by reducing the effects of farther-away correspondences, which were more likely noise from inter-subject variability. Finally, a Bayes classifier was applied to the distribution parameters for the classification of AD. Results Experiments were performed on different divisions of a publicly available database to investigate the accuracy and the effects of age and AD severity. Our method achieved an equal error classification rate of 0.85 for subjects aged 60-80 years exhibiting mild AD and outperformed a recent local feature-based work regardless of both effects. Conclusions We proposed a volumetric brain MRI classification method for neurodegenerative disease based on statistics of local features using KDE. The method may be potentially useful for computer-aided diagnosis in clinical settings.
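Distance-weighted kernel density estimation of the kind described can be sketched as follows. This is a generic Gaussian-kernel sketch under an assumed per-sample weighting scheme, not the paper's exact estimator:

```python
import math

def weighted_kde(x, samples, weights, bandwidth):
    """Gaussian KDE where each sample contributes in proportion to its weight."""
    total_w = sum(weights)
    norm = 1.0 / (total_w * bandwidth * math.sqrt(2.0 * math.pi))
    return norm * sum(
        w * math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
        for s, w in zip(samples, weights)
    )

samples = [-1.0, 0.0, 1.0]
weights = [0.2, 1.0, 0.2]  # down-weight farther-away correspondences
print(weighted_kde(0.0, samples, weights, bandwidth=0.5))
```

Down-weighting distant correspondences suppresses the contribution of likely mismatches, which is the noise-reduction idea the Methods section describes.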
mBEEF: An accurate semi-local Bayesian error estimation density functional
Wellendorff, Jess; Lundgaard, Keld T.; Jacobsen, Karsten W.; Bligaard, Thomas
2014-04-01
We present a general-purpose meta-generalized gradient approximation (MGGA) exchange-correlation functional generated within the Bayesian error estimation functional framework [J. Wellendorff, K. T. Lundgaard, A. Møgelhøj, V. Petzold, D. D. Landis, J. K. Nørskov, T. Bligaard, and K. W. Jacobsen, Phys. Rev. B 85, 235149 (2012)]. The functional is designed to give reasonably accurate density functional theory (DFT) predictions of a broad range of properties in materials physics and chemistry, while exhibiting a high degree of transferability. Particularly, it improves upon solid cohesive energies and lattice constants over the BEEF-vdW functional without compromising high performance on adsorption and reaction energies. We thus expect it to be particularly well-suited for studies in surface science and catalysis. An ensemble of functionals for error estimation in DFT is an intrinsic feature of exchange-correlation models designed this way, and we show how the Bayesian ensemble may provide a systematic analysis of the reliability of DFT based simulations.
Methods for Estimating Environmental Effects and Constraints on NexGen: High Density Case Study
Augustine, S.; Ermatinger, C.; Graham, M.; Thompson, T.
2010-01-01
This document provides a summary of the current methods developed by Metron Aviation for the estimation of environmental effects and constraints on the Next Generation Air Transportation System (NextGen). This body of work incorporates many of the key elements necessary to achieve such an estimate. Each section contains the background and motivation for the technical elements of the work, a description of the methods used, and possible next steps. The current methods described in this document were selected in an attempt to provide a good balance between accuracy and fairly rapid turnaround times, to best advance Joint Planning and Development Office (JPDO) System Modeling and Analysis Division (SMAD) objectives while also supporting the needs of the JPDO Environmental Working Group (EWG). In particular, this document describes methods applied to support the High Density (HD) Case Study performed during the spring of 2008. A reference day (in 2006) is modeled to describe current system capabilities, while the future demand is applied to multiple alternatives to analyze system performance. The major variables in the alternatives are operational/procedural capabilities for airport, terminal, and en route airspace, along with projected improvements to airframe, engine and navigational equipment.
NEAR INFRARED SPECTROSCOPY FOR ESTIMATING SUGARCANE BAGASSE CONTENT IN MEDIUM DENSITY FIBERBOARD
Directory of Open Access Journals (Sweden)
Ugo Leandro Belini
2011-04-01
Full Text Available Medium density fiberboard (MDF) is an engineered wood product formed by breaking down selected lignin-cellulosic residual material into fibers, combining it with wax and a resin binder, and then forming panels by applying high temperature and pressure. Because the raw material in the industrial process is ever-changing, the panel industry requires methods for monitoring the composition of its products. The aim of this study was to estimate the ratio of sugarcane (SC) bagasse to Eucalyptus wood in MDF panels using near infrared (NIR) spectroscopy. Principal component analysis (PCA) and partial least squares (PLS) regressions were performed. MDF panels having different bagasse contents were easily distinguished from each other by the PCA of their NIR spectra, with clearly different patterns of response. The PLS-R models for SC content of these MDF samples presented a strong coefficient of determination (0.96) between the NIR-predicted and lab-determined values and a low standard error of prediction (~1.5%) in the cross-validations. A key role of resins (adhesives), cellulose, and lignin for such PLS-R calibrations was shown. The PLS-DA model correctly classified ninety-four percent of MDF samples by cross-validation and ninety-eight percent of the panels by an independent test set. These NIR-based models can be useful to quickly estimate the sugarcane bagasse vs. Eucalyptus wood content ratio in unknown MDF samples and to verify the quality of these engineered wood products in an online process.
Lee, Sooyeul; Jeong, Ji-Wook; Lee, Jeong Won; Yoo, Done-Sik; Kim, Seunghwan
2006-01-01
Osteoporosis is characterized by an abnormal loss of bone mineral content, which leads to a tendency toward non-traumatic bone fractures or structural deformations of bone. Thus, bone density measurement has been considered a most reliable method to assess bone fracture risk due to osteoporosis. In past decades, X-ray images have been studied in connection with bone mineral density estimation. However, bone mineral density estimated from an X-ray image can undergo a relatively large accuracy or precision error, the most relevant origin of which is an unstable X-ray image acquisition condition. Thus, we focus our attention on finding a bone mineral density estimation method that is relatively insensitive to the X-ray image acquisition condition. In this paper, we develop a simple technique for distal radius bone mineral density estimation using the trabecular bone filling factor in the X-ray image and apply the technique to the wrist X-ray images of 20 women. The estimated bone mineral density shows a high linear correlation with dual-energy X-ray absorptiometry (r=0.87).
Zeng, L.; Doyle, E. J.; Rhodes, T. L.; Wang, G.; Sung, C.; Peebles, W. A.; Bobrek, M.
2016-11-01
A new model-based technique for fast estimation of the pedestal electron density gradient has been developed. The technique uses ordinary mode polarization profile reflectometer time delay data and does not require direct profile inversion. Because of its simple data processing, the technique can be readily implemented via a Field-Programmable Gate Array, so as to provide a real-time density gradient estimate, suitable for use in plasma control systems such as envisioned for ITER, and possibly for DIII-D and Experimental Advanced Superconducting Tokamak. The method is based on a simple edge plasma model with a linear pedestal density gradient and low scrape-off-layer density. By measuring reflectometer time delays for three adjacent frequencies, the pedestal density gradient can be estimated analytically via the new approach. Using existing DIII-D profile reflectometer data, the estimated density gradients obtained from the new technique are found to be in good agreement with the actual density gradients for a number of dynamic DIII-D plasma conditions.
Social influence makes self-interested crowds smarter: an optimal control perspective
Luo, Yu; Venkatasubramanian, Venkat
2016-01-01
It is very common to observe crowds of individuals solving similar problems with similar information in a largely independent manner. We argue here that crowds can become "smarter," i.e., more efficient and robust, by partially following the average opinion. This observation runs counter to the widely accepted claim that the wisdom of crowds deteriorates with social influence. The key difference is that individuals are self-interested and hence will reject feedbacks that do not improve their performance. We propose a control-theoretic methodology to compute the degree of social influence, i.e., the level to which one accepts the population feedback, that optimizes performance. We conducted an experiment with human subjects ($N = 194$), where the participants were first asked to solve an optimization problem independently, i.e., under no social influence. Our theoretical methodology estimates a $30\\%$ degree of social influence to be optimal, resulting in a $29\\%$ improvement in the crowd's performance. We the...
Comparing crowding in human and ideal observers.
van den Berg, Ronald; Johnson, Addie; Martinez Anton, Angela; Schepers, Anne L; Cornelissen, Frans W
2012-06-12
A visual target is more difficult to recognize when it is surrounded by other, similar objects. This breakdown in object recognition is known as crowding. Despite a long history of experimental work, computational models of crowding are still sparse. Specifically, few studies have examined crowding using an ideal-observer approach. Here, we compare crowding in ideal observers with crowding in humans. We derived an ideal-observer model for target identification under conditions of position and identity uncertainty. Simulations showed that this model reproduces the hallmark of crowding, namely a critical spacing that scales with viewing eccentricity. To examine how well the model fits quantitatively to human data, we performed three experiments. In Experiments 1 and 2, we measured observers' perceptual uncertainty about stimulus positions and identities, respectively, for a target in isolation. In Experiment 3, observers identified a target that was flanked by two distractors. We found that about half of the errors in Experiment 3 could be accounted for by the perceptual uncertainty measured in Experiments 1 and 2. The remainder of the errors could be accounted for by assuming that uncertainty (i.e., the width of internal noise distribution) about stimulus positions and identities depends on flanker proximity. Our results provide a mathematical restatement of the crowding problem and support the hypothesis that crowding behavior is a sign of optimality rather than a perceptual defect.
Cohort Crowding and Nonresident College Enrollment
Winters, John V.
2012-01-01
This study uses a fixed effects panel data framework to examine the effects of cohort crowding and other variables on nonresident enrollment at four-year public colleges and universities. The results suggest that larger cohorts of resident students crowd out nonresident students at flagship universities, but there is inconsistent evidence of crowd…
No priming for global motion in crowding.
Pavan, Andrea; Gall, Martin G; Manassi, Mauro; Greenlee, Mark W
2015-01-01
There is psychophysical evidence that low-level priming, e.g., from oriented gratings, as well as high-level semantic priming, survives crowding. We investigated priming for global translational motion in crowded and noncrowded conditions. The results indicated that reliable motion priming occurs in the noncrowded condition, but motion priming does not survive crowding. Crowding persisted despite variations in the direction of the flankers with respect to the prime's direction. Motion priming was still absent under crowding when 85% of the flankers moved in the same direction as the prime. Crowding also persisted despite variations in the speed of the flankers relative to the prime even when the flankers' speed was four times slower than the speed of the prime. However, a priming effect was evident when the prime's spatial location was precued and its distance to the flankers increased, suggesting a release from crowding. These results suggest that transient attention induced by precueing the spatial location of the prime may improve subjects' ability to discriminate its direction. Spatial cueing could act to decrease the integration field, thereby diminishing the influence of nearby distracters. In an additional experiment in which we used fewer flankers, we found a priming effect under conditions in which the interelement distance varied between flankers and prime. Overall, the results suggest that motion priming is strongly affected by crowding, but transient attention can partially retrieve such facilitation.
How crowded is the prokaryotic cytoplasm?
Spitzer, Jan; Poolman, Bert; Ferguson, Stuart
2013-01-01
We consider biomacromolecular crowding within the cytoplasm of prokaryotic cells as a two-phase system of 'supercrowded' cytogel and 'dilute' cytosol; we simplify and quantify this model for a coccoid cell over a wide range of biomacromolecular crowding. The key result shows that the supercrowded
Socialization of Social Anxiety in Adolescent Crowds
Van Zalk, Nejra; Van Zalk, Maarten Herman Walter; Kerr, Margaret
2011-01-01
In this study, we looked at whether social anxiety is socialized, or influenced by peers' social anxiety, more in some peer crowds than others. Adolescents in crowds with eye-catching appearances such as Goths and Punks (here termed "Radical"), were compared with three comparison groups. Using data from 796 adolescents (353 girls and 443 boys; M…
Perceptions of Emergency Department Crowding in Pennsylvania
Directory of Open Access Journals (Sweden)
Pines, Jesse M
2013-02-01
Full Text Available Introduction: The state of emergency department (ED) crowding in Pennsylvania has not previously been reported. Methods: We assessed perceptions of ED crowding by surveying medical directors/chairs from Pennsylvania EDs in the spring of 2008. Results: A total of 106 completed the questionnaire (68% response rate). A total of 83% (86/104) agreed that ED crowding was a problem; 26% (27/105) reported that at least half of admitted patients boarded for more than 4 hours. Ninety-eight percent (102/104) agreed that patient satisfaction suffers during crowding and 79% (84/106) stated that quality suffers. Sixty-five percent (68/105) reported that crowding had worsened during the past 2 years. Several hospital interventions were used to alleviate crowding: expediting discharges, 81% (86/106); prioritizing ED patients for inpatient beds, 79% (84/106); and ambulance diversion, 55% (57/105). Almost all respondents who had improved ED operations reported that it had reduced crowding. Conclusion: ED crowding is a common problem in Pennsylvania and is worsening in the majority of hospitals, despite the implementation of a variety of interventions. [West J Emerg Med. 2013;14(1):1-10.]
Directory of Open Access Journals (Sweden)
François Pimont
2015-06-01
Full Text Available Leaf biomass distribution is a key factor for modeling energy and carbon fluxes in forest canopies and for assessing fire behavior. We propose a new method to estimate the 3D leaf bulk density distribution, based on a calibration of indices derived from terrestrial LiDAR (T-LiDAR). We applied the method to four contrasting plots in a mature Quercus pubescens forest. Leaf bulk densities were measured inside 0.7 m-diameter spheres, referred to as Calibration Volumes. Indices were derived from LiDAR point clouds and calibrated against the Calibration Volume bulk densities. Several indices were proposed and tested to account for noise resulting from mixed pixels and other theoretical considerations. The best index and its calibration parameter were then used to estimate leaf bulk densities at the grid nodes of each plot. These LiDAR-derived bulk density distributions were used to estimate bulk density vertical profiles and loads, which above four meters compared well with those assessed by the classical inventory-based approach. Below four meters, the LiDAR-based approach overestimated bulk densities since no distinction was made between wood and leaf returns. The results of our method are promising since they demonstrate the possibility to assess bulk density on small plots at a reasonable operational cost.
Kaye, Jason; Yang, Chao
2014-01-01
Kohn-Sham density functional theory is one of the most widely used electronic structure theories. The recently developed adaptive local basis functions form an accurate and systematically improvable basis set for solving Kohn-Sham density functional theory using discontinuous Galerkin methods, requiring a small number of basis functions per atom. In this paper we develop residual-based a posteriori error estimates for the adaptive local basis approach, which can be used to guide non-uniform basis refinement for highly inhomogeneous systems such as surfaces and large molecules. The adaptive local basis functions are non-polynomial basis functions, and standard a posteriori error estimates for $hp$-refinement using polynomial basis functions do not directly apply. We generalize the error estimates for $hp$-refinement to non-polynomial basis functions. We demonstrate the practical use of the a posteriori error estimator in performing three-dimensional Kohn-Sham density functional theory calculations for quasi-2D...
Dynamics analysis for local crowd state without foreground segmentation
Institute of Scientific and Technical Information of China (English)
朱海龙; 吴锐; 刘鹏; 唐降龙
2012-01-01
Considering that a static background model has poor adaptability and cannot be used to precisely determine the crowd state in a complex surveillance-video scene, a scheme for dynamics analysis of the local crowd state without foreground segmentation is proposed. The scheme treats local blocks in consecutive frames in the space-time domain as a linear dynamic system (LDS), and employs the mixture-of-dynamic-textures algorithm to classify them in order to estimate the crowd density; it uses a main-path tracking method to estimate the crowd velocity; and it models the LDS by partial differential equations to describe the relations between the density field, velocity field and flow quantity field. The experimental results show that the proposed scheme can perform quantitative analysis of the crowd state as well as its changing trend. The result of the state analysis can be used to accurately detect anomalous events in a crowd.
Pervasive Adaptation in Car Crowds
Ferscha, Alois; Riener, Andreas
Advances in the miniaturization and embedding of electronics for microcomputing, communication and sensor/actuator systems have fertilized the pervasion of technology into literally everything. Pervasive computing technology is particularly flourishing in the automotive domain, with the "smart car", embodying intelligent control mechanics, intelligent driver assistance, safety and comfort systems, navigation, tolling, fleet management and car-to-car interaction systems, standing as one of the outstanding success stories of pervasive computing. This paper raises the issue of the socio-technical phenomena emerging from the reciprocal interrelationship between drivers and smart cars, particularly in car crowds. A driver-vehicle co-model (DVC-model) is proposed, expressing the complex interactions between the human driver and the in-car and on-car technologies. Both explicit (steering, shifting, overtaking) and implicit (body posture, respiration) interactions are considered, and related to the driver's vital state (attentive, fatigued, distracted, aggressive). DVC-models are considered as building blocks in large-scale simulation experiments, aiming to analyze and understand adaptation phenomena rooted in the feedback loops among individual driver behavior and car crowds.
Crowd and environmental management during mass gatherings.
Johansson, Anders; Batty, Michael; Hayashi, Konrad; Al Bar, Osama; Marcozzi, David; Memish, Ziad A
2012-02-01
Crowds are a feature of large cities, occurring not only at mass gatherings but also at routine events such as the journey to work. To address extreme crowding, various computer models for crowd movement have been developed in the past decade, and we review these and show how they can be used to identify health and safety issues. State-of-the-art models that simulate the spread of epidemics operate on a population level, but the collection of fine-scale data might enable the development of models for epidemics that operate on a microscopic scale, similar to models for crowd movement. We provide an example of such simulations, showing how an individual-based crowd model can mirror aggregate susceptible-infected-recovered models that have been the main models for epidemics so far.
X-Ray Methods to Estimate Breast Density Content in Breast Tissue
Maraghechi, Borna
This work focuses on analyzing x-ray methods to estimate the fat and fibroglandular contents in breast biopsies and in breasts. The knowledge of fat in the biopsies could aid in their wide-angle x-ray scatter analyses. A higher mammographic density (fibrous content) in breasts is an indicator of higher cancer risk. Simulations for 5 mm thick breast biopsies composed of fibrous, cancer, and fat and for 4.2 cm thick breast fat/fibrous phantoms were done. Data from experimental studies using plastic biopsies were analyzed. The 5 mm diameter 5 mm thick plastic samples consisted of layers of polycarbonate (lexan), polymethyl methacrylate (PMMA-lucite) and polyethylene (polyet). In terms of the total linear attenuation coefficients, lexan ≡ fibrous, lucite ≡ cancer and polyet ≡ fat. The detectors were of two types, photon counting (CdTe) and energy integrating (CCD). For biopsies, three photon counting methods were performed to estimate the fat (polyet) using simulation and experimental data, respectively. The two basis function method that assumed the biopsies were composed of two materials, fat and a 50:50 mixture of fibrous (lexan) and cancer (lucite) appears to be the most promising method. Discrepancies were observed between the results obtained via simulation and experiment. Potential causes are the spectrum and the attenuation coefficient values used for simulations. An energy integrating method was compared to the two basis function method using experimental and simulation data. A slight advantage was observed for photon counting whereas both detectors gave similar results for the 4.2 cm thick breast phantom simulations. The percentage of fibrous within a 9 cm diameter circular phantom of fibrous/fat tissue was estimated via a fan beam geometry simulation. Both methods yielded good results. Computed tomography (CT) images of the circular phantom were obtained using both detector types. The radon transforms were estimated via four energy integrating
Optical Density Analysis of X-Rays Utilizing Calibration Tooling to Estimate Thickness of Parts
Grau, David
2012-01-01
This process is designed to estimate the thickness change of a material through data analysis of a digitized version of an x-ray (or a digital x-ray) containing the material (with the thickness in question) and various tooling. Using this process, it is possible to estimate a material's thickness change in a region of the material or part that is thinner than the rest of the reference thickness. The same principle can also be used to determine thickening from a thinner reference region, or to develop contour plots of an entire part. Proper tooling must be used. An x-ray film with an S-shaped characteristic curve, or a digital x-ray device producing like characteristics, is necessary. A film with linear characteristics would be ideal; however, at the time of this reporting, no such film is known. Machined components with known fractional thicknesses, of a like material (similar density) to the material being measured, are necessary. The machined components should have machined through-holes. For ease of use and better accuracy, the through-holes should be larger than 0.125 in. (3.2 mm). Standard components for this use are known as penetrameters or image quality indicators. Also needed is standard x-ray equipment and, if film is used in place of digital equipment, x-ray digitization equipment with proven conversion properties. Typical x-ray digitization equipment is commonly used in the medical industry and creates digital images of x-rays in DICOM format. It is recommended to scan the image in 16-bit format, although 12-bit and 8-bit resolutions are acceptable. Finally, x-ray analysis software that allows accurate digital image density calculations, such as the ImageJ freeware, is needed. The actual procedure requires the test article to be placed on the raw x-ray, ensuring the region of interest is aligned for perpendicular x-ray exposure
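The core of the procedure above is a calibration lookup: tooling steps of known thickness are imaged alongside the part, and the unknown thickness is interpolated from the measured image densities. A minimal sketch of that lookup, assuming linear interpolation between calibration points (the function name and the calibration values are illustrative, not from the report):

```python
def thickness_from_density(optical_density, calibration):
    """Linearly interpolate part thickness from a measured optical density.

    `calibration` is a list of (optical_density, thickness) pairs measured on
    tooling of known thickness (e.g. penetrameter steps) imaged on the same
    x-ray as the part. Values used below are illustrative only.
    """
    pts = sorted(calibration)  # ascending optical density
    if not pts[0][0] <= optical_density <= pts[-1][0]:
        raise ValueError("measured density outside calibrated range")
    for (d0, t0), (d1, t1) in zip(pts, pts[1:]):
        if d0 <= optical_density <= d1:
            frac = (optical_density - d0) / (d1 - d0)
            return t0 + frac * (t1 - t0)

# Thinner material passes more radiation, so film density rises as thickness drops.
calibration = [(1.2, 0.500), (1.8, 0.250), (2.4, 0.100)]  # (density, inches)
print(thickness_from_density(1.5, calibration))  # → 0.375 (midway between steps)
```

A real application would fit the film's S-shaped characteristic curve rather than interpolate linearly, but the calibration-first structure is the same.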
Parameter estimation of social forces in pedestrian dynamics models via a probabilistic method.
Corbetta, Alessandro; Muntean, Adrian; Vafayi, Kiamars
2015-04-01
Focusing on a specific crowd dynamics situation, including real life experiments and measurements, our paper targets a twofold aim: (1) we present a Bayesian probabilistic method to estimate the value and the uncertainty (in the form of a probability density function) of parameters in crowd dynamic models from the experimental data; and (2) we introduce a fitness measure for the models to classify a couple of model structures (forces) according to their fitness to the experimental data, preparing the stage for a more general model-selection and validation strategy inspired by probabilistic data analysis. Finally, we review the essential aspects of our experimental setup and measurement technique.
Brown, S; Gaston, G
1995-01-01
One of the most important databases needed for estimating emissions of carbon dioxide resulting from changes in the cover, use, and management of tropical forests is the total quantity of biomass per unit area, referred to as biomass density. Forest inventories have been shown to be valuable sources of data for estimating biomass density, but inventories for the tropics are few in number and their quality is poor. This lack of reliable data has been overcome by use of a promising approach that produces geographically referenced estimates by modeling in a geographic information system (GIS). This approach has been used to produce geographically referenced, spatial distributions of potential and actual (circa 1980) aboveground biomass density of all forests types in tropical Africa. Potential and actual biomass density estimates ranged from 33 to 412 Mg ha⁻¹ (10⁶ g ha⁻¹) and 20 to 299 Mg ha⁻¹, respectively, for very dry lowland to moist lowland forests and from 78 to 197 Mg ha⁻¹ and 37 to 105 Mg ha⁻¹, respectively, for montane-seasonal to montane-moist forests. Of the 37 countries included in this study, more than half (51%) contained forests that had less than 60% of their potential biomass. Actual biomass density for forest vegetation was lowest in Botswana, Niger, Somalia, and Zimbabwe (about 10 to 15 Mg ha⁻¹). Highest estimates for actual biomass density were found in Congo, Equatorial Guinea, Gabon, and Liberia (305 to 344 Mg ha⁻¹). Results from this research effort can contribute to reducing uncertainty in the inventory of country-level emissions by providing consistent estimates of biomass density at subnational scales that can be used with other similarly scaled databases on change in land cover and use.
Wang, Ying; Wu, Fengchang; Giesy, John P; Feng, Chenglian; Liu, Yuedan; Qin, Ning; Zhao, Yujie
2015-09-01
Due to use of different parametric models for establishing species sensitivity distributions (SSDs), comparison of water quality criteria (WQC) for metals of the same group or period in the periodic table is uncertain and results can be biased. To address this inadequacy, a new probabilistic model, based on non-parametric kernel density estimation was developed and optimal bandwidths and testing methods are proposed. Zinc (Zn), cadmium (Cd), and mercury (Hg) of group IIB of the periodic table are widespread in aquatic environments, mostly at small concentrations, but can exert detrimental effects on aquatic life and human health. With these metals as target compounds, the non-parametric kernel density estimation method and several conventional parametric density estimation methods were used to derive acute WQC of metals for protection of aquatic species in China that were compared and contrasted with WQC for other jurisdictions. HC5 values for protection of different types of species were derived for three metals by use of non-parametric kernel density estimation. The newly developed probabilistic model was superior to conventional parametric density estimations for constructing SSDs and for deriving WQC for these metals. HC5 values for the three metals were inversely proportional to atomic number, which means that the heavier atoms were more potent toxicants. The proposed method provides a novel alternative approach for developing SSDs that could have wide application prospects in deriving WQC and use in assessment of risks to ecosystems.
Markedly divergent estimates of Amazon forest carbon density from ground plots and satellites
Mitchard, Edward T A; Feldpausch, Ted R; Brienen, Roel J W; Lopez-Gonzalez, Gabriela; Monteagudo, Abel; Baker, Timothy R; Lewis, Simon L; Lloyd, Jon; Quesada, Carlos A; Gloor, Manuel; ter Steege, Hans; Meir, Patrick; Alvarez, Esteban; Araujo-Murakami, Alejandro; Aragão, Luiz E O C; Arroyo, Luzmila; Aymard, Gerardo; Banki, Olaf; Bonal, Damien; Brown, Sandra; Brown, Foster I; Cerón, Carlos E; Chama Moscoso, Victor; Chave, Jerome; Comiskey, James A; Cornejo, Fernando; Corrales Medina, Massiel; Da Costa, Lola; Costa, Flavia R C; Di Fiore, Anthony; Domingues, Tomas F; Erwin, Terry L; Frederickson, Todd; Higuchi, Niro; Honorio Coronado, Euridice N; Killeen, Tim J; Laurance, William F; Levis, Carolina; Magnusson, William E; Marimon, Beatriz S; Marimon Junior, Ben Hur; Mendoza Polo, Irina; Mishra, Piyush; Nascimento, Marcelo T; Neill, David; Núñez Vargas, Mario P; Palacios, Walter A; Parada, Alexander; Pardo Molina, Guido; Peña-Claros, Marielos; Pitman, Nigel; Peres, Carlos A; Poorter, Lourens; Prieto, Adriana; Ramirez-Angulo, Hirma; Restrepo Correa, Zorayda; Roopsind, Anand; Roucoux, Katherine H; Rudas, Agustin; Salomão, Rafael P; Schietti, Juliana; Silveira, Marcos; de Souza, Priscila F; Steininger, Marc K; Stropp, Juliana; Terborgh, John; Thomas, Raquel; Toledo, Marisol; Torres-Lezama, Armando; van Andel, Tinde R; van der Heijden, Geertje M F; Vieira, Ima C G; Vieira, Simone; Vilanova-Torre, Emilio; Vos, Vincent A; Wang, Ophelia; Zartman, Charles E; Malhi, Yadvinder; Phillips, Oliver L
2014-01-01
Aim The accurate mapping of forest carbon stocks is essential for understanding the global carbon cycle, for assessing emissions from deforestation, and for rational land-use planning. Remote sensing (RS) is currently the key tool for this purpose, but RS does not estimate vegetation biomass directly, and thus may miss significant spatial variations in forest structure. We test the stated accuracy of pantropical carbon maps using a large independent field dataset. Location Tropical forests of the Amazon basin. The permanent archive of the field plot data can be accessed at: http://dx.doi.org/10.5521/FORESTPLOTS.NET/2014_1 Methods Two recent pantropical RS maps of vegetation carbon are compared to a unique ground-plot dataset, involving tree measurements in 413 large inventory plots located in nine countries. The RS maps were compared directly to field plots, and kriging of the field data was used to allow area-based comparisons. Results The two RS carbon maps fail to capture the main gradient in Amazon forest carbon detected using 413 ground plots, from the densely wooded tall forests of the north-east, to the light-wooded, shorter forests of the south-west. The differences between plots and RS maps far exceed the uncertainties given in these studies, with whole regions over- or under-estimated by > 25%, whereas regional uncertainties for the maps were reported to be < 5%. Main conclusions Pantropical biomass maps are widely used by governments and by projects aiming to reduce deforestation using carbon offsets, but may have significant regional biases. Carbon-mapping techniques must be revised to account for the known ecological variation in tree wood density and allometry to create maps suitable for carbon accounting. The use of single relationships between tree canopy height and above-ground biomass inevitably yields large, spatially correlated errors. This presents a significant challenge to both the forest conservation and remote sensing communities
Directory of Open Access Journals (Sweden)
Yu Xu
2016-06-01
Estimates of abundance or density are essential for wildlife management and conservation. There are few effective density estimates for the Buff-throated Partridge Tetraophasis szechenyii, a rare and elusive high-mountain Galliform species endemic to western China. In this study, we used the temporary-emigration N-mixture model to estimate the density of this species, with data acquired from playback point-count surveys around a sacred area, protected under indigenous Tibetan culture, in Yajiang County, Sichuan, China, during April-June 2009. At 84 points of 125-m radius, we recorded 53 partridge groups over three repeat visits. The best model indicated that detection probability was described by covariates of vegetation cover type, week of visit, time of day, and weather, with weak effects, and that a partridge group was present during a sampling period with a constant probability. The abundance component was accounted for by vegetation association. Abundance was substantially higher in rhododendron shrubs, fir-larch forests, mixed spruce-larch-birch forests, and especially oak thickets than in pine forests. The model predicted a density of 5.14 groups/km², which is similar to an estimate of 4.7-5.3 groups/km² quantified via an intensive spot-mapping effort. The post-hoc estimate of individual density was 14.44 individuals/km², based on the estimated mean group size of 2.81. We suggest that the method we employed is applicable to estimating densities of Buff-throated Partridges over large areas. Given the importance of a mosaic habitat for this species, local logging should be regulated. Although the sacred conservation area had no effect on the abundance of Buff-throated Partridges, we suggest regulations linking the sacred-mountain conservation areas with the official conservation system because of the strong local participation in land conservation facilitated by sacred mountains.
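The post-hoc step reported above is simple arithmetic: individual density is the estimated group density multiplied by the estimated mean group size. A one-line sketch using the values reported in the abstract:

```python
def individual_density(groups_per_km2, mean_group_size):
    """Convert a group-density estimate into an individual-density estimate."""
    return groups_per_km2 * mean_group_size

# Reported values: 5.14 groups/km² and a mean group size of 2.81 individuals.
print(round(individual_density(5.14, 2.81), 2))  # → 14.44 individuals/km²
```

Note that the uncertainty of such a post-hoc estimate combines the uncertainties of both inputs, which the point value alone does not convey.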
Avenues for crowd science in Hydrology.
Koch, Julian; Stisen, Simon
2016-04-01
Crowd science describes research conducted with the participation of the general public (the crowd), offering the opportunity to involve the crowd in research design, data collection and analysis. In various fields, scientists have already drawn on underused human resources to advance research at low cost, with high transparency and broad public acceptance due to the bottom-up structure and the participatory process. Within the hydrological sciences, crowd research has quite recently become more established in the form of crowd observatories that generate hydrological data on water quality, precipitation or river flow. These innovative observatories complement more traditional ways of monitoring hydrological data and strengthen community-based environmental decision making. However, the full potential of crowd science lies in internet-based participation of the crowd, and it is not yet fully exploited in the field of Hydrology. New avenues have to emerge that are not primarily based on the outsourcing of labor but instead capitalize on the full range of human capabilities. In many realms of complex problem solving, such as image detection, optimization tasks, and narrowing down possible solutions, humans still remain more effective than computer algorithms. The most successful online crowd science projects, Foldit and Galaxy Zoo, have proven that a collective of tens of thousands of users can clearly outperform traditional computer-based science approaches. Our study takes advantage of well-trained human perception to conduct a spatial sensitivity analysis of land-surface variables of a distributed hydrological model, to identify the most sensitive spatial inputs. True spatial performance metrics that quantitatively compare patterns are not trivial to choose, and their applicability is often not universal. Humans, on the other hand, can quickly integrate spatial information at various scales and are therefore a trusted competence. We selected
Directory of Open Access Journals (Sweden)
Niels Halama
BACKGROUND: Determining the correct number of positive immune cells in immunohistological sections of colorectal cancer and other tumor entities is emerging as an important clinical predictor and therapy selector for the individual patient. This task is usually obstructed by cell conglomerates of various sizes. We show here that, at least in colorectal cancer, the inclusion of immune cell conglomerates is indispensable for obtaining reliable patient cell counts. Integrating virtual microscopy and image processing in principle allows the high-throughput evaluation of complete tissue slides. METHODOLOGY/PRINCIPAL FINDINGS: For such large-scale systems we demonstrate a robust quantitative image processing algorithm for the reproducible quantification of cell conglomerates, applied to CD3-positive T cells in colorectal cancer. While isolated cells (28 to 80 μm²) are counted directly, the number of cells contained in a conglomerate is estimated by dividing the area of the conglomerate in thin tissue sections (≤6 μm) by the median area covered by an isolated T cell, which we determined to be 58 μm². We applied our algorithm to large numbers of CD3-positive T cell conglomerates and compared the results to cell counts obtained manually by two independent observers. Especially for high cell counts, manual counting showed a deviation of up to 400 cells/mm² (41% variation), whereas algorithm-determined T cell numbers generally lay between the manually observed counts, with perfect reproducibility. CONCLUSION: In summary, we recommend our approach as an objective and robust strategy for quantifying immune cell densities in immunohistological sections, one that can be directly implemented into automated full-slide image processing systems.
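The counting rule described in the abstract (isolated cells in the 28-80 μm² range counted directly, conglomerates divided by the 58 μm² median isolated-cell area) can be sketched as follows. The thresholds come from the abstract; the decision to ignore regions below the isolated-cell range is an assumption of this sketch, not stated in the paper:

```python
def estimate_t_cells(areas_um2, isolated_range=(28.0, 80.0), median_cell_area=58.0):
    """Estimate the T-cell count from segmented region areas (µm²)."""
    lo, hi = isolated_range
    total = 0
    for area in areas_um2:
        if area < lo:
            continue  # below isolated-cell size: treated as artifact (assumption)
        elif area <= hi:
            total += 1  # isolated cell, counted directly
        else:
            # conglomerate: estimate cell count as area / median isolated-cell area
            total += round(area / median_cell_area)
    return total

# Two isolated cells, one conglomerate of ~10 cells, one sub-threshold speck:
print(estimate_t_cells([30, 75, 580, 10]))  # → 12
```

Dividing the total count by the analyzed tissue area then yields the density in cells/mm² that the paper compares against manual counts.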
Estimating basic wood density and its uncertainty for Pinus densiflora in the Republic of Korea
Directory of Open Access Journals (Sweden)
Jung Kee Pyo
2012-06-01
According to the Intergovernmental Panel on Climate Change (IPCC) guidelines, uncertainty assessment is an important aspect of a greenhouse gas inventory, and effort should be made to incorporate it into the reporting. The goal of this study was to estimate basic wood density (BWD) and its uncertainty for Pinus densiflora (Siebold & Zucc.) in Korea. P. densiflora forests throughout the country were divided into two regional variants: the Gangwon region variant, distributed in the northeastern part of the country, and the central region variant. A total of 36 representative sampling plots were selected across both regions to collect trees for destructive sampling. The trees were selected considering the distributions of tree age and diameter at breast height. Hypothesis testing was carried out on BWD differences between two age groups (over 20 years and 20 years or less) and between the two regions. The tests suggested no statistically significant difference between the two age classes, but strong evidence of a statistically significant difference between the regions. The BWD and its uncertainty were 0.418 g/cm³ and 11.9% for the Gangwon region, and 0.471 g/cm³ and 3.8% for the central region. As a result, the estimated BWD for P. densiflora was more precise than the value provided by the IPCC guidelines.
Crowd-induced random vibration of footbridge and vibration control using multiple tuned mass dampers
Li, Quan; Fan, Jiansheng; Nie, Jianguo; Li, Quanwang; Chen, Yu
2010-09-01
This paper investigates vibration characteristics of footbridges induced by random crowd walking, and presents the application of multiple tuned mass dampers (MTMD) in suppressing crowd-induced vibration. A single-foot force model for the vertical component of the walking-induced force is developed, avoiding the phase-angle inaccessibility of the continuous walking force. Based on the single-foot force model, a crowd-footbridge random vibration model, in which pedestrians are modeled as a crowd flow characterized by the average time headway, is developed to consider the worst vibration state of the footbridge. In this random vibration model, an analytic formulation is developed to calculate the acceleration power spectral density at an arbitrary position of a footbridge with an arbitrary span layout. A resonant effect is observed when the footbridge natural frequencies fall within the frequency bandwidth of the crowd excitation. To suppress the excessive acceleration beyond human walking comfort, an MTMD system is used to improve the footbridge dynamic characteristics. Based on the random vibration model, an optimization procedure that minimizes the maximum root-mean-square (rms) acceleration of the footbridge is introduced to determine the optimal design parameters of the MTMD system. Numerical analysis shows that the proposed MTMD, designed by the random optimization procedure, is more effective than traditional MTMD design methodology in reducing the dynamic response during crowd-footbridge resonance, and that proper enlargement of the frequency spacing effectively reduces the off-tuning effect of the MTMD.
Institute of Scientific and Technical Information of China (English)
张凌; 常加峰; 张炜; 李颖颖; 钱金平; 徐国盛; 丁斯晔; 高伟; 吴振伟; 陈颖杰; 黄娟; 刘晓菊; 臧庆
2011-01-01
In this work, population coefficients of hydrogen's n = 3 excited state from the hydrogen collisional-radiative (CR) model, taken from the data file of DEGAS 2, are used to calculate the photon emissivity coefficients (PECs) of hydrogen Balmer-α (n = 3 → n = 2) (Hα). The results are compared with the PECs from the Atomic Data and Analysis Structure (ADAS) database, and good agreement is found. A magnetic-surface-averaged neutral density profile of a typical double-null (DN) plasma in EAST is obtained by using FRANTIC, the 1.5-D fluid transport code. It is found that the sum of the integrated Dα and Hα emission intensities calculated via the neutral density agrees with the measured results obtained by using the absolutely calibrated multi-channel poloidal photodiode array systems viewing the lower divertor at the last closed flux surface (LCFS). It is revealed that the typical magnetic-surface-averaged neutral density at the LCFS is about 3.5×10¹⁶ m⁻³.
Semivariogram models for estimating fig fly population density throughout the year
Directory of Open Access Journals (Sweden)
Mauricio Paulo Batistella Pasini
2014-07-01
The objective of this work was to select semivariogram models to estimate the population density of the fig fly (Zaprionus indianus; Diptera: Drosophilidae) throughout the year, using ordinary kriging. Nineteen monitoring sites were demarcated in an area of 8,200 m² cropped with six fruit tree species: persimmon, citrus, fig, guava, apple, and peach. Over a 24-month period, 106 weekly evaluations were done at these sites. The average number of adult fig flies captured weekly per trap during each month was subjected to the circular, spherical, pentaspherical, exponential, Gaussian, rational quadratic, hole effect, K-Bessel, J-Bessel, and stable semivariogram models, using ordinary kriging interpolation. The models with the best fit were selected by cross-validation. Each data set (month) has a particular spatial dependence structure, which makes it necessary to define specific semivariogram models in order to improve the fit to the experimental semivariogram. Therefore, it was not possible to determine a standard semivariogram model; instead, six theoretical models were selected: circular, Gaussian, hole effect, K-Bessel, J-Bessel, and stable.
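As a sketch of how one of the selected models is fitted to an experimental semivariogram, the following evaluates a Gaussian semivariogram and chooses nugget, partial sill, and range by a coarse grid search over the sum of squared errors. The data and parameter grids are invented for illustration; the study itself used kriging software with cross-validation for model selection:

```python
import math

def gaussian_semivariogram(h, nugget, psill, rng):
    """Gaussian model: γ(h) = nugget + partial sill · (1 − exp(−3h²/a²))."""
    return nugget + psill * (1.0 - math.exp(-3.0 * (h / rng) ** 2))

def fit_gaussian(h_obs, g_obs, nuggets, psills, ranges):
    """Grid-search the (nugget, partial sill, range) triple minimizing SSE."""
    best, best_sse = None, float("inf")
    for c0 in nuggets:
        for c in psills:
            for a in ranges:
                sse = sum((gaussian_semivariogram(h, c0, c, a) - g) ** 2
                          for h, g in zip(h_obs, g_obs))
                if sse < best_sse:
                    best, best_sse = (c0, c, a), sse
    return best

# Synthetic experimental semivariogram generated from known parameters:
h_obs = [10, 20, 30, 40, 60, 80]
g_obs = [gaussian_semivariogram(h, 0.1, 1.0, 50) for h in h_obs]
print(fit_gaussian(h_obs, g_obs, [0.0, 0.1, 0.2], [0.5, 1.0, 1.5], [25, 50, 75]))
# → (0.1, 1.0, 50)
```

In practice the candidate models (circular, hole effect, K-Bessel, etc.) would each be fitted this way and compared by cross-validation, as the paper does.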
Chan, Poh Yin; Tong, Chi Ming; Durrant, Marcus C
2011-09-01
An empirical method for estimation of the boiling points of organic molecules based on density functional theory (DFT) calculations with polarized continuum model (PCM) solvent corrections has been developed. The boiling points are calculated as the sum of three contributions. The first term is calculated directly from the structural formula of the molecule, and is related to its effective surface area. The second is a measure of the electronic interactions between molecules, based on the DFT-PCM solvation energy, and the third is employed only for planar aromatic molecules. The method is applicable to a very diverse range of organic molecules, with normal boiling points in the range of -50 to 500 °C, and includes ten different elements (C, H, Br, Cl, F, N, O, P, S and Si). Plots of observed versus calculated boiling points gave R²=0.980 for a training set of 317 molecules, and R²=0.979 for a test set of 74 molecules. The role of intramolecular hydrogen bonding in lowering the boiling points of certain molecules is quantitatively discussed.
Can Hip Fracture Prediction in Women be Estimated beyond Bone Mineral Density Measurement Alone?
Geusens, Piet; van Geel, Tineke; van den Bergh, Joop
2010-01-01
The etiology of hip fractures is multifactorial and includes bone- and fall-related factors. Low bone mineral density (BMD), along with BMD-related and BMD-independent geometric components of bone strength evaluated by hip strength analysis (HSA) and finite element analyses of dual-energy X-ray absorptiometry (DXA) images, and ultrasound parameters, are related to the presence and incidence of hip fracture. In addition, clinical risk factors contribute to the risk of hip fractures, independent of BMD. They are included in the fracture risk assessment tool (FRAX) case-finding algorithm to estimate, for the individual patient, the 10-year risk of hip fracture, with and without BMD. Fall risks are not included in FRAX, but are included in other case-finding tools, such as the Garvan algorithm, to predict the 5- and 10-year hip fracture risk. Hormones, cytokines, growth factors, markers of bone resorption and genetic background have been related to hip fracture risk. Vitamin D deficiency is endemic worldwide, and low serum levels of 25-hydroxyvitamin D [25(OH)D] predict hip fracture risk. In the context of hip fracture prevention, calculation of absolute fracture risk using clinical risks, BMD, bone geometry and fall-related risks is feasible, but needs further refinement by integrating bone- and fall-related risk factors into a single case-finding algorithm for clinical use. PMID:22870438
Measuring and Modeling Fault Density for Plume-Fault Encounter Probability Estimation
Energy Technology Data Exchange (ETDEWEB)
Jordan, P.D.; Oldenburg, C.M.; Nicot, J.-P.
2011-05-15
Emission of carbon dioxide from fossil-fueled power generation stations contributes to global climate change. Storage of this carbon dioxide within the pores of geologic strata (geologic carbon storage) is one approach to mitigating the climate change that would otherwise occur. The large storage volume needed for this mitigation requires injection into brine-filled pore space in reservoir strata overlain by cap rocks. One of the main concerns of storage in such rocks is leakage via faults. In the early stages of site selection, site-specific fault coverages are often not available. This necessitates a method for using available fault data to develop an estimate of the likelihood of injected carbon dioxide encountering and migrating up a fault, primarily due to buoyancy. Fault population statistics provide one of the main inputs for calculating the encounter probability. Previous fault population statistics work is shown to be applicable to areal fault density statistics. This result is applied to a case study in the southern portion of the San Joaquin Basin, which found that a carbon dioxide plume from a previously planned injection had a 3% chance of encountering a fully seal-offsetting fault.
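To make the link between areal fault density and encounter probability concrete, here is a deliberately simplified sketch that treats fault locations as a homogeneous Poisson process over the plume footprint. This is an illustration of the general idea only, not the paper's fault-population-statistics treatment, and the numbers are invented:

```python
import math

def encounter_probability(faults_per_km2, plume_area_km2):
    """P(plume footprint intersects at least one fault), assuming fault
    locations follow a homogeneous Poisson process with areal density λ:
    P = 1 − exp(−λ·A). Simplified illustration only."""
    return 1.0 - math.exp(-faults_per_km2 * plume_area_km2)

# e.g. 0.01 faults/km² over a 3 km² plume footprint:
print(round(100 * encounter_probability(0.01, 3.0), 1), "%")
```

A realistic calculation would also weight faults by length and throw (whether they fully offset the seal), which is where the fault population statistics enter.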
Directory of Open Access Journals (Sweden)
W. Chen
2015-11-01
Drought has caused the most widespread damage in China, accounting for over 50% of the total affected area nationwide in recent decades. In this paper, a Standardized Precipitation Index (SPI) based drought risk study is conducted using historical rainfall data from 19 weather stations in Shandong province, China. A kernel-density-based method is adopted to carry out the risk analysis. A comparison between bivariate Gaussian kernel density estimation (GKDE) and diffusion kernel density estimation (DKDE) is carried out to analyze the effects of drought intensity and drought duration. The results show that DKDE is relatively more accurate, without boundary leakage. Combined with GIS techniques, the drought risk is presented, revealing the spatial and temporal variation of agricultural droughts for corn in Shandong. The estimation provides a different way to study the occurrence frequency and severity of drought risk from multiple perspectives.
Generalized Consistency for Kernel Density Estimation
Institute of Scientific and Technical Information of China (English)
王敏; 李开灿
2015-01-01
We study the consistency of kernel density estimation for independent samples. Under the Pearson-χ² distance and the Kullback-Leibler distance, we define generalized consistency for kernel density estimators and obtain several kinds of generalized consistency of the kernel density estimator.
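To make the object whose consistency is being analyzed concrete, here is a minimal Gaussian kernel density estimator; consistency means f̂ₙ(x) converges to the true density f(x) as the sample grows and the bandwidth shrinks appropriately. The bandwidth and sample below are illustrative choices, not from the paper:

```python
import math
import random

def kde(sample, x, h):
    """Kernel density estimate f̂(x) = (1/nh) Σᵢ K((x − Xᵢ)/h), Gaussian kernel K."""
    k = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)
    return sum(k((x - xi) / h) for xi in sample) / (len(sample) * h)

# Consistency in action: for a large N(0,1) sample, f̂(0) approaches the
# true density f(0) = 1/√(2π) ≈ 0.3989.
random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
print(abs(kde(sample, 0.0, 0.3) - 0.3989) < 0.05)  # → True
```

The Pearson-χ² and Kullback-Leibler distances studied in the paper measure the discrepancy between f̂ₙ and f globally, rather than pointwise as in this demonstration.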
Directory of Open Access Journals (Sweden)
Andrew J Hearn
The marbled cat Pardofelis marmorata is a poorly known wild cat that has a broad distribution across much of the Indomalayan ecorealm. This felid is thought to exist at low population densities throughout its range, yet no estimates of its abundance exist, hampering assessment of its conservation status. To investigate the distribution and abundance of marbled cats we conducted intensive, felid-focused camera trap surveys of eight forest areas and two oil palm plantations in Sabah, Malaysian Borneo. Study sites were broadly representative of the range of habitat types and the gradient of anthropogenic disturbance and fragmentation present in contemporary Sabah. We recorded marbled cats in all forest study areas apart from a small, relatively isolated forest patch, although photographic detection frequency varied greatly between areas. No marbled cats were recorded within the plantations, but a single individual was recorded walking along the forest/plantation boundary. We collected sufficient numbers of marbled cat photographic captures at three study areas to permit density estimation based on spatially explicit capture-recapture analyses. Estimates of population density from the primary lowland Danum Valley Conservation Area and the primary upland Tawau Hills Park were 19.57 (SD: 8.36) and 7.10 (SD: 1.90) individuals per 100 km2, respectively, and the selectively logged lowland Tabin Wildlife Reserve yielded an estimated density of 10.45 (SD: 3.38) individuals per 100 km2. The low detection frequencies recorded in our other survey sites and in published studies elsewhere in its range, together with the absence of previous density estimates for this felid, suggest that our density estimates may be from the higher end of its abundance spectrum. We provide recommendations for future marbled cat survey approaches.
The Behavioral Effects of Crowding: Definitions and Methods.
Dean, Larry M.; And Others
Crews of 18 U.S. Navy combat vessels rated their living and working conditions aboard ship, including degree of crowding. In order to better understand the behavioral effects of crowding, three different types of measures, corresponding to different definitions of crowding, were constructed. These separate crowding measures correlated uniquely…
An Energy based Method to Measure the Crowd Safety
Yin, H.; Li, D.; Zheng, X.
2014-01-01
How to evaluate crowd safety in crowded areas is a tough but important problem. According to accident-causing theory, uncontrolled release of hazardous energy among overcrowded pedestrians is the basic cause of crowd disaster. Therefore, crowd energy is modeled in this paper, which takes both
Directory of Open Access Journals (Sweden)
E. Carpenter
2014-06-01
Conservation and management of bats requires reliable and repeatable data regarding the size and patterns of variation in size of bat colonies. Counts and densities calculated via photography have proven more accurate and repeatable than visual counts and ocular estimates. Unfortunately, the potential of photography to investigate the size of a bat colony and roost density has rarely been explored. In the summer of 2006, a colony of Geoffroy's Rousette Fruit Bat, Rousettus amplexicaudatus, was photo-documented in the Monfort Bat Cave, in the Island Garden City of Samal, Davao del Norte, Mindanao, Philippines. We selected 39 images to develop roost density estimates. Mean (±SE) roosting density was 403±167.1 bats/m2 and 452.3±168.8 bats/m2 on the walls and ceiling of the cave, respectively; densities were not significantly different from each other (P=0.38). Based on these standardized data, we estimate that the initial 100 m of the cave contained 883,526 bats. Ultimately, this photographic technique can be used to develop a statistical approach which involves repeatable estimates of colony size for Geoffroy's Rousette Fruit Bats at Monfort Cave and will enhance ongoing monitoring activities throughout this species' range.
An SV-GMR Needle Sensor-Based Estimation of Volume Density of Magnetic Fluid inside Human Body
Directory of Open Access Journals (Sweden)
C. P. Gooneratne
2008-01-01
A spin-valve giant magneto-resistive (SV-GMR) sensor of needle-type configuration is reported to estimate the volume density of magnetic fluid inside the human body. The magnetic fluid is usually injected into the human body to kill cancerous cells using hyperthermia-based treatment. To control the heat treatment, accurate knowledge of the temperature is essential. The SV-GMR-based needle-type sensor is used to measure the magnetic flux density of the magnetic fluid inside the human body, from which the temperature is estimated. The needle-type sensor provides a semi-invasive approach to temperature determination.
Karanth, K.Ullas; Chundawat, Raghunandan S.; Nichols, James D.; Kumar, N. Samba
2004-01-01
Tropical dry-deciduous forests comprise more than 45% of the tiger (Panthera tigris) habitat in India. However, in the absence of rigorously derived estimates of ecological densities of tigers in dry forests, critical baseline data for managing tiger populations are lacking. In this study tiger densities were estimated using photographic capture–recapture sampling in the dry forests of Panna Tiger Reserve in Central India. Over a 45-day survey period, 60 camera trap sites were sampled in a well-protected part of the 542-km2 reserve during 2002. A total sampling effort of 914 camera-trap-days yielded photo-captures of 11 individual tigers over 15 sampling occasions that effectively covered a 418-km2 area. The closed capture–recapture model Mh, which incorporates individual heterogeneity in capture probabilities, fitted these photographic capture history data well. The estimated capture probability/sample, p̂= 0.04, resulted in an estimated tiger population size and standard error (N̂(SÊN̂)) of 29 (9.65), and a density (D̂(SÊD̂)) of 6.94 (3.23) tigers/100 km2. The estimated tiger density matched predictions based on prey abundance. Our results suggest that, if managed appropriately, the available dry forest habitat in India has the potential to support a population size of about 9000 wild tigers.
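The density figure in the tiger abstract above follows directly from the estimated population size and the effectively sampled area. A minimal sketch of that arithmetic; note that the reported SE of density (3.23) is larger than a naive SE(N)/A rescaling would give, because the effective area itself carries uncertainty, so only the point estimate is reproduced here.

```python
def density_per_100km2(n_hat, area_km2):
    """Point estimate of density (individuals per 100 km2): D = N-hat / A."""
    return 100.0 * n_hat / area_km2

# Tiger example from the abstract: N-hat = 29 over 418 km2.
d = density_per_100km2(29, 418)
print(round(d, 2))  # 6.94, matching the reported 6.94 tigers/100 km2
```

The same arithmetic applies to other capture-recapture entries in this listing, e.g. the ocelot study (10 individuals over 17.71 km2, expressed per 5 km2).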
Trolle, M.; Kery, M.
2003-01-01
Neotropical felids such as the ocelot (Leopardus pardalis) are secretive, and it is difficult to estimate their populations using conventional methods such as radiotelemetry or sign surveys. We show that recognition of individual ocelots from camera-trapping photographs is possible, and we use camera-trapping results combined with closed population capture-recapture models to estimate density of ocelots in the Brazilian Pantanal. We estimated the area from which animals were camera trapped at 17.71 km2. A model with constant capture probability yielded an estimate of 10 independent ocelots in our study area, which translates to a density of 2.82 independent individuals for every 5 km2 (SE 1.00).
Field-based crowd simulation%基于场的人群运动仿真
Institute of Scientific and Technical Information of China (English)
赵欣欣; 张勇; 孔德慧; 尹宝才
2013-01-01
Crowd simulation is widely used in industry, architecture, transportation, and many other fields. To implement real-time crowd simulation in complex environments, efficiency is the pivotal problem to resolve; the main challenges include rendering large crowds, updating the crowd's locations and states in real time, and collision avoidance. We propose a field-based approach to implement real-time crowd simulation. This approach guides crowd movement by constructing a navigation field and a density field. The navigation field makes the crowd choose the optimal feasible path to reach the desired destinations. The density field affects the velocity of the crowd to help avoid collisions, combined with a GPU-based collision avoidance method. Using our approach, we constructed a real-time crowd simulation system and tested its performance in a large venue with thousands of agents, successfully simulating the evacuation of crowds with excellent rendering quality and high efficiency.
Oyang, Yen-Jen; Hwang, Shien-Ching; Ou, Yu-Yen; Chen, Chien-Yu; Chen, Zhi-Wei
2005-01-01
This paper presents a novel learning algorithm for efficient construction of the radial basis function (RBF) networks that can deliver the same level of accuracy as the support vector machines (SVMs) in data classification applications. The proposed learning algorithm works by constructing one RBF subnetwork to approximate the probability density function of each class of objects in the training data set. With respect to algorithm design, the main distinction of the proposed learning algorithm is the novel kernel density estimation algorithm that features an average time complexity of O(n log n), where n is the number of samples in the training data set. One important advantage of the proposed learning algorithm, in comparison with the SVM, is that the proposed learning algorithm generally takes far less time to construct a data classifier with an optimized parameter setting. This feature is of significance for many contemporary applications, in particular, for those applications in which new objects are continuously added into an already large database. Another desirable feature of the proposed learning algorithm is that the RBF networks constructed are capable of carrying out data classification with more than two classes of objects in one single run. In other words, unlike with the SVM, there is no need to resort to mechanisms such as one-against-one or one-against-all for handling datasets with more than two classes of objects. The comparison with the SVM is of particular interest, because it has been shown in a number of recent studies that SVMs generally are able to deliver higher classification accuracy than other existing data classification algorithms. As the proposed learning algorithm is instance-based, the data reduction issue is also addressed in this paper. One interesting observation in this regard is that, for all three data sets used in data reduction experiments, the number of training samples remaining after a naive data reduction mechanism is
A novel crowd flow model based on linear fractional stable motion
Wei, Juan; Zhang, Hong; Wu, Zhenya; He, Junlin; Guo, Yangyong
2016-03-01
For evacuation dynamics in indoor spaces, a novel crowd flow model is put forward based on linear fractional stable motion. Based on position attraction and queuing time, a calculation formula for movement probability is defined, with queuing time modeled by linear fractional stable motion. Finally, an experiment and simulation platform is used for performance analysis, studying in depth the relations among system evacuation time, crowd density and exit flow rate. It is concluded that evacuation time and exit flow rate are positively correlated with crowd density, and that once the exit width reaches a threshold value, further increasing it does not effectively decrease the evacuation time.
Directory of Open Access Journals (Sweden)
Wu Chi-Yeh
2010-01-01
Background MicroRNAs (miRNAs) are short non-coding RNA molecules, which play an important role in post-transcriptional regulation of gene expression. There have been many efforts to discover miRNA precursors (pre-miRNAs) over the years. Recently, ab initio approaches have attracted more attention because they do not depend on homology information and provide broader applications than comparative approaches. Kernel based classifiers such as the support vector machine (SVM) are extensively adopted in these ab initio approaches due to the prediction performance they achieve. On the other hand, logic based classifiers such as the decision tree, whose constructed model is interpretable, have attracted less attention. Results This article reports the design of a predictor of pre-miRNAs built on a novel kernel based classifier, the generalized Gaussian density estimator (G2DE) based classifier. The G2DE is a kernel based algorithm designed to provide interpretability by utilizing a few but representative kernels for constructing the classification model. The performance of the proposed predictor has been evaluated with 692 human pre-miRNAs and has been compared with two kernel based and two logic based classifiers. The experimental results show that the proposed predictor is capable of achieving prediction performance comparable to that delivered by the prevailing kernel based classification algorithms, while providing the user with an overall picture of the distribution of the data set. Conclusion Software predictors that identify pre-miRNAs in genomic sequences have been exploited by biologists to facilitate molecular biology research in recent years. The G2DE employed in this study can deliver prediction accuracy comparable with the state-of-the-art kernel based machine learning algorithms. Furthermore, biologists can obtain valuable insights about the different characteristics of the sequences of pre-miRNAs with the models generated by the G2DE.
Kernel Density Estimation, Kernel Methods, and Fast Learning in Large Data Sets.
Wang, Shitong; Wang, Jun; Chung, Fu-lai
2014-01-01
Kernel methods such as the standard support vector machine and support vector regression trainings take O(N³) time and O(N²) space complexities in their naive implementations, where N is the training set size. It is thus computationally infeasible to apply them to large data sets, and a replacement of the naive method for finding the quadratic programming (QP) solutions is highly desirable. By observing that many kernel methods can be linked up with the kernel density estimate (KDE), which can be efficiently implemented by some approximation techniques, a new learning method called fast KDE (FastKDE) is proposed to scale up kernel methods. It is based on establishing a connection between KDE and the QP problems formulated for kernel methods using an entropy-based integrated-squared-error criterion. As a result, FastKDE approximation methods can be applied to solve these QP problems. In this paper, the latest advance in fast data reduction via KDE is exploited. With just a simple sampling strategy, the resulting FastKDE method can be used to scale up various kernel methods with a theoretical guarantee that their performance does not degrade significantly. It has a time complexity of O(m³), where m is the number of the data points sampled from the training set. Experiments on different benchmarking data sets demonstrate that the proposed method has comparable performance with the state-of-the-art method and is effective for a wide range of kernel methods to achieve fast learning in large data sets.
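The core idea behind the sampling strategy, fit the density on m points sampled from the training set rather than all N, can be sketched as follows. The Gaussian kernel, bandwidth, and synthetic data below are illustrative assumptions; this is not the paper's entropy-based construction.

```python
import math
import random

def kde(data, h, x):
    """1-D Gaussian kernel density estimate evaluated at x."""
    c = 1.0 / (len(data) * h * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data)

random.seed(0)
full = [random.gauss(0.0, 1.0) for _ in range(5000)]  # all N points
sub = random.sample(full, 200)                        # m << N points

# The subsampled estimate tracks the full one at a fraction of the cost:
# each evaluation is O(m) instead of O(N).
for x in (-1.0, 0.0, 1.0):
    print(x, round(kde(full, 0.3, x), 3), round(kde(sub, 0.3, x), 3))
```

The paper's contribution is showing that QP problems for kernel methods can be rewritten in KDE form so that exactly this kind of reduction applies with a performance guarantee.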
Directory of Open Access Journals (Sweden)
Md Nabiul Islam Khan
In the Point-Centred Quarter Method (PCQM), the mean distance of the first nearest plants in each quadrant of a number of random sample points is converted to plant density. It is a quick method for plant density estimation. In recent publications the estimator equations of simple PCQM (PCQM1) and higher order ones (PCQM2 and PCQM3, which use the distance of the second and third nearest plants, respectively) show discrepancy. This study attempts to review PCQM estimators in order to find the most accurate equation form. We tested the accuracy of different PCQM equations using Monte Carlo simulations in simulated plant populations (having 'random', 'aggregated' and 'regular' spatial patterns) and empirical ones. PCQM requires at least 50 sample points to ensure a desired level of accuracy. PCQM with a corrected estimator is more accurate than with a previously published estimator. The published PCQM versions (PCQM1, PCQM2 and PCQM3) show significant differences in accuracy of density estimation, i.e. the higher order PCQM provides higher accuracy. However, the corrected PCQM versions show no significant differences among them as tested in various spatial patterns except in plant assemblages with a strong repulsion (plant competition). If N is the number of sample points and R is distance, the corrected estimator of PCQM1 is 4(4N − 1)/(π ∑R²), not 12N/(π ∑R²); of PCQM2 it is 4(8N − 1)/(π ∑R²), not 28N/(π ∑R²); and of PCQM3 it is 4(12N − 1)/(π ∑R²), not 44N/(π ∑R²) as published. If the spatial pattern of a plant association is random, PCQM1 with a corrected equation estimator and over 50 sample points would be sufficient to provide accurate density estimation. PCQM using just the nearest tree in each quadrant is therefore sufficient, which facilitates sampling of trees, particularly in areas with just a few hundred trees per hectare. PCQM3 provides the best density estimations for all types of plant assemblages including the repulsion process.
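The corrected first-order estimator stated in this abstract can be applied directly. A minimal sketch; the distance values below are made up for illustration.

```python
import math

def pcqm1_density(distances):
    """Corrected PCQM1 density estimator: 4(4N - 1) / (pi * sum(R^2)),
    where N is the number of sample points and `distances` holds the 4N
    nearest-plant distances R (one per quadrant per sample point)."""
    n_points = len(distances) // 4
    return 4.0 * (4.0 * n_points - 1.0) / (math.pi * sum(r * r for r in distances))

# 2 sample points x 4 quadrants = 8 hypothetical distances (metres);
# the abstract recommends at least 50 sample points in practice.
r = [1.0, 1.2, 0.8, 1.1, 0.9, 1.3, 1.0, 0.7]
print(pcqm1_density(r))  # plants per square metre
```

Note that the previously published estimator 12N/(π ∑R²) would give a systematically different value on the same distances, which is the discrepancy the study resolves.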
Hall, S. A.; Burke, I.C.; Box, D. O.; Kaufmann, M. R.; Stoker, Jason M.
2005-01-01
The ponderosa pine forests of the Colorado Front Range, USA, have historically been subjected to wildfires. Recent large burns have increased public interest in fire behavior and effects, and scientific interest in the carbon consequences of wildfires. Remote sensing techniques can provide spatially explicit estimates of stand structural characteristics. Some of these characteristics can be used as inputs to fire behavior models, increasing our understanding of the effect of fuels on fire behavior. Others provide estimates of carbon stocks, allowing us to quantify the carbon consequences of fire. Our objective was to use discrete-return lidar to estimate such variables, including stand height, total aboveground biomass, foliage biomass, basal area, tree density, canopy base height and canopy bulk density. We developed 39 metrics from the lidar data, and used them in limited combinations in regression models, which we fit to field estimates of the stand structural variables. We used an information-theoretic approach to select the best model for each variable, and to select the subset of lidar metrics with most predictive potential. Observed versus predicted values of stand structure variables were highly correlated, with r² ranging from 57% to 87%. The most parsimonious linear models for the biomass structure variables, based on a restricted dataset, explained between 35% and 58% of the observed variability. Our results provide us with useful estimates of stand height, total aboveground biomass, foliage biomass and basal area. There is promise for using this sensor to estimate tree density, canopy base height and canopy bulk density, though more research is needed to generate robust relationships. We selected 14 lidar metrics that showed the most potential as predictors of stand structure. We suggest that the focus of future lidar studies should broaden to include low density forests, particularly systems where the vertical structure of the canopy is important.
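The regression step described here, fitting lidar metrics to field estimates and reporting r², can be sketched for a single metric with a closed-form least-squares fit. The data below are fabricated for illustration; the study used 39 metrics in multiple-regression combinations.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b, r_squared)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical lidar height metric (x) vs field-measured stand height (y), metres.
x = [5.0, 8.0, 11.0, 14.0, 17.0]
y = [6.1, 9.1, 11.9, 15.2, 17.8]
a, b, r2 = fit_line(x, y)
print(round(b, 2), round(r2, 3))
```

An information-theoretic criterion such as AIC, as used in the study, would then compare competing metric subsets by trading goodness of fit against the number of parameters.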
NIH Abroad: Pictures Are Crowd Pullers
Art, culture, and the Internet combine to intervene against malaria in Uganda. NLM's ... Services Division collaborated on the project through the Internet. "We wanted to see if such a 'health ...
Towards a South African crowd control model
CSIR Research Space (South Africa)
Modise, M
2013-03-01
With the escalating number of service delivery and labour-related protests, and the increasingly violent nature of these protests, crowd control is one of the major challenges facing South Africa today. Often these protests are characterized...
Ruderman, Michael A; Wilson, Deirdra F; Reid, Savanna
2015-01-01
This administrative data-linkage cohort study examines the association between prison crowding and the rate of post-release parole violations in a random sample of prisoners released with parole conditions in California, for an observation period of two years (January 2003 through December 2004). Crowding overextends prison resources needed to adequately protect inmates and provide drug rehabilitation services. Violence and lack of access to treatment are known risk factors for drug use and substance use disorders. These and other psychosocial effects of crowding may lead to higher rates of recidivism in California parolees. Rates of parole violation for parolees exposed to high and medium levels of prison crowding were compared to parolees with low prison crowding exposure. Hazard ratios (HRs) with 95% confidence intervals (CIs) were estimated using a Cox model for recurrent events. Our dataset included 13,070 parolees in California, combining individual-level parolee data with aggregate-level crowding data for multilevel analysis. Comparing parolees exposed to high crowding with those exposed to low crowding, the effect sizes from greatest to least were absconding violations (HR 3.56, 95% CI: 3.05-4.17), drug violations (HR 2.44, 95% CI: 2.00-2.98), non-violent violations (HR 2.14, 95% CI: 1.73-2.64), violent and serious violations (HR 1.88, 95% CI: 1.45-2.43), and technical violations (HR 1.86, 95% CI: 1.37-2.53). Prison crowding predicted higher rates of parole violations after release from prison. The effect was magnitude-dependent and particularly strong for drug charges. Further research into whether adverse prison experiences, such as crowding, are associated with recidivism and drug use in particular may be warranted.
Spatio-temporal properties of letter crowding
Chung, Susana T. L.
2016-01-01
Crowding between adjacent letters has been investigated primarily as a spatial effect. The purpose of this study was to investigate the spatio-temporal properties of letter crowding. Specifically, we examined the systematic changes in the degradation effects in letter identification performance when adjacent letters were presented with a temporal asynchrony, as a function of letter separation and between the fovea and the periphery. We measured proportion-correct performance for identifying the middle target letter in strings of three lowercase letters at the fovea and 10° in the inferior visual field, for a range of center-to-center letter separations and a range of stimulus onset asynchronies (SOA) between the target and flanking letters (positive SOAs: target preceded flankers). As expected, the accuracy for identifying the target letters reduces with decreases in letter separation. This crowding effect shows a strong dependency on SOAs, such that crowding is maximal between 0 and ∼100 ms (depending on conditions) and diminishes for larger SOAs (positive or negative). Maximal crowding does not require the target and flanking letters to physically coexist for the entire presentation duration. Most importantly, crowding can be minimized even for closely spaced letters if there is a large temporal asynchrony between the target and flankers. The reliance of letter identification performance on SOAs and how it changes with letter separations imply that the crowding effect can be traded between space and time. Our findings are consistent with the notion that crowding should be considered as a spatio-temporal, and not simply a spatial, effect. PMID:27088895
Crowd Behavior Algorithm Development for COMBAT XXI
2017-05-30
Aims to reduce time to scenario development for CXXI scenario integrators. The report is organized into literature review, analysis, results, and ... TRAC-M-TR-17-027, 30 May 2017. TRADOC Analysis Center, 700 Dyer Road, Monterey, California 93943-0692. Authors: LTC Casey Connors, Dr. Steven Hall, Dr. Imre Balogh, Terry Norbraten.
Pan, Guangming; Zhou, Wang
2010-01-01
A consistent kernel estimator of the limiting spectral distribution of general sample covariance matrices was introduced in Jing, Pan, Shao and Zhou (2010). The central limit theorem of the kernel estimator is proved in this paper.
Categorical membership modulates crowding: evidence from characters.
Reuther, Josephine; Chakravarthi, Ramakrishna
2014-10-16
Visual crowding is generally thought to affect recognition mostly or only at the level of feature combination. Calling this assertion into question, recent studies have shown that if a target object and its flankers belong to different categories, crowding is weaker than if they belong to the same category. Nevertheless, these results can be explained in terms of featural differences between categories. The current study tests whether category-level (i.e., high-level) interference in crowding occurs when featural differences are controlled for. First, replicating previous results, we found lower critical spacing for targets and flankers belonging to different categories. Second, we observed the same, albeit weaker, category-specific effect when objects in both categories had the exact same feature set, suggesting that category-specific effects persist even when featural differences are fully controlled for. Third, we manipulated the semantic content of the flankers while keeping their feature set constant, by using upright or rotated objects, and found that meaning modulated crowding. An exclusively feature-based account of crowding would predict no differences due to such changes in meaning. We conclude that crowding results from not only the well-documented feature-level interactions but also additional interactions at a level where objects are grouped by meaning.
Crowding in the S-cone pathway.
Coates, Daniel R; Chung, Susana T L
2016-05-01
The spatial extent of interference from nearby objects or contours (the critical spacing of "crowding") has been thoroughly characterized across the visual field, typically using high contrast achromatic stimuli. However, attempts to link this measure with known properties of physiological pathways have been inconclusive. The S-cone pathway, with its ease of psychophysical isolation and known anatomical characteristics, offers a unique tool to gain additional insights into crowding. In this study, we measured the spatial extent of crowding in the S-cone pathway at several retinal locations using a chromatic adaptation paradigm. S-cone crowding was evident and extensive, but its spatial extent changed less markedly as a function of retinal eccentricity than the extent found using traditional achromatic stimuli. However, the spatial extent agreed with that of low contrast achromatic stimuli matched for isolated resolvability. This suggests that common cortical mechanisms mediate the crowding effect in the S-cone and achromatic pathways, but contrast is an important factor. The low contrast of S-cone stimuli makes S-cone vision more acuity-limited than crowding-limited.
Acuity, crowding, reading and fixation stability.
Falkenberg, Helle K; Rubin, Gary S; Bex, Peter J
2007-01-01
People with age-related macular disease frequently experience reading difficulty that could be attributed to poor acuity, elevated crowding or unstable fixation associated with peripheral visual field dependence. We examine how the size, location, spacing and instability of retinal images affect the visibility of letters and words at different eccentricities. Fixation instability was simulated in normally sighted observers by randomly jittering single or crowded letters or words along a circular arc of fixed eccentricity. Visual performance was assessed at different levels of instability with forced choice measurements of acuity, crowding and reading speed in a rapid serial visual presentation paradigm. In the periphery: (1) acuity declined; (2) crowding increased for acuity- and eccentricity-corrected targets; and (3), the rate of reading fell with acuity-, crowding- and eccentricity-corrected targets. Acuity and crowding were unaffected by even high levels of image instability. However, reading speed decreased with image instability, even though the visibility of the component letters was unaffected. The results show that reading performance cannot be standardised across the visual field by correcting the size, spacing and eccentricity of letters or words. The results suggest that unstable fixation may contribute to reading difficulties in people with low vision and therefore that rehabilitation may benefit from fixation training.
Maadooliat, Mehdi
2015-10-21
This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.
Plug-in error bounds for a mixing density estimate in $R^d,$ and for its derivatives
Yatracos, Yannis G.
2015-01-01
A mixture density, $f_p,$ is estimable in $R^d, \ d \ge 1,$ but an estimate for the mixing density, $p,$ is usually obtained only when $d$ is unity; $h$ is the mixture's kernel. When $f_p$'s estimate has form $f_{\hat p_n}$ and $p$ is $\tilde q$-smooth, vanishing outside a compact in $R^d,$ plug-in upper bounds are obtained herein for the $L_u$-error (and risk) of $\hat p_n$ and its derivatives; $d \ge 1, \ 1 \le u \le \infty.$ The bounds depend on $f_{\hat p_n}$'s $L_u$-error (or risk), $h$'s F...
Directory of Open Access Journals (Sweden)
Park Jinho
2012-06-01
Full Text Available Abstract Background Myocardial ischemia can develop into more serious diseases. Detecting the ischemic syndrome in the electrocardiogram (ECG) early, accurately, and automatically can prevent it from developing into a catastrophic disease. To this end, we propose a new method, which employs wavelets and simple feature selection. Methods For training and testing, the European ST-T database is used, which comprises 367 ischemic ST episodes in 90 records. We first remove baseline wandering, and detect time positions of QRS complexes by a method based on the discrete wavelet transform. Next, for each heart beat, we extract three features which can be used for differentiating ST episodes from normal: (1) the area between QRS offset and T-peak points, (2) the normalized and signed sum from QRS offset to effective zero voltage point, and (3) the slope from QRS onset to offset point. We average the feature values over five successive beats to reduce effects of outliers. Finally we apply classifiers to those features. Results We evaluated the algorithm with kernel density estimation (KDE) and support vector machine (SVM) classifiers. Sensitivity and specificity for KDE were 0.939 and 0.912, respectively. The KDE classifier detects 349 ischemic ST episodes out of a total of 367. Sensitivity and specificity of SVM were 0.941 and 0.923, respectively. The SVM classifier detects 355 ischemic ST episodes. Conclusions We proposed a new method for detecting ischemia in ECG. It combines signal-processing techniques (removing baseline wandering and detecting time positions of QRS complexes by the discrete wavelet transform) with explicit feature extraction from the morphology of ECG waveforms. It was shown that the number of selected features was sufficient to discriminate ischemic ST episodes from normal ones. We also showed how the proposed KDE classifier can automatically select kernel bandwidths, meaning that the algorithm does not require any numerical
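The classification step above can be sketched as a class-conditional KDE classifier. This is a minimal illustration, not the paper's implementation: the feature vectors and cluster locations below are synthetic placeholders, not the European ST-T data. Note that `scipy.stats.gaussian_kde` selects its bandwidth automatically (Scott's rule by default), mirroring the abstract's point that no bandwidth needs hand-tuning.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic stand-ins for the three ECG features (area, signed sum, slope);
# shape is (n_features, n_samples) as gaussian_kde expects.
rng = np.random.default_rng(0)
normal_feats = rng.normal(loc=0.0, scale=1.0, size=(3, 200))
ischemic_feats = rng.normal(loc=4.0, scale=1.0, size=(3, 200))

# One KDE per class; bandwidths are chosen automatically.
kde_normal = gaussian_kde(normal_feats)
kde_ischemic = gaussian_kde(ischemic_feats)

def classify(x):
    """Label a feature vector by whichever class-conditional density is higher."""
    x = np.asarray(x, dtype=float).reshape(3, -1)
    return np.where(kde_ischemic(x) > kde_normal(x), "ischemic", "normal")

print(classify([4.1, 3.9, 4.2])[0])   # lands near the ischemic cluster
```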
Adaptively Learning the Crowd Kernel
Tamuz, Omer; Belongie, Serge; Shamir, Ohad; Kalai, Adam Tauman
2011-01-01
We introduce an algorithm that, given n objects, learns a similarity matrix over all n^2 pairs, from crowdsourced data alone. The algorithm samples responses to adaptively chosen triplet-based relative-similarity queries. Each query has the form "is object 'a' more similar to 'b' or to 'c'?" and is chosen to be maximally informative given the preceding responses. The output is an embedding of the objects into Euclidean space (like MDS); we refer to this as the "crowd kernel." The runtime (empirically observed to be linear) and cost (about $0.15 per object) of the algorithm are small enough to permit its application to databases of thousands of objects. The distance matrix provided by the algorithm allows for the development of an intuitive and powerful sequential, interactive search algorithm which we demonstrate for a variety of visual stimuli. We present quantitative results that demonstrate the benefit in cost and time of our approach compared to a nonadaptive approach. We also show the ability of our appr...
Calabia, Andres; Jin, Shuanggen
2017-02-01
The thermospheric mass density variations and the thermosphere-ionosphere coupling during geomagnetic storms are not clear due to lack of observables and large uncertainty in the models. Although accelerometers on-board Low-Earth-Orbit (LEO) satellites can measure non-gravitational accelerations and derive thermospheric mass density variations with unprecedented detail, their measurements are not always available (e.g., for the March 2013 geomagnetic storm). In order to cover accelerometer data gaps of the Gravity Recovery and Climate Experiment (GRACE), we estimate thermospheric mass densities from numerical derivation of GRACE determined precise orbit ephemeris (POE) for the period 2011-2016. Our results show good correlation with accelerometer-based mass densities, and a better estimation than the NRLMSISE00 empirical model. Furthermore, we statistically analyze the differences to accelerometer-based densities, and study the March 2013 geomagnetic storm response. The thermospheric density enhancements at the polar regions on 17 March 2013 are clearly represented by POE-based measurements. Although our results show density variations better correlate with Dst and k-derived geomagnetic indices, the auroral electrojet activity index AE as well as the merging electric field Em show better agreement at high latitude for the March 2013 geomagnetic storm. In contrast, low-latitude variations are better represented with the Dst index. With the increasing resolution and accuracy of Precise Orbit Determination (POD) products and LEO satellites, the straightforward technique of determining non-gravitational accelerations and thermospheric mass densities through numerical differentiation of POE promises potentially good applications for the upper atmosphere research community.
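The core numerical step — twice-differentiating ephemeris positions to recover acceleration — can be sketched with a central difference. This is a hedged toy example on a 1-D synthetic trajectory, not the paper's pipeline: real processing would work on 3-D POE positions and subtract modeled gravitational accelerations to isolate the non-gravitational part.

```python
import numpy as np

dt = 10.0                                   # ephemeris sampling interval [s]
t = np.arange(0.0, 600.0, dt)

# Toy "orbit" coordinate with a constant acceleration of -2e-3 m/s^2.
pos = 7.0e6 + 7.5e3 * t - 0.5 * 2e-3 * t**2

# Second-order central difference: a_i ~ (x_{i+1} - 2 x_i + x_{i-1}) / dt^2
acc = (pos[2:] - 2.0 * pos[1:-1] + pos[:-2]) / dt**2

print(acc[:3])   # each entry is close to -2e-3 for this quadratic trajectory
```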
Innovative funding solution for special projects: Crowd funding
Directory of Open Access Journals (Sweden)
Sentot Imam Wahjono
2015-06-01
Full Text Available The aim of this paper is to examine the influence of crowd funding knowledge, application, platform, and project initiator on successful crowd funding. This study used a quantitative approach: data were collected via web-based questionnaires, sent by Kickstarter.com direct message and e-mail to a sample of 200 successful crowd funding project initiators; 152 questionnaires were returned with complete answers and analyzed further. Deployment and data collection took three months, from October to December 2013. This study found evidence that crowd funding knowledge, crowd funding application, crowd funding platform, and project initiator have a positive and significant relationship with the success of crowd funding. The implication of this research is that crowd funding can be a source of capital for financing projects, rather than relying solely on traditional sources of financing such as banking and capital markets. Crowd funding can be an innovative funding solution.
Altered immunity in crowded locust reduced fungal (Metarhizium anisopliae) pathogenesis.
Wang, Yundan; Yang, Pengcheng; Cui, Feng; Kang, Le
2013-01-01
The stress of living conditions, like infection, alters animal immunity. High population density is empirically considered to induce prophylactic immunity that reduces infection risk, a view challenged by a model of low connectivity between infectious and susceptible individuals in crowded animals. The migratory locust, which exhibits polyphenism through gregarious and solitary phases in response to population density and displays different resistance to the fungal biopesticide (Metarhizium anisopliae), was used to observe the prophylactic immunity of crowded animals. We applied an RNA-sequencing assay to investigate differential expression in fat body samples of gregarious and solitary locusts before and after infection. Solitary locusts devoted at least twice as many genes to combating M. anisopliae infection as gregarious locusts. The transcription of immune molecules such as pattern recognition proteins, protease inhibitors, and anti-oxidation proteins was increased in the prophylactic immunity of gregarious locusts. The differentially expressed transcripts reducing gregarious locust susceptibility to M. anisopliae were confirmed at the transcriptional and translational level. Further investigation revealed that locust GNBP3 was susceptible to proteolysis while GNBP1, induced by M. anisopliae infection, resisted proteolysis. Silencing of gnbp3 by RNAi significantly shortened the life span of gregarious locusts but not solitary locusts. By contrast, gnbp1 silencing did not affect the life span of either gregarious or solitary locusts after M. anisopliae infection. Thus, the GNBP3-dependent immune responses were involved in the phenotypic resistance of gregarious locusts to fungal infection, but were redundant in solitary locusts. Our results indicated that gregarious locusts prophylactically activated upstream modulators of immune cascades rather than downstream effectors, preferring to quarantine rather than eliminate pathogens to conserve energy.
CrowdAidRepair: A Crowd-Aided Interactive Data Repairing Method
Zhou, Jian
2016-03-25
Data repairing aims at discovering and correcting erroneous data in databases. Traditional methods relying on predefined quality rules to detect conflicts between data may fail to choose the right way to fix a detected conflict. Recent efforts turn to the power of the crowd in data repairing, but crowd power has its own drawbacks, such as high human-intervention cost and low efficiency. In this paper, we propose a crowd-aided interactive data repairing method which takes the advantages of both the rule-based and the crowd-based methods. In particular, we investigate the interaction between crowd-based repairing and rule-based repairing, and show that by applying crowd-based repairing to a small portion of values, we can greatly improve the repairing quality of the rule-based method. Although we prove that the optimal interaction scheme, which uses the fewest crowd-repaired values to maximize imputation recall, is infeasible to compute, our proposed solution still identifies an efficient scheme by investigating the inconsistencies and the dependencies between values in the repairing process. Our empirical study on three data collections demonstrates the high repairing quality of CrowdAidRepair, as well as the efficiency of the generated interaction scheme over baselines.
De Marco, Stefano
2011-01-01
We study smoothness of densities for the solutions of SDEs whose coefficients are smooth and nondegenerate only on an open domain $D$. We prove that a smooth density exists on $D$ and give upper bounds for this density. Under some additional conditions (mainly dealing with the growth of the coefficients and their derivatives), we formulate upper bounds that are suitable to obtain asymptotic estimates of the density for large values of the state variable ("tail" estimates). These results specify and extend some results by Kusuoka and Stroock [J. Fac. Sci. Univ. Tokyo Sect. IA Math. 32 (1985) 1--76], but our approach is substantially different and based on a technique to estimate the Fourier transform inspired from Fournier [Electron. J. Probab. 13 (2008) 135--156] and Bally [Integration by parts formula for locally smooth laws and applications to equations with jumps I (2007) The Royal Swedish Academy of Sciences]. This study is motivated by existing models for financial securities which rely on SDEs with non-...
Some Numerical Aspects on Crowd Motion - The Hughes Model
Gomes, Diogo A.
2016-01-06
Here, we study a crowd model proposed by R. Hughes in [5] and describe a numerical approach to solve it. This model comprises a Fokker-Planck equation coupled with an Eikonal equation with Dirichlet or Neumann data. First, we establish a priori estimates for the solution. Second, we study radial solutions and identify a shock formation mechanism. Third, we illustrate the existence of congestion, the breakdown of the model, and the trend to equilibrium. Finally, we propose a new numerical method and consider two numerical examples.
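For orientation, a commonly cited form of the Hughes pedestrian-flow system couples a continuity equation for the density with an Eikonal equation for the walking potential (the exact variant analyzed in the paper may differ, e.g. through the Fokker-Planck/viscous regularization mentioned above):

```latex
\begin{align}
\rho_t - \operatorname{div}\!\left(\rho\, f(\rho)^2 \,\nabla u\right) &= 0, \\
|\nabla u| &= \frac{1}{f(\rho)}, \qquad f(\rho) = 1 - \rho,
\end{align}
```

where $\rho$ is the crowd density and $u$ the potential whose gradient steers pedestrians toward their target along the locally fastest route.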
Further characterization of the influence of crowding on medication errors
Directory of Open Access Journals (Sweden)
Hannah Watts
2013-01-01
Full Text Available Study Objectives: Our prior analysis suggested that error frequency increases disproportionately with Emergency Department (ED) crowding. To further characterize this, we measured the association while controlling for the number of charts reviewed and the presence of ambulance diversion status. We hypothesized that errors would occur significantly more frequently as crowding increased, even after controlling for higher patient volumes. Materials and Methods: We performed a prospective, observational study in a large, community hospital ED from May to October of 2009. Our ED has full-time pharmacists who review orders of patients to help identify errors before they cause harm. Research volunteers shadowed our ED pharmacists over discrete 4-hour time periods during their reviews of orders on patients in the ED. The total numbers of charts reviewed and errors identified were documented along with details for each error type, severity, and category. We then measured the correlation between error rate (number of errors divided by total number of charts reviewed) and ED occupancy rate while controlling for diversion status during the observational period. We estimated a sample size requirement of at least 45 errors identified to allow detection of an effect size of 0.6 based on our historical data. Results: During 324 hours of surveillance, 1171 charts were reviewed and 87 errors were identified. Median error rate per 4-hour block was 5.8% of charts reviewed (IQR 0-13). No significant change was seen with ED occupancy rate (Spearman's rho = -.08, P = .49). Median error rate during times on ambulance diversion was almost twice as large (11%, IQR 0-17), but this rate did not reach statistical significance in univariate or multivariate analysis. Conclusions: Error frequency appears to remain relatively constant across the range of crowding in our ED when controlling for patient volume via the quantity of orders reviewed. Error quantity therefore increases
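The reported association test — Spearman's rank correlation between per-block error rate and occupancy — can be sketched as follows. The data below are synthetic placeholders, not the study's observations; the point is only to show the shape of the computation.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# One row per 4-hour observation block (synthetic).
occupancy = rng.uniform(0.5, 1.5, size=80)      # ED occupancy rate
errors = rng.poisson(0.3, size=80)              # errors identified in the block
charts = rng.integers(1, 10, size=80)           # charts reviewed in the block
error_rate = errors / charts                    # errors per chart reviewed

# Rank correlation between error rate and occupancy.
rho, p_value = spearmanr(occupancy, error_rate)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.2f}")
```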
Directory of Open Access Journals (Sweden)
Siu-Fung Lau
2011-05-01
Full Text Available Upright flanking faces have stronger detrimental effects on the recognition of an upright target face than inverted flanking faces. One possible explanation for this “flanker-inversion effect” was that the more holistically processed upright flanking faces allowed for more erroneous feature integration. Alternatively, crowding was known to be stronger when target and flankers were more similar. Here we investigate the flanker-inversion effect on crowding in Chinese character identification. Five normally-sighted young adults participated. Targets of size 1.2° were presented at 5° in the lower visual field. Four flankers with center-to-center distance of 1.8° were presented in the crowded condition. Three types of flankers were used: upright Chinese, inverted Chinese, and upright Korean characters. The identification contrast thresholds were estimated by QUEST and crowding strength was measured through threshold elevation (TE). Crowding on the upright Chinese target was significantly stronger with inverted Chinese flankers (TE = 1.59±0.32) than with upright Chinese flankers (TE = 1.47±0.29). No inversion effect was observed for the inverted Chinese target. Korean flankers produced similar crowding as upright Chinese flankers. Our results go against the similarity rule, which predicts that upright Chinese flankers would produce stronger crowding for an upright Chinese target. Holistic processing preferred for inverted Chinese characters may account for the findings.
Després-Einspenner, Marie-Lyne; Howe, Eric J; Drapeau, Pierre; Kühl, Hjalmar S
2017-03-07
Empirical validations of survey methods for estimating animal densities are rare, despite the fact that only an application to a population of known density can demonstrate their reliability under field conditions and constraints. Here, we present a field validation of camera trapping in combination with spatially explicit capture-recapture (SECR) methods for enumerating chimpanzee populations. We used 83 camera traps to sample a habituated community of western chimpanzees (Pan troglodytes verus) of known community and territory size in Taï National Park, Ivory Coast, and estimated community size and density using spatially explicit capture-recapture models. We aimed to: (1) validate camera trapping as a means to collect capture-recapture data for chimpanzees; (2) validate SECR methods to estimate chimpanzee density from camera trap data; (3) compare the efficacy of targeting locations frequently visited by chimpanzees versus deploying cameras according to a systematic design; (4) evaluate the performance of SECR estimators with reduced sampling effort; and (5) identify sources of heterogeneity in detection probabilities. Ten months of camera trapping provided abundant capture-recapture data. All weaned individuals were detected, most of them multiple times, at both an array of targeted locations, and a systematic grid of cameras positioned randomly within the study area, though detection probabilities were higher at targeted locations. SECR abundance estimates were accurate and precise, and analyses of subsets of the data indicated that the majority of individuals in a community could be detected with as few as five traps deployed within their territory. Our results highlight the potential of camera trapping for cost-effective monitoring of chimpanzee populations.
Analysis of the Influence of Plot Size and LiDAR Density on Forest Structure Attribute Estimates
Luis A. Ruiz; Txomin Hermosilla; Francisco Mauro; Miguel Godino
2014-01-01
Licencia Creative Commons: Attribution 3.0 Unported (CC BY 3.0) This paper assesses the combined effect of field plot size and LiDAR density on the estimation of four forest structure attributes: volume, total biomass, basal area and canopy cover. A total of 21 different plot sizes were considered, obtained by decreasing the field measured plot radius value from 25 to 5 m with regular intervals of 1 m. LiDAR data densities were simulated by randomly removing LiDAR pulses until ...
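The density-reduction step described above — simulating lower LiDAR densities by randomly discarding pulses — can be sketched with a simple random thinning of a point cloud. This is a hedged illustration with synthetic points; real data would be read from a LAS/LAZ file and thinning would typically operate on pulses rather than individual returns.

```python
import numpy as np

rng = np.random.default_rng(2)
area_m2 = 100.0 * 100.0                                  # 1-ha synthetic plot
points = rng.uniform(0.0, 100.0, size=(40_000, 3))       # (x, y, z), ~4 pts/m^2

def thin_to_density(points, target_density, area_m2, rng):
    """Randomly subsample a point cloud to a target density (points per m^2)."""
    n_keep = int(target_density * area_m2)
    idx = rng.choice(len(points), size=n_keep, replace=False)
    return points[idx]

low = thin_to_density(points, 0.5, area_m2, rng)         # simulate 0.5 pts/m^2
print(low.shape)
```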
Minh, David D L; Vaikuntanathan, Suriyanarayanan
2011-01-21
The nonequilibrium fluctuation theorems have paved the way for estimating equilibrium thermodynamic properties, such as free energy differences, using trajectories from driven nonequilibrium processes. While many statistical estimators may be derived from these identities, some are more efficient than others. It has recently been suggested that trajectories sampled using a particular time-dependent protocol for perturbing the Hamiltonian may be analyzed with another one. Choosing an analysis protocol based on the nonequilibrium density was empirically demonstrated to reduce the variance and bias of free energy estimates. Here, we present an alternate mathematical formalism for protocol postprocessing based on the Feynman-Kac theorem. The estimator that results from this formalism is demonstrated on a few low-dimensional model systems. It is found to have reduced bias compared to both the standard form of Jarzynski's equality and the previous protocol postprocessing formalism.
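The identity underlying the free-energy estimators discussed above is Jarzynski's equality:

```latex
\begin{equation}
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \,\Delta F},
\end{equation}
```

where $W$ is the work performed along a nonequilibrium trajectory, $\beta = 1/(k_B T)$, $\Delta F$ is the equilibrium free energy difference, and the average is taken over trajectories of the driven process. The protocol-postprocessing estimators in the paper modify how this average is constructed; the exact form of their estimator is not reproduced here.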