WorldWideScience

Sample records for spatial sampling design

  1. Latent spatial models and sampling design for landscape genetics

    Science.gov (United States)

    Hanks, Ephraim M.; Hooten, Mevin B.; Knick, Steven T.; Oyler-McCance, Sara J.; Fike, Jennifer A.; Cross, Todd B.; Schwartz, Michael K.

    2016-01-01

    We propose a spatially-explicit approach for modeling genetic variation across space and illustrate how this approach can be used to optimize spatial prediction and sampling design for landscape genetic data. We propose a multinomial data model for categorical microsatellite allele data commonly used in landscape genetic studies, and introduce a latent spatial random effect to allow for spatial correlation between genetic observations. We illustrate how modern dimension reduction approaches to spatial statistics can allow for efficient computation in landscape genetic statistical models covering large spatial domains. We apply our approach to propose a retrospective spatial sampling design for greater sage-grouse (Centrocercus urophasianus) population genetics in the western United States.

  2. A phoswich detector design for improved spatial sampling in PET

    Science.gov (United States)

    Thiessen, Jonathan D.; Koschan, Merry A.; Melcher, Charles L.; Meng, Fang; Schellenberg, Graham; Goertzen, Andrew L.

    2018-02-01

    Block detector designs, utilizing a pixelated scintillator array coupled to a photosensor array in a light-sharing design, are commonly used for positron emission tomography (PET) imaging applications. In practice, the spatial sampling of these designs is limited by the crystal pitch, which must be large enough for individual crystals to be resolved in the detector flood image. Replacing the conventional 2D scintillator array with an array of phoswich elements, each consisting of an optically coupled side-by-side scintillator pair, may improve spatial sampling in one direction of the array without requiring smaller crystal elements to be resolved. To test the feasibility of this design, a 4 × 4 phoswich array was constructed, with each phoswich element consisting of two optically coupled, 3.17 × 1.58 × 10 mm³ LSO crystals co-doped with cerium and calcium. The amount of calcium doping was varied to create a 'fast' LSO crystal with a decay time of 32.9 ns and a 'slow' LSO crystal with a decay time of 41.2 ns. Using a Hamamatsu R8900U-00-C12 position-sensitive photomultiplier tube (PS-PMT) and a CAEN V1720 250 MS/s waveform digitizer, we were able to show effective discrimination of the fast and slow LSO crystals in the phoswich array. Although a side-by-side phoswich array is feasible, reflections at the crystal boundary due to a mismatch between the refractive index of the optical adhesive (n = 1.5) and LSO (n = 1.82) caused it to behave optically as an 8 × 4 array rather than a 4 × 4 array. Direct coupling of each phoswich element to individual photodetector elements may be necessary with the current phoswich array design. Alternatively, in order to implement this phoswich design with a conventional light-sharing PET block detector, a high refractive index optical adhesive is necessary to closely match the refractive index of LSO.
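
A standard way to separate two scintillators with different decay times, as in the fast/slow LSO pair above, is charge comparison: the slower crystal leaves a larger fraction of its integrated pulse in the tail. Below is a minimal Python sketch; the single-exponential pulse model, sample spacing, and split point are illustrative assumptions, not the authors' actual digitizer processing — only the two decay constants come from the record.

```python
import math

def make_pulse(tau_ns, n=200, dt_ns=1.0):
    """Idealized scintillation pulse: single-exponential decay, unit amplitude."""
    return [math.exp(-i * dt_ns / tau_ns) for i in range(n)]

def tail_fraction(pulse, split):
    """Charge-comparison figure of merit: tail integral over total integral."""
    return sum(pulse[split:]) / sum(pulse)

fast = make_pulse(32.9)  # decay time of the 'fast' Ca co-doped LSO (from the record)
slow = make_pulse(41.2)  # decay time of the 'slow' LSO (from the record)

# The slower crystal keeps more charge in the tail, so a simple threshold
# on the tail fraction separates the two crystal types of each phoswich.
```

In practice the threshold would be calibrated from flood-source data rather than chosen analytically.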

  3. The variance quadtree algorithm: use for spatial sampling design

    NARCIS (Netherlands)

    Minasny, B.; McBratney, A.B.; Walvoort, D.J.J.

    2007-01-01

    Spatial sampling schemes are mainly developed to determine sampling locations that can cover the variation of environmental properties in the area of interest. Here we propose the variance quadtree algorithm for sampling in an area with prior information represented as ancillary or secondary ...

  4. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter Spöck

    2015-05-01

    Full Text Available Recently, Spöck and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spöck and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data is transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.
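
The Box-Cox transformation underlying trans-Gaussian kriging is simple to state; this sketch shows the forward and inverse maps (the λ value and the rainfall figure are invented for illustration):

```python
import math

def box_cox(z, lam):
    """Box-Cox transform; lam = 0 reduces to the log transform."""
    if lam == 0:
        return math.log(z)
    return (z ** lam - 1.0) / lam

def box_cox_inv(y, lam):
    """Inverse transform, used to map kriging predictions back to data scale."""
    if lam == 0:
        return math.exp(y)
    return (lam * y + 1.0) ** (1.0 / lam)

# e.g. a hypothetical monthly rainfall value of 84.2 mm with lam = 0.3:
y = box_cox(84.2, 0.3)   # kriging is done on this Gaussian-scale value
z = box_cox_inv(y, 0.3)  # recovers 84.2 (up to floating-point rounding)
```

Note that naively back-transforming a kriging prediction yields a median-type estimate; bias-corrected back-transformation is part of what the Smith and Zhu approach cited above addresses.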

  5. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    Science.gov (United States)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous ...
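
The method-of-moments (Matheron) estimator discussed above averages squared pair differences within each distance class. A minimal pure-Python sketch follows; the synthetic pure-nugget field and the bin edges are illustrative assumptions, not the study's data:

```python
import math
import random

def empirical_variogram(coords, values, bin_edges):
    """Matheron estimator: gamma(h) = sum (z_i - z_j)^2 / (2 N(h))
    over all point pairs whose separation falls in the bin around h."""
    sums = [0.0] * (len(bin_edges) - 1)
    counts = [0] * (len(bin_edges) - 1)
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            d = math.dist(coords[i], coords[j])
            for b in range(len(bin_edges) - 1):
                if bin_edges[b] <= d < bin_edges[b + 1]:
                    sums[b] += (values[i] - values[j]) ** 2
                    counts[b] += 1
                    break
    return [s / (2 * c) if c else float("nan")
            for s, c in zip(sums, counts)]

random.seed(1)
coords = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(150)]
values = [random.gauss(0, 1) for _ in coords]   # pure-nugget field, sill = 1
gamma = empirical_variogram(coords, values, [0, 10, 20, 30, 40, 50])
```

Robust variants (e.g. the Cressie-Hawkins estimator mentioned in the geostatistics literature) replace the squared difference with a power-transformed one to damp the large outliers this study is concerned with.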

  6. MEETING IN CHICAGO: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND ENVIRONMENTAL RISK ASSESSMENT

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  7. SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING AND RISK ASSESSMENT (SLIDE PRESENTATION)

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  8. MEETING IN CZECH REPUBLIC: SADA: A FREEWARE DECISION SUPPORT TOOL INTEGRATING GIS, SAMPLE DESIGN, SPATIAL MODELING, AND RISK ASSESSMENT

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  9. Spatial distribution, sampling precision and survey design optimisation with non-normal variables: The case of anchovy (Engraulis encrasicolus) recruitment in Spanish Mediterranean waters

    Science.gov (United States)

    Tugores, M. Pilar; Iglesias, Magdalena; Oñate, Dolores; Miquel, Joan

    2016-02-01

    In the Mediterranean Sea, the European anchovy (Engraulis encrasicolus) plays a key role in ecological and economic terms. Ensuring stock sustainability requires the provision of crucial information, such as species spatial distribution or unbiased abundance and precision estimates, so that management strategies can be defined (e.g. fishing quotas, temporal closure areas or marine protected areas, MPAs). Furthermore, the estimation of the precision of global abundance at different sampling intensities can be used for survey design optimisation. Geostatistics provides a priori unbiased estimates of the spatial structure, global abundance and precision for autocorrelated data. However, its application to non-Gaussian data introduces difficulties into the analysis, along with low robustness or biased estimates. The present study applied intrinsic geostatistics in two dimensions in order to (i) analyse the spatial distribution of anchovy in Spanish Western Mediterranean waters during the species' recruitment season, (ii) produce distribution maps, (iii) estimate global abundance and its precision, (iv) analyse the effect of changing the sampling intensity on the precision of global abundance estimates and (v) evaluate the effects of several methodological options on the robustness of all the analysed parameters. The results suggested that while the spatial structure was usually non-robust to the tested methodological options when working with the original dataset, it became more robust for the transformed datasets (especially for the log-backtransformed dataset). The global abundance was always highly robust, and the global precision was highly or moderately robust to most of the methodological options, except for data transformation.

  10. Urban Design and Spatial Equity

    DEFF Research Database (Denmark)

    Silva, Victor

    2012-01-01

    During the last century, motorized vehicles have been preponderant in the streets. However, the emergence of the debate about sustainability and its relation to the urban environment has influenced urban designers to rethink the role of the streets and their spatiality. Pedestrians and cyclists...... are gaining space not only to move to a specific destination, but also space in which to play and stay. Taking into consideration the formal structure of our cities, streets are critical to urban transformation and strategic for restructuring urban flows and the quality of urban life. This chapter aims...... transformation of a street in the core of Odense – Vestergade Vest. Firstly, this chapter presents the notion of shared-use streets – including a brief historical context and a debate about their design characteristics and their role in enhancing street life. Secondly, it presents a creative and low-budget design......

  11. Spatial-dependence recurrence sample entropy

    Science.gov (United States)

    Pham, Tuan D.; Yan, Hong

    2018-03-01

    Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
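
Plain sample entropy, which the proposed method extends with recurrence-plot spatial dependence, can be sketched as follows. The template length m, tolerance r, and the two test signals are illustrative choices, not the paper's settings:

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series x.

    B counts template pairs of length m, A those of length m + 1, matching
    within Chebyshev distance r; SampEn = -ln(A / B)."""
    n = len(x)

    def count(mm):
        total = 0
        for i in range(n - mm):
            for j in range(i + 1, n - mm):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    total += 1
        return total

    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

rng = random.Random(0)
regular = [math.sin(0.3 * i) for i in range(120)]     # smooth, predictable
irregular = [rng.uniform(-1, 1) for _ in range(120)]  # white noise
```

A predictable signal produces a lower SampEn than noise, which is the intuition the record's extension builds on.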

  12. Nonlinear Spatial Inversion Without Monte Carlo Sampling

    Science.gov (United States)

    Curtis, A.; Nawaz, A.

    2017-12-01

    High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions is described by probability distributions, and these are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high or low probability values of model parameters. However, such methods converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D Hidden Markov Model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only - an assumption referred to as `localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location, as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. The method provides direct (pseudo-analytic) estimates of posterior marginal probability distributions over each variable ...
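
In one dimension, the same idea reduces to the classic forward-backward recursion of hidden Markov models, which yields posterior marginals for every cell directly, with no Monte Carlo sampling. The two-facies transition matrix and likelihood values below are hypothetical numbers for illustration, not the paper's:

```python
def forward_backward(prior0, trans, liks):
    """Posterior marginals of a discrete-state HMM.
    liks[t][s] = likelihood of the local observation at cell t given state s."""
    n, k = len(liks), len(prior0)
    alpha = [[0.0] * k for _ in range(n)]
    beta = [[1.0] * k for _ in range(n)]
    alpha[0] = [prior0[s] * liks[0][s] for s in range(k)]
    for t in range(1, n):                       # forward pass
        for s in range(k):
            alpha[t][s] = liks[t][s] * sum(
                alpha[t - 1][r] * trans[r][s] for r in range(k))
    for t in range(n - 2, -1, -1):              # backward pass
        for s in range(k):
            beta[t][s] = sum(
                trans[s][r] * liks[t + 1][r] * beta[t + 1][r] for r in range(k))
    post = []
    for t in range(n):                          # normalize cell by cell
        w = [alpha[t][s] * beta[t][s] for s in range(k)]
        z = sum(w)
        post.append([x / z for x in w])
    return post

# Two facies (0 = shale, 1 = sand), a "sticky" spatial prior, and per-cell
# likelihoods of a seismic attribute given each facies (all hypothetical):
trans = [[0.9, 0.1], [0.1, 0.9]]
liks = [[0.8, 0.2], [0.7, 0.3], [0.4, 0.6], [0.1, 0.9], [0.2, 0.8]]
post = forward_backward([0.5, 0.5], trans, liks)
```

The paper's 2-D extension over a grid is substantially more involved, but each cell likewise ends up with a pseudo-analytic marginal rather than a sample-based histogram.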

  13. Interactions of collimation, sampling and filtering on SPECT spatial resolution

    International Nuclear Information System (INIS)

    Tsui, B.M.W.; Jaszczak, R.J.

    1984-01-01

    The major factors which affect the spatial resolution of single-photon emission computed tomography (SPECT) include collimation, sampling and filtering. A theoretical formulation is presented to describe the relationship between these factors and their effects on the projection data. Numerical calculations were made using commercially available SPECT systems and imaging parameters. The results provide an important guide for proper selection of the collimator-detector design and the imaging and reconstruction parameters to avoid unnecessary spatial resolution degradation and aliasing artifacts in the reconstructed image. In addition, this understanding will help in the fair evaluation of different SPECT systems under specific imaging conditions

  14. Sample design effects in landscape genetics

    Science.gov (United States)

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially-explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and number of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
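
The five regimes compared above differ only in how locations are placed over the landscape. Minimal generators for four of them might look like this; the extent, cluster spread, and transect orientation are invented assumptions, not the study's settings:

```python
import random

def sampling_design(regime, n, extent=100.0, seed=0):
    """Toy location generators for sampling regimes like those in the record."""
    rng = random.Random(seed)
    if regime == "random":
        return [(rng.uniform(0, extent), rng.uniform(0, extent))
                for _ in range(n)]
    if regime == "linear":                      # transect along the diagonal
        return [(extent * i / (n - 1), extent * i / (n - 1))
                for i in range(n)]
    if regime == "systematic":                  # square grid (n must be a square)
        side = round(n ** 0.5)
        step = extent / side
        return [(step * (i + 0.5), step * (j + 0.5))
                for i in range(side) for j in range(side)]
    if regime == "cluster":                     # one tight cluster of points
        cx, cy = rng.uniform(0, extent), rng.uniform(0, extent)
        clip = lambda v: min(max(v, 0.0), extent)
        return [(clip(cx + rng.gauss(0, 5)), clip(cy + rng.gauss(0, 5)))
                for _ in range(n)]
    raise ValueError(regime)
```

Plotting the four outputs side by side makes the study's finding intuitive: the cluster design simply never observes the landscape-scale resistance gradient.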

  15. An R package for spatial coverage sampling and random sampling from compact geographical strata by k-means

    NARCIS (Netherlands)

    Walvoort, D.J.J.; Brus, D.J.; Gruijter, de J.J.

    2010-01-01

    Both for mapping and for estimating spatial means of an environmental variable, the accuracy of the result will usually be increased by dispersing the sample locations so that they cover the study area as uniformly as possible. We developed a new R package for designing spatial coverage samples for ...
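
The package's spatial coverage sampling can be mimicked with plain k-means on the raster cell coordinates: Lloyd's algorithm partitions the area into compact, roughly equal-area strata, and the final centroids serve as spatially spread sampling locations. This sketch assumes a square 20 × 20 raster and is not the package's actual implementation:

```python
import math
import random

def coverage_sample(grid, k, iters=30, seed=0):
    """Lloyd's k-means on cell coordinates; centroids = coverage sample."""
    rng = random.Random(seed)
    centers = rng.sample(grid, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in grid:                      # assign each cell to nearest center
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[nearest].append(p)
        for j, cl in enumerate(clusters):   # move centers to cluster centroids
            if cl:                          # keep old center if a cluster empties
                centers[j] = (sum(x for x, _ in cl) / len(cl),
                              sum(y for _, y in cl) / len(cl))
    return centers

# 20 x 20 raster of cell midpoints; 8 spatially spread sampling locations
grid = [(x + 0.5, y + 0.5) for x in range(20) for y in range(20)]
locations = coverage_sample(grid, k=8)
```

For the "random sampling from compact strata" variant in the title, one would instead draw a random cell from each final cluster rather than taking the centroid itself.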

  16. Involving Motion Graphics in Spatial Experience Design

    DEFF Research Database (Denmark)

    Steijn, Arthur

    2013-01-01

    elements such as space, tone, color, movement, time and timing. Developing this design model has two purposes. The first is as a tool for analyzing empirical examples or cases where motion graphics is used in spatial experience design. The second is as a tool that can be used in the actual design...... process, and therefore it should be constructed as such. Since the development of the design model has this double focus, I involve design students in design laboratories related to my practice as a teacher in visual communication design and production design. I also reflect on how an initial design...

  17. [Saarland Growth Study: sampling design].

    Science.gov (United States)

    Danker-Hopfe, H; Zabransky, S

    2000-01-01

    The use of reference data to evaluate the physical development of children and adolescents is part of the daily routine in the paediatric outpatient clinic. The construction of such references is based on the collection of extensive reference data. There are different kinds of reference data: cross-sectional references, which are based on data collected from a large representative cross-sectional sample of the population; longitudinal references, which are based on follow-up surveys of usually smaller samples of individuals from birth to maturity; and mixed longitudinal references, which are a combination of longitudinal and cross-sectional reference data. The advantages and disadvantages of the different methods of data collection and the resulting reference data are discussed. The Saarland Growth Study was conducted for several reasons: growth processes are subject to secular changes, there are no specific reference data for children and adolescents from this part of the country, and the growth charts in use in paediatric practice are possibly no longer appropriate. Therefore, the Saarland Growth Study served two purposes: a) to create current regional reference data, and b) to create a database for future studies on secular trends in the growth processes of children and adolescents from Saarland. The present contribution focuses on general remarks on the sampling design of (cross-sectional) growth surveys and its inferences for the design of the present study.

  18. On Angular Sampling Methods for 3-D Spatial Channel Models

    DEFF Research Database (Denmark)

    Fan, Wei; Jämsä, Tommi; Nielsen, Jesper Ødum

    2015-01-01

    This paper discusses generating three dimensional (3D) spatial channel models with emphasis on the angular sampling methods. Three angular sampling methods, i.e. modified uniform power sampling, modified uniform angular sampling, and random pairing methods are proposed and investigated in detail....... The random pairing method, which uses only twenty sinusoids in the ray-based model for generating the channels, presents good results if the spatial channel cluster is with a small elevation angle spread. For spatial clusters with large elevation angle spreads, however, the random pairing method would fail...... and the other two methods should be considered....

  19. Spatial Mapping of Organic Carbon in Returned Samples from Mars

    Science.gov (United States)

    Siljeström, S.; Fornaro, T.; Greenwalt, D.; Steele, A.

    2018-04-01

    Mapping organic material spatially onto the minerals present in the sample will be essential for understanding the origin of any organics in returned samples from Mars. It will be shown how ToF-SIMS may be used to map organics in samples from Mars.

  20. DESIGN AND CONSTRUCTION OF A FOREST SPATIAL DATABASE: AN APPLICATION

    Directory of Open Access Journals (Sweden)

    Turan Sönmez

    2006-11-01

    Full Text Available The General Directorate of Forests (GDF) has not yet created a spatial forest database to manage its forests and catch up with the developed countries in forestry. The lack of a spatial forest database results in redundant collection of spatial data and communication problems among the forestry organizations; it also causes Turkish forestry to lag behind the informatics era. To solve these problems, the GDF should establish a spatial forest database supported by a Geographic Information System (GIS). Designing a GIS-supported spatial database, which provides accurate, timely and current data/info for decision makers and operators in forestry, and developing a sample interface program to apply and monitor classical forest management plans are paramount in the contemporary forest management planning process. This research is composed of three major stages: (i) spatial prototype database design considering the requirements of the three hierarchical organizations of the GDF (regional directorate of forests, forest enterprise, and territorial division); (ii) a user interface program developed to apply and monitor classical management plans based on the designed database; (iii) the implementation of the designed database and its user interface in the Artvin Central Planning Unit.

  1. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  2. Designing optimal sampling schemes for field visits

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-10-01

    Full Text Available This is a presentation of a statistical method for deriving optimal spatial sampling schemes. The research focuses on ground verification of minerals derived from hyperspectral data. Spectral angle mapper (SAM) and spectral feature fitting (SFF...

  3. Spatial Sampling of Weather Data for Regional Crop Yield Simulations

    Science.gov (United States)

    Van Bussel, Lenny G. J.; Ewert, Frank; Zhao, Gang; Hoffmann, Holger; Enders, Andreas; Wallach, Daniel; Asseng, Senthold; Baigorria, Guillermo A.; Basso, Bruno; Biernath, Christian

    2016-01-01

    Field-scale crop models are increasingly applied at spatio-temporal scales that range from regions to the globe and from decades up to 100 years. Sufficiently detailed data to capture the prevailing spatio-temporal heterogeneity in weather, soil, and management conditions as needed by crop models are rarely available. Effective sampling may overcome the problem of missing data but has rarely been investigated. In this study the effect of sampling weather data has been evaluated for simulating yields of winter wheat in a region in Germany over a 30-year period (1982-2011) using 12 process-based crop models. A stratified sampling was applied to compare the effect of different sizes of spatially sampled weather data (10, 30, 50, 100, 500, 1000 and full coverage of 34,078 sampling points) on simulated wheat yields. Stratified sampling was further compared with random sampling. Possible interactions between sample size and crop model were evaluated. The results showed differences in simulated yields among crop models but all models reproduced well the pattern of the stratification. Importantly, the regional mean of simulated yields based on full coverage could already be reproduced by a small sample of 10 points. This was also true for reproducing the temporal variability in simulated yields but more sampling points (about 100) were required to accurately reproduce spatial yield variability. The number of sampling points can be smaller when a stratified sampling is applied as compared to a random sampling. However, differences between crop models were observed including some interaction between the effect of sampling on simulated yields and the model used. We concluded that stratified sampling can considerably reduce the number of required simulations. But, differences between crop models must be considered as the choice for a specific model can have larger effects on simulated yields than the sampling strategy. 
Assessing the impact of sampling soil and crop management ...
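
The benefit of stratification reported above (fewer points for the same accuracy) comes from removing between-strata variance from the estimate. A toy one-dimensional comparison; the yield surface, stratum count, and sample sizes are invented for illustration:

```python
import random
import statistics

random.seed(42)
# Hypothetical "region": 1000 grid cells whose yield follows a smooth
# regional trend plus local noise (units of t/ha, invented).
yields = [4.0 + 3.0 * (i / 1000) + random.gauss(0, 0.5) for i in range(1000)]

def random_mean(n):
    """Regional mean from a simple random sample of n cells."""
    return statistics.mean(random.sample(yields, n))

def stratified_mean(n, strata=10):
    """Equal allocation over contiguous strata (n divisible by strata)."""
    size = len(yields) // strata
    per = n // strata
    picks = []
    for s in range(strata):
        picks.extend(random.sample(yields[s * size:(s + 1) * size], per))
    return statistics.mean(picks)
```

Repeating both estimators many times shows the stratified estimate scattering far less around the true regional mean, mirroring the study's finding that stratified sampling needs fewer points than random sampling.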

  4. Uncertainties in Coastal Ocean Color Products: Impacts of Spatial Sampling

    Science.gov (United States)

    Pahlevan, Nima; Sarkar, Sudipta; Franz, Bryan A.

    2016-01-01

    With increasing demands for ocean color (OC) products with improved accuracy and well characterized, per-retrieval uncertainty budgets, it is vital to decompose overall estimated errors into their primary components. Amongst various contributing elements (e.g., instrument calibration, atmospheric correction, inversion algorithms) in the uncertainty of an OC observation, less attention has been paid to uncertainties associated with spatial sampling. In this paper, we simulate MODIS (aboard both Aqua and Terra) and VIIRS OC products using 30 m resolution OC products derived from the Operational Land Imager (OLI) aboard Landsat-8, to examine impacts of spatial sampling on both cross-sensor product intercomparisons and in-situ validations of R(sub rs) products in coastal waters. Various OLI OC products representing different productivity levels and in-water spatial features were scanned for one full orbital-repeat cycle of each ocean color satellite. While some view-angle dependent differences in simulated Aqua-MODIS and VIIRS were observed, the average uncertainties (absolute) in product intercomparisons (due to differences in spatial sampling) at regional scales are found to be 1.8%, 1.9%, 2.4%, 4.3%, 2.7%, 1.8%, and 4% for the R(sub rs)(443), R(sub rs)(482), R(sub rs)(561), R(sub rs)(655), Chla, K(sub d)(482), and b(sub bp)(655) products, respectively. It is also found that, depending on in-water spatial variability and the sensor's footprint size, the errors for an in-situ validation station in coastal areas can reach as high as +/- 18%. We conclude that a) expected biases induced by the spatial sampling in product intercomparisons are mitigated when products are averaged over at least 7 km × 7 km areas, b) VIIRS observations, with improved consistency in cross-track spatial sampling, yield more precise calibration/validation statistics than that of MODIS, and c) use of a single pixel centered on in-situ coastal stations provides an optimal sampling size for

  5. Spatial distribution sampling and Monte Carlo simulation of radioactive isotopes

    CERN Document Server

    Krainer, Alexander Michael

    2015-01-01

    This work focuses on the implementation of a program for random sampling of uniformly spatially distributed isotopes for Monte Carlo particle simulations, specifically FLUKA. With FLUKA it is possible to calculate radionuclide production in high-energy fields. The decay of these nuclides, and therefore the resulting radiation field, however, can only be simulated in the same geometry. This work provides a tool to simulate the decay of the produced nuclides in other geometries. With that, the radiation field from an irradiated object can be simulated in arbitrary environments. The sampling of isotope mixtures was tested by simulating a 50/50 mixture of $Cs^{137}$ and $Co^{60}$. These isotopes are both well known and therefore provide a first reliable benchmark in that respect. The sampling of uniformly distributed coordinates was tested using the histogram test for various spatial distributions. The advantages and disadvantages of the program compared to standard methods are demonstrated in the real life ca...

  6. spsann - optimization of sample patterns using spatial simulated annealing

    Science.gov (United States)

    Samuel-Rosa, Alessandro; Heuvelink, Gerard; Vasques, Gustavo; Anjos, Lúcia

    2015-04-01

    There are many algorithms and computer programs to optimize sample patterns, some proprietary and others publicly available. A few have only been presented in scientific articles and textbooks. This dispersion and somewhat poor availability holds back their wider adoption and further development. We introduce spsann, a new R package for the optimization of sample patterns using spatial simulated annealing. R is the most popular environment for data processing and analysis. Spatial simulated annealing is a well-known method with widespread use for solving optimization problems in the soil and geosciences, mainly because of its robustness against local optima and its ease of implementation. spsann offers many optimizing criteria for sampling for variogram estimation (number of points or point-pairs per lag-distance class, PPL), trend estimation (association/correlation and marginal distribution of the covariates, ACDC), and spatial interpolation (mean squared shortest distance, MSSD). spsann also includes the mean or maximum universal kriging variance (MUKV) as an optimizing criterion, which is used when the model of spatial variation is known. PPL, ACDC and MSSD were combined (PAN) for sampling when the model of spatial variation is unknown. spsann solves this multi-objective optimization problem by scaling the objective function values using their maximum absolute value or the mean value computed over 1000 random samples; the scaled values are aggregated using the weighted-sum method. A graphical display allows the user to follow how the sample pattern is perturbed during the optimization, as well as the evolution of its energy state. It is possible to start by perturbing many points and exponentially reduce the number of perturbed points. The maximum perturbation distance decreases linearly with the number of iterations, and the acceptance probability decreases exponentially with the number of iterations. R is memory hungry and spatial simulated annealing is a
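    The annealing loop described above (perturb one or more points, evaluate the criterion, accept worse states with an exponentially decaying probability) can be sketched for the MSSD criterion. This is an illustrative pure-Python sketch, not the spsann implementation (which is in R); the grid, schedule constants, and the `optimize_sample` helper are all assumptions.

```python
import math
import random

def mssd(sample, grid):
    """Mean squared shortest distance from every prediction node to its
    nearest sample point (the spatial-interpolation criterion)."""
    return sum(
        min((gx - sx) ** 2 + (gy - sy) ** 2 for sx, sy in sample)
        for gx, gy in grid
    ) / len(grid)

def optimize_sample(grid, n_points, n_iter=2000, t0=1.0, cooling=0.995, seed=42):
    """Spatial simulated annealing: perturb one point per iteration and
    accept worse configurations with exponentially decaying probability."""
    rng = random.Random(seed)
    sample = rng.sample(grid, n_points)
    energy = mssd(sample, grid)
    best, best_energy = list(sample), energy
    temp = t0
    for _ in range(n_iter):
        cand = list(sample)
        cand[rng.randrange(n_points)] = rng.choice(grid)  # perturb one point
        e = mssd(cand, grid)
        # always accept improvements; accept worse states with probability
        # exp(-dE / T), which decays as the temperature cools
        if e < energy or rng.random() < math.exp((energy - e) / temp):
            sample, energy = cand, e
            if e < best_energy:
                best, best_energy = list(cand), e
        temp *= cooling
    return best, best_energy
```

    The best pattern found spreads the sample points over the grid, which is exactly what minimizing MSSD rewards.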

  7. Estimating abundance of mountain lions from unstructured spatial sampling

    Science.gov (United States)

    Russell, Robin E.; Royle, J. Andrew; Desimone, Richard; Schwartz, Michael K.; Edwards, Victoria L.; Pilgrim, Kristy P.; Mckelvey, Kevin S.

    2012-01-01

    Mountain lions (Puma concolor) are often difficult to monitor because of their low capture probabilities, extensive movements, and large territories. Methods for estimating the abundance of this species are needed to assess population status, determine harvest levels, evaluate the impacts of management actions on populations, and derive conservation and management strategies. Traditional mark–recapture methods do not explicitly account for differences in individual capture probabilities due to the spatial distribution of individuals in relation to survey effort (or trap locations). However, recent advances in the analysis of capture–recapture data have produced methods for estimating abundance and density of animals from spatially explicit capture–recapture data that account for heterogeneity in capture probabilities due to the spatial organization of individuals and traps. We adapt recently developed spatial capture–recapture models to estimate density and abundance of mountain lions in western Montana. Volunteers and state agency personnel collected mountain lion DNA samples in portions of the Blackfoot drainage (7,908 km2) in west-central Montana using 2 methods: snow back-tracking mountain lion tracks to collect hair samples and biopsy darting treed mountain lions to obtain tissue samples. Overall, we recorded 72 individual capture events, including captures both with and without tissue sample collection, and hair samples, resulting in the identification of 50 individual mountain lions (30 females, 19 males, and 1 unknown sex individual). We estimated lion densities from 8 models containing effects of distance, sex, and survey effort on detection probability. Our population density estimates ranged from a minimum of 3.7 mountain lions/100 km2 (95% CI 2.3–5.7) under the distance-only model (including only an effect of distance on detection probability) to 6.7 (95% CI 3.1–11.0) under the full model (including effects of distance, sex, survey effort, and
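    Spatial capture–recapture models of this kind typically let detection probability fall off with the distance between a survey location and an individual's activity centre. A minimal sketch of the commonly used half-normal form is below; the `g0` and `sigma_km` values are illustrative assumptions, not estimates from this study.

```python
import math

def detection_prob(dist_km, g0=0.2, sigma_km=3.0):
    """Half-normal detection function: baseline probability g0 decays
    with distance between survey location and activity centre."""
    return g0 * math.exp(-dist_km ** 2 / (2 * sigma_km ** 2))

def effective_area_km2(g0=0.2, sigma_km=3.0, extent_km=30.0, step_km=0.5):
    """Numerically integrate the detection surface over the plane; this
    'effective sampled area' is the quantity that converts counts of
    detected individuals into a density estimate."""
    total = 0.0
    x = -extent_km
    while x < extent_km:
        y = -extent_km
        while y < extent_km:
            total += detection_prob(math.hypot(x, y), g0, sigma_km) * step_km ** 2
            y += step_km
        x += step_km
    return total
```

    The numerical integral agrees with the closed form g0 · 2πσ², which is why the half-normal is so convenient in SECR likelihoods.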

  8. Planetary Sample Caching System Design Options

    Science.gov (United States)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

    Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  9. The effects of spatial sampling choices on MR temperature measurements.

    Science.gov (United States)

    Todd, Nick; Vyas, Urvi; de Bever, Josh; Payne, Allison; Parker, Dennis L

    2011-02-01

    The purpose of this article is to quantify the effects that spatial sampling parameters have on the accuracy of magnetic resonance temperature measurements during high intensity focused ultrasound treatments. Spatial resolution and position of the sampling grid were considered using experimental and simulated data for two different types of high intensity focused ultrasound heating trajectories (a single point and a 4-mm circle) with maximum measured temperature and thermal dose volume as the metrics. It is demonstrated that measurement accuracy is related to the curvature of the temperature distribution, where regions with larger spatial second derivatives require higher resolution. The location of the sampling grid relative to the temperature distribution has a significant effect on the measured values. When imaging at 1.0 × 1.0 × 3.0 mm(3) resolution, the measured values for maximum temperature and volume dosed to 240 cumulative equivalent minutes (CEM) or greater varied by 17% and 33%, respectively, for the single-point heating case, and by 5% and 18%, respectively, for the 4-mm circle heating case. Accurate measurement of the maximum temperature required imaging at 1.0 × 1.0 × 3.0 mm(3) resolution for the single-point heating case and 2.0 × 2.0 × 5.0 mm(3) resolution for the 4-mm circle heating case. Copyright © 2010 Wiley-Liss, Inc.
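    The dependence on grid placement can be reproduced with a toy model: sample a Gaussian hotspot on grids of different spacing and offset and compare the measured maximum to the true peak. The peak height and width below are illustrative assumptions, not the paper's phantom.

```python
import math

def true_temp(x_mm, y_mm, peak_c=20.0, fwhm_mm=4.0):
    """In-plane Gaussian temperature rise centred at the focus."""
    sigma = fwhm_mm / 2.3548  # convert FWHM to standard deviation
    return peak_c * math.exp(-(x_mm ** 2 + y_mm ** 2) / (2 * sigma ** 2))

def measured_max(res_mm, offset_mm, extent_mm=20.0):
    """Maximum temperature seen on a sampling grid of spacing res_mm
    whose origin is shifted by offset_mm in both directions."""
    n = int(extent_mm / res_mm)
    return max(
        true_temp(i * res_mm + offset_mm, j * res_mm + offset_mm)
        for i in range(-n, n + 1)
        for j in range(-n, n + 1)
    )
```

    With these assumed numbers, a 1 mm grid passing through the focus recovers the full peak, while a coarse grid offset by half a voxel misses a large fraction of it, mirroring the variation the authors quantify.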

  10. Automated simulation and study of spatial-structural design processes

    NARCIS (Netherlands)

    Davila Delgado, J.M.; Hofmeyer, H.; Stouffs, R.; Sariyildiz, S.

    2013-01-01

    A so-called "Design Process Investigation toolbox" (DPI toolbox) has been developed. It is a set of computational tools that simulate spatial-structural design processes. Its objectives are to study spatial-structural design processes and to support the involved actors. Two case studies are

  11. Determination and optimization of spatial samples for distributed measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Huo, Xiaoming (Georgia Institute of Technology, Atlanta, GA); Tran, Hy D.; Shilling, Katherine Meghan; Kim, Heeyong (Georgia Institute of Technology, Atlanta, GA)

    2010-10-01

    There are no accepted standards for determining how many measurements to take during part inspection or where to take them, or for assessing confidence in the evaluation of acceptance based on these measurements. The goal of this work was to develop a standard method for determining the number of measurements, together with the spatial distribution of measurements and the associated risks for false acceptance and false rejection. Two paths have been taken to create a standard method for selecting sampling points. A wavelet-based model has been developed to select measurement points and to determine confidence in the measurement after the points are taken. An adaptive sampling strategy has been studied to determine implementation feasibility on commercial measurement equipment. Results using both real and simulated data are presented for each of the paths.
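    One way to read the wavelet-based path is that measurement points are concentrated where the part's surface profile has high-frequency content, since smooth regions are well characterized by few points. Below is a heavily simplified, single-level Haar sketch of that idea; the function names and the ranking rule are assumptions for illustration, not the report's algorithm.

```python
def haar_details(profile):
    """One level of the Haar wavelet transform: detail coefficients are
    half-differences of adjacent samples, large where the profile is rough."""
    return [(profile[i] - profile[i + 1]) / 2.0
            for i in range(0, len(profile) - 1, 2)]

def select_measurement_points(profile, n_points):
    """Rank sample pairs by detail magnitude and return the starting
    indices of the roughest regions, where extra measurements buy the
    most confidence in the inspection decision."""
    details = haar_details(profile)
    ranked = sorted(range(len(details)), key=lambda k: -abs(details[k]))
    return sorted(2 * k for k in ranked[:n_points])
```

    On a profile that is flat except for one step, the single selected point lands on the step, which is where a false accept/reject decision is most likely.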

  12. Mobile Variable Depth Sampling System Design Study

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  13. Mobile Variable Depth Sampling System Design Study

    Energy Technology Data Exchange (ETDEWEB)

    BOGER, R.M.

    2000-08-25

    A design study is presented for a mobile, variable depth sampling system (MVDSS) that will support the treatment and immobilization of Hanford LAW and HLW. The sampler can be deployed in a 4-inch tank riser and has a design that is based on requirements identified in the Level 2 Specification (latest revision). The waste feed sequence for the MVDSS is based on Phase 1, Case 3S6 waste feed sequence. Technical information is also presented that supports the design study.

  14. Thermal probe design for Europa sample acquisition

    Science.gov (United States)

    Horne, Mera F.

    2018-01-01

    The planned lander missions to the surface of Europa will access samples from the subsurface of the ice in a search for signs of life. A small thermal drill (probe) is proposed to meet the sample requirement of the Science Definition Team's (SDT) report for the Europa mission. The probe is 2 cm in diameter and 16 cm in length and is designed to access the subsurface to 10 cm deep and to collect five ice samples of approximately 7 cm3 each. The energy required to penetrate the top 10 cm of ice in a vacuum is approximately 26 Wh, and the energy required to melt 7 cm3 of ice is approximately 1.2 Wh. The requirement stated in the SDT report of collecting samples from five different sites can be accommodated with repeated use of the same thermal drill. For smaller sample sizes, a smaller probe of 1.0 cm in diameter with the same length of 16 cm could be utilized; it would require approximately 6.4 Wh to penetrate the top 10 cm of ice and 0.02 Wh to collect 0.1 g of sample. The thermal drill has the advantages of simplicity of design and operation and the ability to penetrate ice over a range of densities and hardnesses while maintaining sample integrity.
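    The quoted melt energy can be sanity-checked from first principles. The sketch below uses assumed textbook ice properties (density 0.917 g/cm3, a mean specific heat of 1.5 J/(g·K) over 100-273 K, latent heat of fusion 334 J/g) and an assumed 100 K starting temperature; none of these values are taken from the paper.

```python
def ice_melt_energy_wh(volume_cm3, t_initial_k=100.0):
    """Rough energy, in watt-hours, to warm a volume of water ice to
    273 K and melt it. All material properties are textbook assumptions."""
    density_g_cm3 = 0.917    # water ice
    mean_cp_j_g_k = 1.5      # rough mean specific heat over 100-273 K
    latent_heat_j_g = 334.0  # heat of fusion at 273 K
    mass_g = volume_cm3 * density_g_cm3
    warming_j = mass_g * mean_cp_j_g_k * (273.0 - t_initial_k)
    melting_j = mass_g * latent_heat_j_g
    return (warming_j + melting_j) / 3600.0  # joules -> watt-hours
```

    For a 7 cm3 sample this gives roughly 1.1 Wh, consistent with the approximately 1.2 Wh stated above.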

  15. Spatial analysis of NDVI readings with different sampling density

    Science.gov (United States)

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  16. Counting Cats: Spatially Explicit Population Estimates of Cheetah (Acinonyx jubatus) Using Unstructured Sampling Data.

    Directory of Open Access Journals (Sweden)

    Femke Broekhuis

    Many ecological theories and species conservation programmes rely on accurate estimates of population density. Accurate density estimation, especially for species facing rapid declines, requires the application of rigorous field and analytical methods. However, obtaining accurate density estimates of carnivores can be challenging as carnivores naturally exist at relatively low densities and are often elusive and wide-ranging. In this study, we employ an unstructured spatial sampling field design along with a Bayesian sex-specific spatially explicit capture-recapture (SECR) analysis, to provide the first rigorous population density estimates of cheetahs (Acinonyx jubatus) in the Maasai Mara, Kenya. We estimate adult cheetah density to be between 1.28 ± 0.315 and 1.34 ± 0.337 individuals/100km2 across four candidate models specified in our analysis. Our spatially explicit approach revealed 'hotspots' of cheetah density, highlighting that cheetah are distributed heterogeneously across the landscape. The SECR models incorporated a movement range parameter which indicated that male cheetah moved four times as much as females, possibly because female movement was restricted by their reproductive status and/or the spatial distribution of prey. We show that SECR can be used for spatially unstructured data to successfully characterise the spatial distribution of a low density species and also estimate population density when sample size is small. Our sampling and modelling framework will help determine spatial and temporal variation in cheetah densities, providing a foundation for their conservation and management. Based on our results we encourage other researchers to adopt a similar approach in estimating densities of individually recognisable species.

  17. Counting Cats: Spatially Explicit Population Estimates of Cheetah (Acinonyx jubatus) Using Unstructured Sampling Data.

    Science.gov (United States)

    Broekhuis, Femke; Gopalaswamy, Arjun M

    2016-01-01

    Many ecological theories and species conservation programmes rely on accurate estimates of population density. Accurate density estimation, especially for species facing rapid declines, requires the application of rigorous field and analytical methods. However, obtaining accurate density estimates of carnivores can be challenging as carnivores naturally exist at relatively low densities and are often elusive and wide-ranging. In this study, we employ an unstructured spatial sampling field design along with a Bayesian sex-specific spatially explicit capture-recapture (SECR) analysis, to provide the first rigorous population density estimates of cheetahs (Acinonyx jubatus) in the Maasai Mara, Kenya. We estimate adult cheetah density to be between 1.28 ± 0.315 and 1.34 ± 0.337 individuals/100km2 across four candidate models specified in our analysis. Our spatially explicit approach revealed 'hotspots' of cheetah density, highlighting that cheetah are distributed heterogeneously across the landscape. The SECR models incorporated a movement range parameter which indicated that male cheetah moved four times as much as females, possibly because female movement was restricted by their reproductive status and/or the spatial distribution of prey. We show that SECR can be used for spatially unstructured data to successfully characterise the spatial distribution of a low density species and also estimate population density when sample size is small. Our sampling and modelling framework will help determine spatial and temporal variation in cheetah densities, providing a foundation for their conservation and management. Based on our results we encourage other researchers to adopt a similar approach in estimating densities of individually recognisable species.

  18. Designing an enhanced groundwater sample collection system

    International Nuclear Information System (INIS)

    Schalla, R.

    1994-10-01

    As part of an ongoing technical support mission to achieve excellence and efficiency in environmental restoration activities at the Laboratory for Energy and Health-Related Research (LEHR), Pacific Northwest Laboratory (PNL) provided guidance on the design and construction of monitoring wells and identified the most suitable type of groundwater sampling pump and accessories for monitoring wells. The goal was to utilize a monitoring well design that would allow for hydrologic testing and reduce turbidity to minimize the impact of sampling. The sampling results of the newly designed monitoring wells were clearly superior to those of the previously installed monitoring wells. The new wells exhibited reduced turbidity, in addition to improved access for instrumentation and hydrologic testing. The variable frequency submersible pump was selected as the best choice for obtaining groundwater samples. The literature references are listed at the end of this report. Despite some initial difficulties, the actual performance of the variable frequency submersible pump and its accessories was effective in reducing sampling time and labor costs, and its ease of use was preferred over the previously used bladder pumps. The surface seal system, called the Dedicator, proved to be a useful accessory to prevent surface contamination while providing easy access for water-level measurements and for connecting the pump. Cost savings resulted from the use of the pre-production pumps (beta units) donated by the manufacturer for the demonstration. However, larger savings resulted from shortened field time due to the ease of using the submersible pumps and the surface seal access system. Proper deployment of the monitoring wells also resulted in cost savings and ensured representative samples.

  19. Spatial ability in computer-aided design courses

    OpenAIRE

    Torner Ribé, Jordi; Alpiste Penalba, Francesc; Brigos Hermida, Miguel Ángel

    2014-01-01

    Many studies have demonstrated that spatial ability is an important factor in the study of Industrial Engineering. Spatial ability is fundamentally important to the work of an engineer, as it is vital for project design. Among other elements, spatial ability correlates with factors such as good academic results and a natural ability to learn how to use IT systems and computer programs. Furthermore, the new framework drawn up by the European Higher Education Area (EHEA) guides us as to the...

  20. Large sample hydrology in NZ: Spatial organisation in process diagnostics

    Science.gov (United States)

    McMillan, H. K.; Woods, R. A.; Clark, M. P.

    2013-12-01

    A key question in hydrology is how to predict the dominant runoff generation processes in any given catchment. This knowledge is vital for a range of applications in forecasting hydrological response and related processes such as nutrient and sediment transport. A step towards this goal is to map dominant processes in locations where data is available. In this presentation, we use data from 900 flow gauging stations and 680 rain gauges in New Zealand, to assess hydrological processes. These catchments range in character from rolling pasture, to alluvial plains, to temperate rainforest, to volcanic areas. By taking advantage of so many flow regimes, we harness the benefits of large-sample and comparative hydrology to study patterns and spatial organisation in runoff processes, and their relationship to physical catchment characteristics. The approach we use to assess hydrological processes is based on the concept of diagnostic signatures. Diagnostic signatures in hydrology are targeted analyses of measured data which allow us to investigate specific aspects of catchment response. We apply signatures which target the water balance, the flood response and the recession behaviour. We explore the organisation, similarity and diversity in hydrological processes across the New Zealand landscape, and how these patterns change with scale. We discuss our findings in the context of the strong hydro-climatic gradients in New Zealand, and consider the implications for hydrological model building on a national scale.

  1. APPLICATION OF SPATIAL MODELLING APPROACHES, SAMPLING STRATEGIES AND 3S TECHNOLOGY WITHIN AN ECOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    H.-C. Chen

    2012-07-01

    How to effectively describe ecological patterns in nature over broader spatial scales and build a modelling ecological framework has become an important issue in ecological research. We test four modelling methods (MAXENT, DOMAIN, GLM and ANN) to predict the potential habitat of Schima superba (Chinese guger tree, CGT) at different spatial scales in the Huisun study area in Taiwan. We then created three sampling designs (from small to large scales) for model development and validation using different combinations of CGT samples from three sites (Tong-Feng watershed, Yo-Shan Mountain, and Kuan-Dau watershed). These models combine points of known occurrence and topographic variables to infer the potential spatial distribution of CGT. Our assessment revealed that method performance, from highest to lowest, was: MAXENT, DOMAIN, GLM and ANN at the small spatial scale. The MAXENT and DOMAIN models were the most capable of predicting the tree's potential habitat. However, the outcome clearly indicated that models based merely on topographic variables performed poorly on large spatial extrapolation from Tong-Feng to Kuan-Dau, because the humidity and sun illumination of the two watersheds are affected by their microterrains and are quite different from each other. Thus, models developed from topographic variables can only be applied within a limited geographical extent without significant error. Future studies will attempt to use variables involving spectral information associated with species, extracted from high spatial and spectral resolution remotely sensed data, especially hyperspectral image data, to build a model that can be applied on a large spatial scale.

  2. Coevolutionary and genetic algorithm based building spatial and structural design

    NARCIS (Netherlands)

    Hofmeyer, H.; Davila Delgado, J.M.

    2015-01-01

    In this article, two methods to develop and optimize accompanying building spatial and structural designs are compared. The first, a coevolutionary method, applies deterministic procedures, inspired by realistic design processes, to cyclically add a suitable structural design to the input of a

  3. Spatial orientation in bone samples and Young's modulus

    NARCIS (Netherlands)

    Geraets, W.G.M.; van Ruijven, L.J.; Verheij, H.G.C.; van der Stelt, P.F.; van Eijden, T.M.G.J.

    2008-01-01

    Bone mass is the most important determinant of the mechanical strength of bones, and spatial structure is the second. In general, the spatial structure and mechanical properties of bones such as the breaking strength are direction dependent. The mean intercept length (MIL) and line frequency

  4. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for different management tasks. The planning of pressure observations in terms of spatial distribution and number is named sampling design, and it has traditionally been addressed with model calibration in mind. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, to detect anomalies and bursts, to guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management purposes, has been addressed considering optimal network segmentation and the modularity index within a multiobjective strategy. Optimal network segmentation is the basis for identifying network modules by means of optimal conceptual cuts, which are the candidate locations of closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform the sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly based on network topology and on weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.

  5. Stratified sampling design based on data mining.

    Science.gov (United States)

    Kim, Yeonkook J; Oh, Yoonhwan; Park, Sunghoon; Cho, Sungzoon; Park, Hayoung

    2013-09-01

    To explore classification rules based on data mining methodologies which are to be used in defining strata in stratified sampling of healthcare providers with improved sampling efficiency. We performed k-means clustering to group providers with similar characteristics, then, constructed decision trees on cluster labels to generate stratification rules. We assessed the variance explained by the stratification proposed in this study and by conventional stratification to evaluate the performance of the sampling design. We constructed a study database from health insurance claims data and providers' profile data made available to this study by the Health Insurance Review and Assessment Service of South Korea, and population data from Statistics Korea. From our database, we used the data for single specialty clinics or hospitals in two specialties, general surgery and ophthalmology, for the year 2011 in this study. Data mining resulted in five strata in general surgery with two stratification variables, the number of inpatients per specialist and population density of provider location, and five strata in ophthalmology with two stratification variables, the number of inpatients per specialist and number of beds. The percentages of variance in annual changes in the productivity of specialists explained by the stratification in general surgery and ophthalmology were 22% and 8%, respectively, whereas conventional stratification by the type of provider location and number of beds explained 2% and 0.2% of variance, respectively. This study demonstrated that data mining methods can be used in designing efficient stratified sampling with variables readily available to the insurer and government; it offers an alternative to the existing stratification method that is widely used in healthcare provider surveys in South Korea.
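    The cluster-then-derive-rules pipeline above can be illustrated in miniature: group providers by a single characteristic with k-means, then find the one cut point that best reproduces the cluster labels (a one-split stand-in for the decision-tree step). All data values and helper names here are invented for illustration; the study itself used multivariate clustering and full decision trees.

```python
import random

def kmeans_1d(values, k=2, iters=50, seed=0):
    """Lloyd's algorithm on one variable: assign each value to its
    nearest centre, recompute centres, repeat; returns cluster labels."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            groups[min(range(k), key=lambda c: abs(v - centers[c]))].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return [min(range(k), key=lambda c: abs(v - centers[c])) for v in values]

def best_split(values, labels):
    """Single-split stratification rule: the threshold on the variable
    that best matches the cluster labels (either polarity counts)."""
    best_t, best_acc = None, -1.0
    for t in sorted(set(values)):
        pred = [1 if v > t else 0 for v in values]
        agree = sum(p == l for p, l in zip(pred, labels))
        acc = max(agree, len(values) - agree) / len(values)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc
```

    On well-separated data the recovered threshold reproduces the clusters exactly, giving a stratification rule an insurer could apply directly to its records.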

  6. Spatial Distribution and Sampling Plans for Grapevine Plant Canopy-Inhabiting Scaphoideus titanus (Hemiptera: Cicadellidae) Nymphs.

    Science.gov (United States)

    Rigamonti, Ivo E; Brambilla, Carla; Colleoni, Emanuele; Jermini, Mauro; Trivellone, Valeria; Baumgärtner, Johann

    2016-04-01

    The paper deals with the study of the spatial distribution and the design of sampling plans for estimating nymph densities of the grape leafhopper Scaphoideus titanus Ball in vine plant canopies. In a reference vineyard sampled for model parameterization, leaf samples were repeatedly taken according to a multistage, stratified, random sampling procedure, and the data were subjected to an ANOVA. There were no significant differences in density either among the strata within the vineyard or between the two strata with basal and apical leaves. The significant differences between densities on trunk and productive shoots led to the adoption of two-stage (leaves and plants) and three-stage (leaves, shoots, and plants) sampling plans for trunk shoot- and productive shoot-inhabiting individuals, respectively. The mean crowding to mean relationship used to analyze the nymphs' spatial distribution revealed aggregated distributions. In both the enumerative and the sequential enumerative sampling plans, the number of leaves of trunk shoots, and of leaves and shoots of productive shoots, was kept constant while the number of plants varied. Data collected in additional vineyards were used to test the applicability of the distribution model and the sampling plans. The tests confirmed the applicability 1) of the mean crowding to mean regression model on the plant and leaf stages for representing trunk shoot-inhabiting distributions, and on the plant, shoot, and leaf stages for productive shoot-inhabiting nymphs, 2) of the enumerative sampling plan, and 3) of the sequential enumerative sampling plan. In general, sequential enumerative sampling was more cost-efficient than enumerative sampling.
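    The mean crowding to mean relationship used here follows Lloyd's index, m* = m + (s²/m − 1), regressed on the mean m across sampling units (Iwao's patchiness regression); a slope above 1 indicates an aggregated distribution. A small sketch with made-up counts:

```python
def mean_crowding(counts):
    """Lloyd's mean crowding: m* = m + (s^2 / m - 1)."""
    n = len(counts)
    m = sum(counts) / n
    var = sum((c - m) ** 2 for c in counts) / (n - 1)
    return m + var / m - 1.0

def iwao_regression(sampling_units):
    """Ordinary least squares of mean crowding on mean density across
    sampling units; a slope > 1 suggests aggregation."""
    ms = [sum(u) / len(u) for u in sampling_units]
    stars = [mean_crowding(u) for u in sampling_units]
    mb = sum(ms) / len(ms)
    sb = sum(stars) / len(stars)
    slope = (sum((m - s_m) * (s - sb) for m, s, s_m in zip(ms, stars, [mb] * len(ms)))
             / sum((m - mb) ** 2 for m in ms))
    return sb - slope * mb, slope  # intercept, slope
```

    With counts concentrated on a few leaves (variance far above the mean), the fitted slope exceeds 1, the aggregation signature the authors report.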

  7. Design in the planning arena : how regional designing influences strategic spatial planning

    NARCIS (Netherlands)

    Kempenaar, Annet

    2017-01-01

    Regional designing is a form of spatial design that engages with the future physical form and arrangement of regions, including its aesthetic appearances and how it can come about. As such it is closely entangled with spatial planning. This thesis studies the influence of regional designing on

  8. Research on test of product based on spatial sampling criteria and variable step sampling mechanism

    Science.gov (United States)

    Li, Ruihong; Han, Yueping

    2014-09-01

    This paper presents an effective approach for online testing of the assembly structures inside products, using a multiple-views technique and an X-ray digital radiography system based on spatial sampling criteria and a variable-step sampling mechanism. Although there may be several objects inside one product to be tested, for each object there is a maximal rotary step within which the least structural size to be tested remains predictable. In the offline learning process, the object is rotated by this step and imaged repeatedly until a complete cycle is finished, yielding an image sequence that includes the full structural information for recognition. The maximal rotary step is restricted by the least structural size and the inherent resolution of the imaging system. During the online inspection process, the program first finds the optimum solutions for all the different target parts in the standard sequence, i.e., finds their exact angles in one cycle. Because most of the other targets in the product are larger than the least structure, the method adopts a variable step-size sampling mechanism, rotating the product through specific angles with different steps according to the objects inside it, and then matching. Experimental results show that the variable step-size method can greatly save time compared with the traditional fixed-step inspection method while the recognition accuracy is guaranteed.

  9. Design of spatial experiments: Model fitting and prediction

    Energy Technology Data Exchange (ETDEWEB)

    Fedorov, V.V.

    1996-03-01

    The main objective of the paper is to describe and develop model oriented methods and algorithms for the design of spatial experiments. Unlike many other publications in this area, the approach proposed here is essentially based on the ideas of convex design theory.

  10. Estimating the spatial scale of herbicide and soil interactions by nested sampling, hierarchical analysis of variance and residual maximum likelihood

    Energy Technology Data Exchange (ETDEWEB)

    Price, Oliver R., E-mail: oliver.price@unilever.co [Warwick-HRI, University of Warwick, Wellesbourne, Warwick, CV32 6EF (United Kingdom); University of Reading, Soil Science Department, Whiteknights, Reading, RG6 6UR (United Kingdom); Oliver, Margaret A. [University of Reading, Soil Science Department, Whiteknights, Reading, RG6 6UR (United Kingdom); Walker, Allan [Warwick-HRI, University of Warwick, Wellesbourne, Warwick, CV32 6EF (United Kingdom); Wood, Martin [University of Reading, Soil Science Department, Whiteknights, Reading, RG6 6UR (United Kingdom)

    2009-05-15

    An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small and medium scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass and the degradation and sorption of the herbicide, isoproturon, were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.

  11. Estimating the spatial scale of herbicide and soil interactions by nested sampling, hierarchical analysis of variance and residual maximum likelihood

    International Nuclear Information System (INIS)

    Price, Oliver R.; Oliver, Margaret A.; Walker, Allan; Wood, Martin

    2009-01-01

    An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small and medium scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass and the degradation and sorption of the herbicide, isoproturon, were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
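    The "first estimate of the variogram" from such nested data is, at its core, an empirical semivariogram: half the mean squared difference between sample values, grouped by separation-distance bin. A generic sketch follows; the binning and the toy transect are illustrative, not the REML procedure used in the paper.

```python
import math

def empirical_semivariogram(coords, values, bin_edges):
    """Classical (Matheron) semivariogram estimator: for each distance
    bin, half the mean squared difference over all point pairs whose
    separation falls in that bin."""
    k = len(bin_edges) - 1
    sums, counts = [0.0] * k, [0] * k
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            h = math.dist(coords[i], coords[j])
            for b in range(k):
                if bin_edges[b] <= h < bin_edges[b + 1]:
                    sums[b] += 0.5 * (values[i] - values[j]) ** 2
                    counts[b] += 1
                    break
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]
```

    On spatially structured data the semivariance rises with lag, and the lag at which it levels off indicates the scale of the main spatial structures, the quantity the nested design is meant to resolve.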

  12. A random spatial sampling method in a rural developing nation

    Science.gov (United States)

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  13. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,

    2010-01-25

Motion planning for spatially constrained robots is difficult due to additional constraints placed on the robot, such as closure constraints for closed chains or requirements on end-effector placement for articulated linkages. It is usually computationally too expensive to apply sampling-based planners to these problems since it is difficult to generate valid configurations. We overcome this challenge by redefining the robot's degrees of freedom and constraints into a new set of parameters, called reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom. In addition to supporting efficient sampling of configurations, we show that the RD-space formulation naturally supports planning and, in particular, we design a local planner suitable for use by sampling-based planners. We demonstrate the effectiveness and efficiency of our approach for several systems including closed chain planning with multiple loops, restricted end-effector sampling, and on-line planning for drawing/sculpting. We can sample single-loop closed chain systems with 1,000 links in time comparable to open chain sampling, and we can generate samples for 1,000-link multi-loop systems of varying topologies in less than a second. © 2010 The Author(s).
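
    The reachable-distance idea can be sketched for a planar single-loop chain. This is an illustrative reading of the recursion, not the authors' implementation: a serial subchain's reachable endpoint-distance interval is [max(0, 2·max − sum), sum] of its link lengths, and "virtual link" lengths are sampled top-down so that every (parent, left, right) triple satisfies the triangle inequality, guaranteeing closure by construction.

```python
import random

def reach(lengths):
    """Reachable endpoint-distance interval [lo, hi] of a serial chain:
    hi is the fully stretched length; lo is 0 unless one link dominates."""
    hi = sum(lengths)
    lo = max(0.0, 2.0 * max(lengths) - hi)
    return lo, hi

def sample_chain(lengths, D, rng):
    """Recursively sample consistent 'virtual link' lengths so the chain's
    endpoints end up exactly D apart (D = 0 closes the loop). Returns a
    tree of (endpoint_distance, left_subtree, right_subtree)."""
    if len(lengths) == 1:
        return (lengths[0], None, None)
    mid = len(lengths) // 2
    A, B = lengths[:mid], lengths[mid:]
    (loA, hiA), (loB, hiB) = reach(A), reach(B)
    # left virtual-link lengths that leave the right side feasible
    dA = rng.uniform(max(loA, D - hiB, loB - D), min(hiA, D + hiB))
    dB = rng.uniform(max(loB, abs(D - dA)), min(hiB, D + dA))
    return (D, sample_chain(A, dA, rng), sample_chain(B, dB, rng))

rng = random.Random(42)
loop = sample_chain([1.0] * 6, 0.0, rng)  # closed 6-link unit chain
```

    Each level only intersects intervals, so the cost is linear in the number of links, which mirrors the linear-complexity claim in the abstract.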

  14. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be

  15. Design of an optical spatial interferometer with transformation optics

    International Nuclear Information System (INIS)

    Naghibi, Atefeh; Shokooh-Saremi, Mehrdad

    2015-01-01

    In this paper, we apply transformation optics to design an optical spatial interferometer. The transformation equations are described and two-dimensional finite element simulations are presented to numerically confirm the functionality of the device. It is shown that a small change in the refractive index can alter the interference pattern and hence can be detected. The design of the interferometer could expand transformation optics’ applications and make way for introduction of new structures with unique electromagnetic or optical functionalities. (paper)

  16. Design-based estimators for snowball sampling

    OpenAIRE

    Shafie, Termeh

    2010-01-01

Snowball sampling, where existing study subjects recruit further subjects from among their acquaintances, is a popular approach when sampling from hidden populations. Since people with many in-links are more likely to be selected, there will be a selection bias in the samples obtained. In order to eliminate this bias, the sample data must be weighted. However, the exact selection probabilities are unknown for snowball samples and need to be approximated in an appropriate way. This paper proposes d...
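
    A minimal sketch of the weighting idea: if selection probability is taken as roughly proportional to a unit's in-degree (one common approximation; the paper's proposed estimators are not reproduced here), observations can be down-weighted by their degree. All numbers below are hypothetical.

```python
def snowball_weighted_mean(values, degrees):
    """Design-based correction sketch: assume selection probability is
    roughly proportional to in-degree, so weight each observation by
    1/degree and normalize (Hansen-Hurwitz style)."""
    weights = [1.0 / d for d in degrees]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# hypothetical hidden-population sample: well-connected units (degree 4)
# are over-represented and happen to have higher outcome values
values = [10.0, 12.0, 20.0, 22.0]
degrees = [1, 1, 4, 4]
weighted = snowball_weighted_mean(values, degrees)
unweighted = sum(values) / len(values)
```

    The weighted estimate pulls the mean back toward the under-sampled low-degree units, which is the bias correction the abstract describes.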

  17. 30 CFR 71.208 - Bimonthly sampling; designated work positions.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling; designated work positions... UNDERGROUND COAL MINES Sampling Procedures § 71.208 Bimonthly sampling; designated work positions. (a) Each... standard when quartz is present), respirable dust sampling of designated work positions shall begin on the...

  18. On efficiency of some ratio estimators in double sampling design ...

    African Journals Online (AJOL)

In this paper, three sampling ratio estimators in double sampling design were proposed with the intention of finding an alternative double sampling design estimator to the conventional ratio estimator in double sampling design discussed by Cochran (1997), Okafor (2002), Raj (1972) and Raj and Chandhok (1999).

  19. Constrained optimisation of spatial sampling : a geostatistical approach

    NARCIS (Netherlands)

    Groenigen, van J.W.

    1999-01-01

    Aims

    This thesis aims at the development of optimal sampling strategies for geostatistical studies. Special emphasis is on the optimal use of ancillary data, such as co-related imagery, preliminary observations and historic knowledge. Although the object of all studies

  20. Analysing designed experiments in distance sampling

    Science.gov (United States)

    Stephen T. Buckland; Robin E. Russell; Brett G. Dickson; Victoria A. Saab; Donal N. Gorman; William M. Block

    2009-01-01

    Distance sampling is a survey technique for estimating the abundance or density of wild animal populations. Detection probabilities of animals inherently differ by species, age class, habitats, or sex. By incorporating the change in an observer's ability to detect a particular class of animals as a function of distance, distance sampling leads to density estimates...
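
    As background to the distance-sampling framework described above, the sketch below estimates density from line-transect counts with the common half-normal detection function; the effective strip half-width is obtained by simple trapezoid quadrature. The survey numbers are hypothetical, and this is not the designed-experiment analysis the paper develops.

```python
import math

def halfnormal_esw(sigma, w, m=10_000):
    """Effective strip half-width: trapezoid-rule integral of the
    half-normal detection function g(x) = exp(-x^2 / (2 sigma^2))
    from 0 out to the truncation distance w."""
    xs = [w * i / m for i in range(m + 1)]
    gs = [math.exp(-x * x / (2.0 * sigma * sigma)) for x in xs]
    return (w / m) * (sum(gs) - 0.5 * (gs[0] + gs[-1]))

def line_transect_density(n, L, sigma, w):
    """Animals per unit area: n detections over total transect length L,
    with an effectively searched area of 2 * L * esw."""
    return n / (2.0 * L * halfnormal_esw(sigma, w))

# hypothetical survey: 40 detections on 10 km of transect, sigma = 25 m
D_hat = line_transect_density(n=40, L=10_000.0, sigma=25.0, w=100.0)
```

    Fitting a separate sigma per species, age class or habitat is how the detection-probability differences mentioned in the abstract are usually handled.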

  1. How does spatial study design influence density estimates from spatial capture-recapture models?

    Directory of Open Access Journals (Sweden)

    Rahel Sollmann

Full Text Available When estimating population density from data collected on non-invasive detector arrays, recently developed spatial capture-recapture (SCR) models present an advance over non-spatial models by accounting for individual movement. While these models should be more robust to changes in trapping designs, they have not been well tested. Here we investigate how the spatial arrangement and size of the trapping array influence parameter estimates for SCR models. We analysed black bear data collected with 123 hair snares with an SCR model accounting for differences in detection and movement between sexes and across the trapping occasions. To see how the size of the trap array and trap dispersion influence parameter estimates, we repeated the analysis for data from subsets of traps: 50% chosen at random, 50% in the centre of the array and 20% in the south of the array. Additionally, we simulated and analysed data under a suite of trap designs and home range sizes. In the black bear study, we found that results were similar across trap arrays, except when only 20% of the array was used. Black bear density was approximately 10 individuals per 100 km². Our simulation study showed that SCR models performed well as long as the extent of the trap array was similar to or larger than the extent of individual movement during the study period, and movement was at least half the distance between traps. SCR models performed well across a range of spatial trap setups and animal movements. Contrary to non-spatial capture-recapture models, they do not require the trapping grid to cover an area several times the average home range of the studied species. This renders SCR models more appropriate for the study of wide-ranging mammals and more flexible to design studies targeting multiple species.
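
    A minimal sketch of the two ingredients discussed above: the half-normal encounter model widely used in SCR analyses, and a paraphrase (not a formula from the paper) of the design rule of thumb relating array extent, movement scale and trap spacing.

```python
import math

def scr_detection_prob(d, p0, sigma):
    """Half-normal encounter model common in SCR: baseline detection
    probability p0 decays with trap-to-activity-centre distance d."""
    return p0 * math.exp(-d * d / (2.0 * sigma * sigma))

def design_plausible(trap_spacing, array_extent, sigma):
    """Paraphrase of the study's rule of thumb (not an equation from the
    paper): the array extent should be comparable to or larger than the
    movement scale, and movement at least half the trap spacing."""
    return array_extent >= sigma and sigma >= 0.5 * trap_spacing
```

    Here sigma plays the role of the individual movement scale; a compact array over a wide-ranging animal fails the first check, which matches the degraded estimates seen with the 20% subset.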

  2. Autonomous spatially adaptive sampling in experiments based on curvature, statistical error and sample spacing with applications in LDA measurements

    Science.gov (United States)

    Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.

    2015-06-01

Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest, insufficient sample density at important features, or both. A new adaptive sampling technique is presented that directs sample collection in proportion to local information content, adequately capturing short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy based on signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment indicated a reduction in the number of samples required to achieve a predefined overall uncertainty level while improving local accuracy for important features. The potential of the proposed method is further demonstrated with Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
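
    The curvature-driven part of such an adaptive scheme can be sketched as follows: estimate local curvature from the current samples and place the next samples in the highest-curvature intervals. This omits the paper's statistical-error and space-filling terms; the test signal and sampling budget are invented.

```python
import numpy as np

def curvature_weights(x, y):
    """Curvature proxy via numerical second derivative; normalized so
    weights can be read as a sampling-allocation density."""
    d2 = np.abs(np.gradient(np.gradient(y, x), x))
    w = d2 + 1e-12
    return w / w.sum()

def next_sample_locations(x, y, k):
    """Place k new samples at the midpoints of the intervals whose left
    endpoint carries the highest curvature weight (endpoints excluded)."""
    w = curvature_weights(x, y)
    picks = []
    for i in np.argsort(w)[::-1]:
        if 0 < i < len(x) - 1:
            picks.append(0.5 * (x[i] + x[i + 1]))
        if len(picks) == k:
            break
    return np.array(picks)

x = np.linspace(0, 1, 21)
y = np.exp(-200 * (x - 0.5) ** 2)      # sharp feature at x = 0.5
new_x = next_sample_locations(x, y, k=4)
```

    All four new samples land near the sharp feature, i.e. sampling density follows local information content as the abstract describes.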

  3. The influence of sampling unit size and spatial arrangement patterns on neighborhood-based spatial structure analyses of forest stands

    Energy Technology Data Exchange (ETDEWEB)

    Wang, H.; Zhang, G.; Hui, G.; Li, Y.; Hu, Y.; Zhao, Z.

    2016-07-01

Aim of study: Neighborhood-based stand spatial structure parameters can quantify and characterize forest spatial structure effectively. How these neighborhood-based structure parameters are influenced by the selection of different numbers of nearest-neighbor trees is unclear, and there is some disagreement in the literature regarding the appropriate number of nearest-neighbor trees to sample around reference trees. Understanding how to efficiently characterize forest structure is critical for forest management. Area of study: Multi-species uneven-aged forests of Northern China. Material and methods: We simulated stands with different spatial structural characteristics and systematically compared their structure parameters when two to eight neighboring trees were selected. Main results: Values of the uniform angle index calculated in the same stand differed with different structure-unit sizes. When tree species and sizes were completely randomly interspersed, the number of neighbors had little influence on the mingling and dominance indices. Changes in the mingling or dominance indices caused by different numbers of neighbors occurred when tree species or size classes were not randomly interspersed, and their changing characteristics can be detected from the spatial arrangement patterns of tree species and sizes. Research highlights: The number of neighboring trees selected for analyzing stand spatial structure parameters should be fixed. We propose that the four-tree structure unit is the best compromise between sampling accuracy and cost for practical forest management. (Author)
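
    One of the neighborhood-based parameters mentioned above, the mingling index, can be sketched with the recommended four-tree structure unit: for each reference tree it is the fraction of its k nearest neighbours belonging to a different species. The toy stand (two spatially segregated species on a grid) is invented for illustration.

```python
import numpy as np

def mingling_index(coords, species, k=4):
    """For each reference tree, the fraction of its k nearest neighbours
    (k = 4 is the structure-unit size the study recommends) that belong
    to a different species. Returns one value per tree in [0, 1]."""
    coords = np.asarray(coords, dtype=float)
    n = len(coords)
    M = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]       # skip the tree itself
        M[i] = np.mean([species[j] != species[i] for j in nbrs])
    return M

# toy 6 x 6 stand: species 0 on the west half, species 1 on the east half,
# so mingling should be low (mixing occurs only along the boundary)
xy = np.array([[i, j] for i in range(6) for j in range(6)], dtype=float)
sp = np.array([0 if p[0] < 3 else 1 for p in xy])
M = mingling_index(xy, sp, k=4)
```

    A randomly interspersed stand would average near 0.5; the segregated toy stand stays well below that, which is the kind of contrast the simulated-stand comparisons exploit.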

  4. Optimization of Decision-Making for Spatial Sampling in the North China Plain, Based on Remote-Sensing a Priori Knowledge

    Science.gov (United States)

    Feng, J.; Bai, L.; Liu, S.; Su, X.; Hu, H.

    2012-07-01

In this paper, MODIS remote sensing data for the North China Plain (NCP) study region, featuring low cost, high timeliness and moderate/low spatial resolution, were first used to carry out mixed-pixel spectral decomposition, extracting from the initially selected indicators a useful regionalized indicator parameter (RIP), i.e., the available ratio (fraction/percentage) of winter wheat planting area in each pixel, as a regionalized indicator variable (RIV) for spatial sampling. Then, the RIV values were spatially analyzed to obtain the spatial structure characteristics (i.e., spatial correlation and variation) of the NCP, which were further processed into scale-fitting, valid a priori knowledge for spatial sampling. Subsequently, based on the idea of rationally integrating probability-based and model-based sampling techniques and effectively utilizing the obtained a priori knowledge, spatial sampling models and design schemes, together with their optimization and optimal selection, were developed, providing a scientific basis for improving and optimizing existing spatial sampling schemes for large-scale cropland remote sensing monitoring. Additionally, through an adaptive analysis and decision strategy, the optimal local spatial prediction and gridded extrapolation system were able to implement an adaptive reporting pattern of spatial sampling in accordance with report-covering units, so as to satisfy the actual needs of sampling surveys.

  5. Climate Change and Agricultural Productivity in Sub-Saharan Africa: A Spatial Sample Selection Model

    NARCIS (Netherlands)

    Ward, P.S.; Florax, R.J.G.M.; Flores-Lagunes, A.

    2014-01-01

    Using spatially explicit data, we estimate a cereal yield response function using a recently developed estimator for spatial error models when endogenous sample selection is of concern. Our results suggest that yields across Sub-Saharan Africa will decline with projected climatic changes, and that

  6. Studying visual and spatial reasoning for design creativity

    CERN Document Server

    2015-01-01

    Creativity and design creativity in particular are being recognized as playing an increasing role in the social and economic wellbeing of a society. As a consequence creativity is becoming a focus of research. However, much of this burgeoning research is distributed across multiple disciplines that normally do not intersect with each other and researchers in one discipline are often unaware of related research in another discipline.  This volume brings together contributions from design science, computer science, cognitive science and neuroscience on studying visual and spatial reasoning applicable to design creativity. The book is the result of a unique NSF-funded workshop held in Aix-en-Provence, France. The aim of the workshop and the resulting volume was to allow researchers in disparate disciplines to be exposed to the other’s research, research methods and research results within the context of design creativity. Fifteen of the papers presented and discussed at the workshop are contained in this volu...

  7. BAYESIAN ENTROPY FOR SPATIAL SAMPLING DESIGN OF ENVIRONMENTAL DATA

    Science.gov (United States)

Particulate Matter (PM) has been linked to widespread public health effects, including a range of serious respiratory and cardiovascular problems, and to reduced visibility in many parts of the United States; see the Environmental Protection Agency (EPA) report (2004) and relevant...

  8. The effect of short-range spatial variability on soil sampling uncertainty

    Energy Technology Data Exchange (ETDEWEB)

Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, 3508 TC Utrecht (Netherlands)], E-mail: m.vanderperk@geo.uu.nl; De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell'Ambiente e per i Servizi Tecnici (APAT), Servizio Laboratori, Misure ed Attivita di Campo, Via di Castel Romano, 100-00128 Roma (Italy); Fajgelj, Ales; Sansone, Umberto [International Atomic Energy Agency (IAEA), Agency's Laboratories Seibersdorf, A-1400 Vienna (Austria); Jeran, Zvonka; Jacimovic, Radojko [Jozef Stefan Institute, Jamova 39, 1000 Ljubljana (Slovenia)

    2008-11-15

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.

  9. The effect of short-range spatial variability on soil sampling uncertainty.

    Science.gov (United States)

    Van der Perk, Marcel; de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Sansone, Umberto; Jeran, Zvonka; Jaćimović, Radojko

    2008-11-01

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.
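
    The duplicate-sample logic behind such uncertainty estimates can be sketched as follows. The concentrations are hypothetical, and the study's actual nested design (duplicates at several stages) is not reproduced; this shows only how paired duplicates yield a relative standard sampling uncertainty.

```python
import math

def relative_sampling_uncertainty(pairs):
    """Duplicate-method sketch: the sampling standard deviation is
    estimated from paired duplicates as sqrt(sum(d_i^2) / (2 n)) and
    expressed relative to the overall mean (in %)."""
    n = len(pairs)
    mean = sum(a + b for a, b in pairs) / (2 * n)
    var = sum((a - b) ** 2 for a, b in pairs) / (2 * n)
    return 100.0 * math.sqrt(var) / mean

# hypothetical element concentrations (mg/kg) from duplicate soil cores
pairs = [(100.0, 104.0), (98.0, 101.0), (103.0, 99.0), (97.0, 100.0)]
u_rel = relative_sampling_uncertainty(pairs)
```

    The toy value falls inside the 1-5.5% range the paper reports for the agricultural site; larger short-range variability, as at the contaminated site, would inflate the paired differences and hence the uncertainty.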

  10. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    Science.gov (United States)

    Xun-Ping, W; An, Z

    2017-07-27

Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which takes plant abundance as an auxiliary variable, was explored, and an experimental study in a 50 m × 50 m plot in a marshland in the Poyang Lake region was performed. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; then, the required numbers of optimal sampling points for each layer were calculated through the Hammond McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison study was performed among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA. Results The method (SOPA) proposed in this study had the minimal absolute error of 0.2138, whereas the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.
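
    The stratified-allocation step can be illustrated with classical Neyman allocation, which samples more points in strata that are larger or more variable. Note the paper uses the Hammond McCullagh equation, which is not reproduced here, and all stratum figures below are hypothetical.

```python
def neyman_allocation(strata_sizes, strata_sds, total_n):
    """Optimal (Neyman) allocation: stratum h receives a share of the
    total sample proportional to N_h * S_h (size times standard
    deviation), so variable strata are sampled more densely."""
    weights = [N * s for N, s in zip(strata_sizes, strata_sds)]
    total = sum(weights)
    return [round(total_n * w / total) for w in weights]

# 5 hypothetical plant-abundance strata: sizes (cells) and snail-count SDs
n_per_stratum = neyman_allocation(
    strata_sizes=[400, 300, 150, 100, 50],
    strata_sds=[0.5, 1.0, 2.0, 3.0, 4.0],
    total_n=100,
)
```

    Small but highly variable strata end up sampled as heavily as large homogeneous ones, which is the efficiency gain that stratifying on plant abundance aims for.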

  11. A spatially augmented reality sketching interface for architectural daylighting design.

    Science.gov (United States)

    Sheng, Yu; Yapo, Theodore C; Young, Christopher; Cutler, Barbara

    2011-01-01

    We present an application of interactive global illumination and spatially augmented reality to architectural daylight modeling that allows designers to explore alternative designs and new technologies for improving the sustainability of their buildings. Images of a model in the real world, captured by a camera above the scene, are processed to construct a virtual 3D model. To achieve interactive rendering rates, we use a hybrid rendering technique, leveraging radiosity to simulate the interreflectance between diffuse patches and shadow volumes to generate per-pixel direct illumination. The rendered images are then projected on the real model by four calibrated projectors to help users study the daylighting illumination. The virtual heliodon is a physical design environment in which multiple designers, a designer and a client, or a teacher and students can gather to experience animated visualizations of the natural illumination within a proposed design by controlling the time of day, season, and climate. Furthermore, participants may interactively redesign the geometry and materials of the space by manipulating physical design elements and see the updated lighting simulation. © 2011 IEEE Published by the IEEE Computer Society

  12. Lagoa Real design. Description and evaluation of sampling system

    International Nuclear Information System (INIS)

    Hashizume, B.K.

    1982-10-01

This report describes the sample preparation system for drill samples from the Lagoa Real project, aimed at obtaining a representative fraction of one half of each drill core. The combined sampling-plus-analysis error and the analytical accuracy were determined by delayed neutron analysis. (author)

  13. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Science.gov (United States)

    Di Franco, Antonio; Bulleri, Fabio; Pennetta, Antonio; De Benedetto, Giuseppe; Clarke, K Robert; Guidetti, Paolo

    2014-01-01

    Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: 1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and 2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within-otolith replication in the

  14. NEON terrestrial field observations: designing continental scale, standardized sampling

    Science.gov (United States)

    R. H. Kao; C.M. Gibson; R. E. Gallery; C. L. Meier; D. T. Barnett; K. M. Docherty; K. K. Blevins; P. D. Travers; E. Azuaje; Y. P. Springer; K. M. Thibault; V. J. McKenzie; M. Keller; L. F. Alves; E. L. S. Hinckley; J. Parnell; D. Schimel

    2012-01-01

    Rapid changes in climate and land use and the resulting shifts in species distributions and ecosystem functions have motivated the development of the National Ecological Observatory Network (NEON). Integrating across spatial scales from ground sampling to remote sensing, NEON will provide data for users to address ecological responses to changes in climate, land use,...

  15. Spatial Variability of Indicators of Jiaokou Reservoir Under Different Sampling Scales

    Directory of Open Access Journals (Sweden)

    WEI Wen-juan

    2016-12-01

    Full Text Available This research determined total nitrogen, total phosphorus, ammonia nitrogen and potassium permanganate contents in different scales of Jiaokou reservoir with the purpose of exploring the applicability of spatial variability and its characteristic in different sampling scales. The results showed that, compared the sampling scales of 100 m with 200 m, there were some differences among four indicators in the spatial variation, interpolation simulation and spatial distribution. About the testing model fit, the fitting model for the total nitrogen, permanganate index was Gaussian model, the fitting model for total phosphorus, ammonia nitrogen was the spherical model; Combining evaluation of parameters of models and comprehensive evaluation of spatial interpolation, total nitrogen, total phosphorus showed stronger spatial correlation and better interpolation simulation quality on the sampling scales of 200 m, while total phosphorus and permanganate index showed certain advantages on the 100 m scale; On the aspect of spatial distributions, the contents of ammonia nitrogen and potassium permanganate were mainly affected by human factors, the total phosphorus was affected by internal factors of the reservoir, while total nitrogen was closely related to farming activities around reservoir. The above results showed that total nitrogen, ammonia nitrogen were more available for the 200 m scales and total phosphorus, potassium permanganate were more available for the 100 m scales.

  16. Design of Capillary Flows with Spatially Graded Porous Films

    Science.gov (United States)

    Joung, Young Soo; Figliuzzi, Bruno Michel; Buie, Cullen

    2013-11-01

    We have developed a new capillary tube model, consisting of multi-layered capillary tubes oriented in the direction of flow, to predict capillary speeds on spatially graded porous films. Capillary flows through thin porous media have been widely utilized for small size liquid transport systems. However, for most media it is challenging to realize arbitrary shapes and spatially functionalized micro-structures with variable flow properties. Therefore, conventional media can only be used for capillary flows obeying Washburn's equation and the modifications thereof. Given this background, we recently developed a method called breakdown anodization (BDA) to produce highly wetting porous films. The resulting surfaces show nearly zero contact angles and fast water spreading speed. Furthermore, capillary pressure and spreading diffusivity can be expressed as functions of capillary height when customized electric fields are used in BDA. From the capillary tube model, we derived a general capillary flow equation of motion in terms of capillary pressure and spreading diffusivity. The theoretical model shows good agreement with experimental capillary flows. The study will provide novel design methodologies for paper-based microfluidic devices.
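
    For reference, the classical Washburn relation that the graded-film model generalizes: imbibition length grows as the square root of time in a uniform capillary. The fluid properties below are standard values for water; the pore radius is hypothetical, and the paper's multi-layered-tube generalization is not reproduced.

```python
import math

def washburn_length(t, r, gamma, mu, theta_deg=0.0):
    """Classical Washburn imbibition length for a uniform capillary:
    L(t) = sqrt(r * gamma * cos(theta) * t / (2 * mu)),
    with pore radius r, surface tension gamma, viscosity mu, and
    contact angle theta (near zero for the highly wetting BDA films)."""
    theta = math.radians(theta_deg)
    return math.sqrt(r * gamma * math.cos(theta) * t / (2.0 * mu))

# water in a hypothetical 1-micron pore (SI units: m, N/m, Pa*s)
L1 = washburn_length(t=1.0, r=1e-6, gamma=0.072, mu=1e-3)
L4 = washburn_length(t=4.0, r=1e-6, gamma=0.072, mu=1e-3)
```

    Quadrupling the time doubles the length, the diffusive sqrt(t) scaling; a spatially graded film makes the effective r and the spreading diffusivity functions of position, which is the departure from Washburn behaviour the abstract describes.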

  17. Spatiotemporally Representative and Cost-Efficient Sampling Design for Validation Activities in Wanglang Experimental Site

    Directory of Open Access Journals (Sweden)

    Gaofei Yin

    2017-11-01

Full Text Available Spatiotemporally representative Elementary Sampling Units (ESUs) are required for capturing the temporal variations in surface spatial heterogeneity through field measurements. Since inaccessibility often coexists with heterogeneity, a cost-efficient sampling design is mandatory. We proposed a sampling strategy to generate spatiotemporally representative and cost-efficient ESUs based on the conditioned Latin hypercube sampling scheme. The proposed strategy was constrained by multi-temporal Normalized Difference Vegetation Index (NDVI) imagery, and the ESUs were limited to a sampling-feasible region established based on accessibility criteria. A novel criterion based on the Overlapping Area (OA) between the NDVI frequency distribution histogram from the sampled ESUs and that from the entire study area was used to assess sampling efficiency. A case study in Wanglang National Nature Reserve in China showed that the proposed strategy improves the spatiotemporal representativeness of sampling (mean annual OA = 74.7%) compared to the single-temporally constrained (OA = 68.7%) and random sampling (OA = 63.1%) strategies. The introduction of the feasible-region constraint significantly reduces the need for labour-intensive in-situ characterization, at the expense of about a 9% loss in the spatiotemporal representativeness of the sampling. Our study will support the validation activities in the Wanglang experimental site, providing a benchmark for locating the nodes of automatic observation systems (e.g., LAINet), which need a spatially distributed and temporally fixed sampling design.
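
    The OA criterion can be sketched as the overlap of two normalized histograms; a representative sample reproduces the area-wide NDVI distribution and scores near 1, while a biased sample scores low. The NDVI values below are simulated stand-ins, not data from the study.

```python
import numpy as np

def overlapping_area(sample_vals, population_vals, bins=20,
                     value_range=(0.0, 1.0)):
    """OA criterion: sum of per-bin minima of the two normalized NDVI
    frequency histograms (1.0 means identical distributions)."""
    hs, _ = np.histogram(sample_vals, bins=bins, range=value_range)
    hp, _ = np.histogram(population_vals, bins=bins, range=value_range)
    fs = hs / hs.sum()
    fp = hp / hp.sum()
    return float(np.minimum(fs, fp).sum())

rng = np.random.default_rng(7)
population = rng.beta(2, 5, 10_000)              # stand-in NDVI distribution
good_sample = rng.choice(population, 200)        # representative subsample
bad_sample = population[population > 0.5][:200]  # biased toward high NDVI
oa_good = overlapping_area(good_sample, population)
oa_bad = overlapping_area(bad_sample, population)
```

    In the multi-temporal case the study averages this score over the year, which is what the "mean annual OA" figures above report.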

  18. A Frequency Domain Design Method For Sampled-Data Compensators

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    A new approach to the design of a sampled-data compensator in the frequency domain is investigated. The starting point is a continuous-time compensator for the continuous-time system which satisfy specific design criteria. The new design method will graphically show how the discrete...

  19. Spatial effects, sampling errors, and task specialization in the honey bee.

    Science.gov (United States)

    Johnson, B R

    2010-05-01

Task allocation patterns should depend on the spatial distribution of work within the nest, variation in task demand, and the movement patterns of workers; however, relatively little research has focused on these topics. This study uses a spatially explicit agent-based model to determine whether such factors alone can generate biases in task performance at the individual level in the honey bee, Apis mellifera. Specialization (bias in task performance) is shown to result from strong sampling error due to localized task demand, slow worker movement relative to nest size, and strong spatial variation in task demand. To date, specialization has been interpreted primarily through the response threshold concept, which focuses on intrinsic (typically genotypic) differences between workers. Response threshold variation and sampling error due to spatial effects are not mutually exclusive, however, and this study suggests that both contribute to patterns of task bias at the individual level. While spatial effects are strong enough to explain some documented cases of specialization, they are relatively short term and do not explain long-term cases of specialization. In general, this study suggests that the spatial layout of tasks and fluctuations in their demand must be explicitly controlled for in studies focused on identifying genotypic specialists.

  20. Prospective and retrospective spatial sampling scheme to characterize geochemicals in a mine tailings area

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2009-07-01

    Full Text Available This study demonstrates that designing sampling schemes using simulated annealing results in much better selection of samples from an existing scheme in terms of prediction accuracy. The presentation to the SASA Eastern Cape Chapter as an invited...

  1. Design and implementation of distributed spatial computing node based on WPS

    International Nuclear Information System (INIS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-01-01

Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically investigates the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype of the Spatial Computing Node is implemented and the relevant verification work in this environment is completed.

  2. Sampling Design of Soil Physical Properties in a Conilon Coffee Field

    Directory of Open Access Journals (Sweden)

    Eduardo Oliveira de Jesus Santos

    Full Text Available Establishing the number of samples required to determine values of soil physical properties ultimately results in optimization of labor and allows better representation of such attributes. The objective of this study was to analyze the spatial variability of soil physical properties in a Conilon coffee field and propose a soil sampling method better attuned to conditions of the management system. The experiment was performed in a Conilon coffee field in Espírito Santo state, Brazil, under a 3.0 × 2.0 × 1.0 m (4,000 plants ha-1) double-spacing design. An irregular grid, with dimensions of 107 × 95.7 m and 65 sampling points, was set up. Soil samples were collected from the 0.00-0.20 m depth at each sampling point. Data were analyzed using descriptive statistics and geostatistics. Using statistical parameters, the number of samples adequate for analyzing the attributes under study was established, ranging from 1 to 11 sampling points. With the exception of particle density, all soil physical properties showed a spatial dependence structure best fitted by the spherical model. Establishing the number of samples and the spatial variability of soil physical properties may be useful in developing sampling strategies that minimize costs for farmers within a tolerable and predictable level of error.
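
A common way to establish the number of samples from descriptive statistics, in the spirit of this study, is the Cochran-type formula n = (t · CV / E)², where CV is the coefficient of variation and E is the allowable error as a percentage of the mean. The CV values below are illustrative assumptions, not the paper's data:

```python
import math

def sample_size(cv_percent, error_percent, t=1.96):
    """Samples needed so the estimated mean lies within ±error_percent of
    the true mean at ~95% confidence, given a coefficient of variation."""
    return math.ceil((t * cv_percent / error_percent) ** 2)

# A low-variability attribute (e.g. particle density, CV ~ 5%) versus a
# highly variable one (CV ~ 30%), both estimated to within 10% of the mean:
print(sample_size(5, 10))    # -> 1
print(sample_size(30, 10))   # -> 35
```

This reproduces the pattern reported in the abstract: required sample numbers vary widely across attributes depending on their variability.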

  3. Minimum detection limit and spatial resolution of thin-sample field-emission electron probe microanalysis

    International Nuclear Information System (INIS)

    Kubo, Yugo; Hamada, Kotaro; Urano, Akira

    2013-01-01

    The minimum detection limit and spatial resolution for a thinned semiconductor sample were determined by electron probe microanalysis (EPMA) using a Schottky field emission (FE) electron gun and wavelength dispersive X-ray spectrometry. Comparison of the FE-EPMA results with those obtained using energy dispersive X-ray spectrometry in conjunction with scanning transmission electron microscopy confirmed that FE-EPMA is largely superior in terms of detection sensitivity. Thin-sample FE-EPMA is demonstrated to be a very effective method for high-resolution, high-sensitivity analysis in a laboratory environment because a high probe current and high signal-to-noise ratio can be achieved. - Highlights: • Minimum detection limit and spatial resolution determined for FE-EPMA. • Detection sensitivity of FE-EPMA greatly superior to that of STEM-EDX. • Minimum detection limit and spatial resolution controllable by probe current

  4. Spatial scan statistics to assess sampling strategy of antimicrobial resistance monitoring programme

    DEFF Research Database (Denmark)

    Vieira, Antonio; Houe, Hans; Wegener, Henrik Caspar

    2009-01-01

    The collection and analysis of data on antimicrobial resistance in human and animal populations are important for establishing a baseline of the occurrence of resistance and for determining trends over time. In animals, targeted monitoring with a stratified sampling plan is normally used. However...... sampled by the Danish Integrated Antimicrobial Resistance Monitoring and Research Programme (DANMAP), by identifying spatial clusters of samples and detecting areas with significantly high or low sampling rates. These analyses were performed for each year and for the total 5-year study period for all...... by an antimicrobial monitoring program.

  5. Sampling design for use by the soil decontamination project

    International Nuclear Information System (INIS)

    Rutherford, D.W.; Stevens, J.R.

    1981-01-01

    This report proposes a general approach to the problem and discusses sampling of soil to map the contaminated area and to provide samples for characterization of soil components and contamination. Basic concepts in sample design are reviewed with reference to environmental transuranic studies. Common designs are reviewed and evaluated for use with specific objectives that might be required by the soil decontamination project. Examples of a hierarchical design pilot study and a combined hierarchical and grid study are proposed for the Rocky Flats 903 pad area

  6. System design description for sampling fuel in K basins

    International Nuclear Information System (INIS)

    Baker, R.B.

    1996-01-01

    This System Design Description provides: (1) statements of the Spent Nuclear Fuel Project (SNFP) needs requiring sampling of fuel in the K East and K West Basins, (2) the sampling equipment functions and requirements, (3) a general work plan and the design logic being followed to develop the equipment, and (4) a summary description of the design for the sampling equipment. The report summarizes the integrated application of both the subject equipment and the canister sludge sampler in near-term characterization campaigns at K Basins

  7. Test Sample for the Spatially Resolved Quantification of Illicit Drugs on Fingerprints Using Imaging Mass Spectrometry

    NARCIS (Netherlands)

    Muramoto, S.; Forbes, T.P.; van Asten, A.C.; Gillen, G.

    2015-01-01

    A novel test sample for the spatially resolved quantification of illicit drugs on the surface of a fingerprint using time-of-flight secondary ion mass spectrometry (ToF-SIMS) and desorption electrospray ionization mass spectrometry (DESI-MS) was demonstrated. Calibration curves relating the signal

  8. Design and development of multiple sample counting setup

    International Nuclear Information System (INIS)

    Rath, D.P.; Murali, S.; Babu, D.A.R.

    2010-01-01

    Full text: The analysis of active samples on a regular basis for ambient air activity and floor contamination from the radiochemical lab accounts for a major chunk of the operational activity under a Health Physicist's responsibility. The requirement for daily air-sample analysis with immediate and delayed counting from various labs, in addition to smear-swipe checks of the labs, led to the need for a system that could perform multiple sample analyses in a time-programmed manner from a single sample loading. A multiple alpha/beta counting system was designed and fabricated. It has arrangements for loading 10 samples in ordered slots, which are counted in a time-programmed manner with results displayed and records maintained on a PC. The paper describes the design and development of the multiple sample counting setup presently in use at the facility, which has reduced the man-hours consumed in counting and recording results

  9. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Full Text Available Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, such methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient of 95%; sample error (d) of 0.05; and a proportion (p) of 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
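
The reported sample size can be reproduced with the standard finite-population formula for estimating a proportion, n = N·z²·p(1−p) / (d²(N−1) + z²·p(1−p)). Plugging in the abstract's values (N = 1179, d = 0.05, p = 0.5, 95% confidence) recovers the minimum of 290 families:

```python
import math

def srs_sample_size(N, d=0.05, p=0.5, z=1.96):
    """Minimum simple-random-sample size for estimating a proportion p in a
    finite population of size N with margin of error d (~95% confidence)."""
    num = N * z**2 * p * (1 - p)
    den = d**2 * (N - 1) + z**2 * p * (1 - p)
    return math.ceil(num / den)

print(srs_sample_size(1179))   # -> 290, the study's reported minimum
```

Using p = 0.5 is the conservative choice: it maximizes p(1−p) and hence the required sample size.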

  10. Design of a gravity corer for near shore sediment sampling

    Digital Repository Service at National Institute of Oceanography (India)

    Bhat, S.T.; Sonawane, A.V.; Nayak, B.U.

    For the purpose of geotechnical investigation, a gravity corer has been designed and fabricated to obtain undisturbed sediment core samples from near shore waters. The corer was successfully operated at 75 stations up to a water depth of 30 m. Simplicity...

  11. Spatial issues in user interface design from a graphic design perspective

    Science.gov (United States)

    Marcus, Aaron

    1989-01-01

    The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.

  12. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well suited to this task, MSI has seen limited application in comparing thousands of spatially defined spotted samples. One reason for this is the lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. In this paper, we present the OpenMSI Arrayed Analysis Toolkit (OMAAT), a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience, yet allows experienced users to leverage all of its features. OMAAT was evaluated by analyzing an MSI data set from a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated by screening the metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. These results establish OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.

  13. ANL small-sample calorimeter system design and operation

    International Nuclear Information System (INIS)

    Roche, C.T.; Perry, R.B.; Lewis, R.N.; Jung, E.A.; Haumann, J.R.

    1978-07-01

    The Small-Sample Calorimetric System is a portable instrument designed to measure the thermal power produced by radioactive decay of plutonium-containing fuels. The small-sample calorimeter is capable of measuring samples producing power up to 32 milliwatts at a rate of one sample every 20 min. The instrument is contained in two packages: a data-acquisition module consisting of a microprocessor with an 8K-byte nonvolatile memory, and a measurement module consisting of the calorimeter and a sample preheater. The total weight of the system is 18 kg

  14. Coherent optical adaptive technique improves the spatial resolution of STED microscopy in thick samples

    Science.gov (United States)

    Yan, Wei; Yang, Yanlong; Tan, Yu; Chen, Xun; Li, Yang; Qu, Junle; Ye, Tong

    2018-01-01

    Stimulated emission depletion (STED) microscopy is a far-field optical microscopy technique that can provide sub-diffraction spatial resolution. The spatial resolution of STED microscopy is determined by the specially engineered beam profile of the depletion beam and its power. However, the beam profile of the depletion beam may be distorted by aberrations of the optical system and by inhomogeneity in the specimen's optical properties, resulting in compromised spatial resolution. The situation deteriorates when thick samples are imaged. In the worst case, severe distortion of the depletion beam profile may cause complete loss of the super-resolution effect, no matter how much depletion power is applied to the specimen. Previously, several adaptive optics approaches have been explored to compensate for aberrations of systems and specimens. However, it is hard to correct the complicated high-order optical aberrations of specimens. In this report, we demonstrate that the complicated distorted wavefront from a thick phantom sample can be measured using the coherent optical adaptive technique (COAT). The full correction can effectively maintain and improve spatial resolution when imaging thick samples. PMID:29400356

  15. Extending cluster lot quality assurance sampling designs for surveillance programs.

    Science.gov (United States)

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
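
One standard way to inflate an SRS-based LQAS sample size for clustering, consistent with the design-effect reasoning above, uses DEFF = 1 + (m − 1)·ICC. The sketch below is this generic approximation, not the authors' nonparametric procedure, and the numbers are hypothetical:

```python
import math

def lqas_cluster_size(n_srs, cluster_size, icc):
    """Inflate a simple-random-sample LQAS size by the design effect
    DEFF = 1 + (m - 1) * ICC to account for within-cluster correlation,
    where m is the number of units sampled per cluster."""
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff)

# Hypothetical example: a 33-child SRS LQAS rule, surveyed in clusters
# of 10 children with an intracluster correlation of 0.1:
print(lqas_cluster_size(33, 10, 0.1))   # -> 63
```

With one unit per cluster (m = 1) the design effect vanishes and the SRS size is recovered, which is a useful sanity check.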

  16. Experimental and Sampling Design for the INL-2 Sample Collection Operational Test

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Amidan, Brett G.; Matzke, Brett D.

    2009-02-16

    This report describes the experimental and sampling design developed to assess sampling approaches and methods for detecting contamination in a building and clearing the building for use after decontamination. An Idaho National Laboratory (INL) building will be contaminated with BG (Bacillus globigii, renamed Bacillus atrophaeus), a simulant for Bacillus anthracis (BA). The contamination, sampling, decontamination, and re-sampling will occur per the experimental and sampling design. This INL-2 Sample Collection Operational Test is being planned by the Validated Sampling Plan Working Group (VSPWG). The primary objectives are: 1) Evaluate judgmental and probabilistic sampling for characterization as well as probabilistic and combined (judgment and probabilistic) sampling approaches for clearance, 2) Conduct these evaluations for gradient contamination (from low or moderate down to absent or undetectable) for different initial concentrations of the contaminant, 3) Explore judgment composite sampling approaches to reduce sample numbers, 4) Collect baseline data to serve as an indication of the actual levels of contamination in the tests. A combined judgmental and random (CJR) approach uses Bayesian methodology to combine judgmental and probabilistic samples to make clearance statements of the form "X% confidence that at least Y% of an area does not contain detectable contamination" (X%/Y% clearance statements). The INL-2 experimental design has five test events, which 1) vary the floor of the INL building on which the contaminant will be released, 2) provide for varying the amount of contaminant released to obtain desired concentration gradients, and 3) investigate overt as well as covert release of contaminants. Desirable contaminant gradients would have moderate to low concentrations of contaminant in rooms near the release point, with concentrations down to zero in other rooms. Such gradients would provide a range of contamination levels to challenge the sampling

  17. Extending cluster Lot Quality Assurance Sampling designs for surveillance programs

    OpenAIRE

    Hund, Lauren; Pagano, Marcello

    2014-01-01

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than ...

  18. Multiobjective design of aquifer monitoring networks for optimal spatial prediction and geostatistical parameter estimation

    Science.gov (United States)

    Alzraiee, Ayman H.; Bau, Domenico A.; Garcia, Luis A.

    2013-06-01

    Effective sampling of hydrogeological systems is essential in guiding groundwater management practices. Optimal sampling of groundwater systems has previously been formulated based on the assumption that heterogeneous subsurface properties can be modeled using a geostatistical approach. Therefore, the monitoring schemes have been developed to concurrently minimize the uncertainty in the spatial distribution of systems' states and parameters, such as the hydraulic conductivity K and the hydraulic head H, and the uncertainty in the geostatistical model of system parameters using a single objective function that aggregates all objectives. However, it has been shown that the aggregation of possibly conflicting objective functions is sensitive to the adopted aggregation scheme and may lead to distorted results. In addition, the uncertainties in geostatistical parameters affect the uncertainty in the spatial prediction of K and H according to a complex nonlinear relationship, which has often been ineffectively evaluated using a first-order approximation. In this study, we propose a multiobjective optimization framework to assist the design of monitoring networks of K and H with the goal of optimizing their spatial predictions and estimating the geostatistical parameters of the K field. The framework stems from the combination of a data assimilation (DA) algorithm and a multiobjective evolutionary algorithm (MOEA). The DA algorithm is based on the ensemble Kalman filter, a Monte-Carlo-based Bayesian update scheme for nonlinear systems, which is employed to approximate the posterior uncertainty in K, H, and the geostatistical parameters of K obtained by collecting new measurements. Multiple MOEA experiments are used to investigate the trade-off among design objectives and identify the corresponding monitoring schemes. The methodology is applied to design a sampling network for a shallow unconfined groundwater system located in Rocky Ford, Colorado. Results indicate that
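
At the core of any MOEA-based design framework is the identification of non-dominated (Pareto-optimal) designs, the trade-off set the abstract refers to. A minimal sketch of that filtering step, with hypothetical monitoring schemes scored on two minimized objectives:

```python
def pareto_front(designs):
    """Return the non-dominated designs. Each design is a tuple
    (label, obj1, obj2) and both objectives are minimized."""
    front = []
    for d in designs:
        dominated = any(o[1] <= d[1] and o[2] <= d[2] and
                        (o[1] < d[1] or o[2] < d[2]) for o in designs)
        if not dominated:
            front.append(d)
    return front

# Hypothetical monitoring schemes scored on (spatial prediction variance,
# geostatistical parameter estimation variance):
schemes = [("A", 0.9, 0.2), ("B", 0.5, 0.5), ("C", 0.2, 0.9),
           ("D", 0.6, 0.6), ("E", 0.3, 0.8)]
print([s[0] for s in pareto_front(schemes)])   # -> ['A', 'B', 'C', 'E']
```

Scheme D is dominated by B (worse on both objectives), while A, B, C, and E each trade one objective against the other; presenting this front, rather than a single aggregated score, avoids the distortion the abstract attributes to aggregation schemes.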

  19. Sample design for the residential energy consumption survey

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    The purpose of this report is to provide detailed information about the multistage area-probability sample design used for the Residential Energy Consumption Survey (RECS). It is intended as a technical report, for use by statisticians, to better understand the theory and procedures followed in the creation of the RECS sample frame. For a more cursory overview of the RECS sample design, refer to the appendix entitled "How the Survey was Conducted," which is included in the statistical reports produced for each RECS survey year.

  20. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys

    OpenAIRE

    Hund, Lauren; Bedrick, Edward J.; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we comp...

  1. Design compliance matrix waste sample container filling system for nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This design compliance matrix document provides specific design related functional characteristics, constraints, and requirements for the container filling system that is part of the nested, fixed-depth sampling system. This document addresses performance, external interfaces, ALARA, Authorization Basis, environmental and design code requirements for the container filling system. The container filling system will interface with the waste stream from the fluidic pumping channels of the nested, fixed-depth sampling system and will fill containers with waste that meet the Resource Conservation and Recovery Act (RCRA) criteria for waste that contains volatile and semi-volatile organic materials. The specifications for the nested, fixed-depth sampling system are described in a Level 2 Specification document (HNF-3483, Rev. 1). The basis for this design compliance matrix document is the Tank Waste Remediation System (TWRS) desk instructions for design Compliance matrix documents (PI-CP-008-00, Rev. 0)

  2. On the Spatial and Temporal Sampling Errors of Remotely Sensed Precipitation Products

    Directory of Open Access Journals (Sweden)

    Ali Behrangi

    2017-11-01

    Full Text Available Observation with coarse spatial and temporal sampling can cause large errors in quantification of the amount, intensity, and duration of precipitation events. In this study, the errors resulting from temporal and spatial sampling of precipitation events were quantified and examined using the latest version (V4) of the Global Precipitation Measurement (GPM) mission Integrated Multi-satellitE Retrievals for GPM (IMERG), which has been available since spring 2014. Relative mean square error was calculated at 0.1° × 0.1° every 0.5 h between the degraded (temporally and spatially) and original IMERG products. The temporal and spatial degradation was performed by producing three-hour (T3), six-hour (T6), 0.5° × 0.5° (S5), and 1.0° × 1.0° (S10) maps. The results show generally larger errors over land than ocean, especially over mountainous regions. The relative error of T6 is almost 20% larger than T3 over tropical land, but is smaller in higher latitudes. Over land, the relative error of T6 is larger than S5 across all latitudes, while T6 has larger relative error than S10 poleward of 20°S–20°N. Similarly, the relative error of T3 exceeds S5 poleward of 20°S–20°N, but does not exceed S10, except in very high latitudes. Similar results are also seen over ocean, but the error ratios are generally less sensitive to seasonal changes. The results also show that the spatial and temporal relative errors are not highly correlated. Overall, lower correlations between the spatial and temporal relative errors are observed over ocean than over land. Quantification of such spatiotemporal effects provides additional insights into evaluation studies, especially when different products are cross-compared at a range of spatiotemporal scales.
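
The degradation-and-comparison procedure described above can be sketched for a single grid cell: replace each block of half-hourly values with its block mean (emulating 3-h or 6-h sampling) and compute a relative mean square error against the original series. The synthetic gamma-distributed series is an illustrative assumption, not IMERG data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic half-hourly "precipitation" series at one grid cell (48 steps = 1 day).
rain = rng.gamma(shape=0.3, scale=2.0, size=48)

def degrade_temporal(series, block):
    """Emulate coarser temporal sampling: replace each block of half-hourly
    steps by its block mean (e.g. block=6 -> 3-hourly sampling)."""
    out = series.copy()
    for i in range(0, len(series), block):
        out[i:i + block] = series[i:i + block].mean()
    return out

def relative_mse(degraded, original):
    return np.mean((degraded - original) ** 2) / np.mean(original ** 2)

t3 = relative_mse(degrade_temporal(rain, 6), rain)    # 3-hourly (T3)
t6 = relative_mse(degrade_temporal(rain, 12), rain)   # 6-hourly (T6)
print(f"T3 relative MSE: {t3:.3f}, T6 relative MSE: {t6:.3f}")
```

Because the 6-h blocks are coarser nestings of the 3-h blocks, the T6 error can never fall below T3 for the same series, matching the direction of the T3/T6 comparison in the abstract.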

  3. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Science.gov (United States)

    Pavlacky, David C; Lukacs, Paul M; Blakesley, Jennifer A; Skorkowsky, Robert C; Klute, David S; Hahn, Beth A; Dreitz, Victoria J; George, T Luke; Hanni, David J

    2017-01-01

    Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous statistical

  4. A statistically rigorous sampling design to integrate avian monitoring and management within Bird Conservation Regions.

    Directory of Open Access Journals (Sweden)

    David C Pavlacky

    Full Text Available Monitoring is an essential component of wildlife management and conservation. However, the usefulness of monitoring data is often undermined by the lack of 1) coordination across organizations and regions, 2) meaningful management and conservation objectives, and 3) rigorous sampling designs. Although many improvements to avian monitoring have been discussed, the recommendations have been slow to emerge in large-scale programs. We introduce the Integrated Monitoring in Bird Conservation Regions (IMBCR) program designed to overcome the above limitations. Our objectives are to outline the development of a statistically defensible sampling design to increase the value of large-scale monitoring data and provide example applications to demonstrate the ability of the design to meet multiple conservation and management objectives. We outline the sampling process for the IMBCR program with a focus on the Badlands and Prairies Bird Conservation Region (BCR 17). We provide two examples for the Brewer's sparrow (Spizella breweri) in BCR 17 demonstrating the ability of the design to 1) determine hierarchical population responses to landscape change and 2) estimate hierarchical habitat relationships to predict the response of the Brewer's sparrow to conservation efforts at multiple spatial scales. The collaboration across organizations and regions provided economy of scale by leveraging a common data platform over large spatial scales to promote the efficient use of monitoring resources. We designed the IMBCR program to address the information needs and core conservation and management objectives of the participating partner organizations. Although it has been argued that probabilistic sampling designs are not practical for large-scale monitoring, the IMBCR program provides a precedent for implementing a statistically defensible sampling design from local to bioregional scales. We demonstrate that integrating conservation and management objectives with rigorous

  5. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    OpenAIRE

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the co...

  6. Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model

    Science.gov (United States)

    Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.

    2017-09-01

    The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.
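
Spatial Simulated Annealing perturbs one station location at a time and accepts changes by the Metropolis criterion. The sketch below minimizes a spatially averaged simple-kriging variance under a unit-sill exponential covariance, used here as a stand-in for the space-time averaged KED variance of the study; the covariance range, station count, and cooling schedule are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def cov(h, range_par=0.3):
    # Exponential covariance with unit sill.
    return np.exp(-h / range_par)

def mean_sk_variance(gauges, grid):
    """Spatially averaged simple-kriging variance for a gauge layout."""
    d_gg = np.linalg.norm(gauges[:, None] - gauges[None, :], axis=-1)
    C = cov(d_gg) + 1e-9 * np.eye(len(gauges))   # gauge-gauge covariances
    d_pg = np.linalg.norm(grid[:, None] - gauges[None, :], axis=-1)
    c = cov(d_pg)                                # prediction-gauge covariances
    w = np.linalg.solve(C, c.T).T                # simple-kriging weights
    return np.mean(1.0 - np.sum(w * c, axis=1))

grid = np.array([(i / 9, j / 9) for i in range(10) for j in range(10)])
gauges = rng.random((8, 2))                      # initial layout in unit square
cost = mean_sk_variance(gauges, grid)

T, jitter = 0.01, 0.15
for _ in range(1500):
    cand = gauges.copy()
    k = rng.integers(len(gauges))
    cand[k] = np.clip(cand[k] + rng.normal(0, jitter, 2), 0, 1)  # move one gauge
    c = mean_sk_variance(cand, grid)
    if c < cost or rng.random() < np.exp((cost - c) / T):        # Metropolis
        gauges, cost = cand, c
    T *= 0.998                                   # cooling schedule

print(f"optimized mean kriging variance: {cost:.3f}")
```

In the full method, the non-stationary KED variance would pull gauges toward regions where covariates such as radar imagery are least informative, as the abstract's results describe.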

  7. Spatial distribution of single-nucleotide polymorphisms related to fungicide resistance and implications for sampling.

    Science.gov (United States)

    Van der Heyden, H; Dutilleul, P; Brodeur, L; Carisse, O

    2014-06-01

Spatial distribution of single-nucleotide polymorphisms (SNPs) related to fungicide resistance was studied for Botrytis cinerea populations in vineyards and for B. squamosa populations in onion fields. Heterogeneity in this distribution was characterized by performing geostatistical analyses based on semivariograms and through the fitting of discrete probability distributions. Two SNPs known to be responsible for boscalid resistance (H272R and H272Y), both located on the B subunit of the succinate dehydrogenase gene, and one SNP known to be responsible for dicarboximide resistance (I365S) were chosen for B. cinerea in grape. For B. squamosa in onion, one SNP responsible for dicarboximide resistance (I365S homologous) was chosen. One onion field was sampled in 2009 and another one was sampled in 2010 for B. squamosa, and two vineyards were sampled in 2011 for B. cinerea, for a total of four sampled sites. Cluster sampling was carried out on a 10-by-10 grid, each of the 100 nodes being the center of a 10-by-10-m quadrat. In each quadrat, 10 samples were collected and analyzed by restriction fragment length polymorphism polymerase chain reaction (PCR) or allele-specific PCR. Mean SNP incidence varied from 16 to 68%, with an overall mean incidence of 43%. In the geostatistical analyses, omnidirectional variograms showed spatial autocorrelation characterized by ranges of 21 to 1 m. Various levels of anisotropy were detected, however, with variograms computed in four directions (at 0°, 45°, 90°, and 135° from the within-row direction used as reference), indicating that spatial autocorrelation was prevalent or characterized by a longer range in one direction. For all eight data sets, the β-binomial distribution was found to fit the data better than the binomial distribution. This indicates local aggregation of fungicide resistance among sampling units, as supported by estimates of the parameter θ of the β-binomial distribution of 0.09 to 0.23 (overall median value = 0
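The β-binomial versus binomial comparison described above can be sketched generically. This is not the authors' code: it uses the mean-p/θ parameterisation (α = p/θ, β = (1−p)/θ) common in incidence studies, a crude grid search for θ, and invented per-quadrat counts.

```python
import math

def log_binom_pmf(k, n, p):
    """Binomial log-pmf via log-gamma for numerical stability."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

def log_betabinom_pmf(k, n, p, theta):
    """Beta-binomial log-pmf with mean p and aggregation parameter theta
    (alpha = p/theta, beta = (1 - p)/theta)."""
    a, b = p / theta, (1 - p) / theta
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + math.lgamma(k + a) + math.lgamma(n - k + b) - math.lgamma(n + a + b)
            - math.lgamma(a) - math.lgamma(b) + math.lgamma(a + b))

def fit_aggregation(counts, n):
    """Compare binomial and beta-binomial fits to per-quadrat counts of
    resistant isolates (n isolates per quadrat). A better beta-binomial
    likelihood with theta > 0 indicates local aggregation of resistance."""
    p = sum(counts) / (len(counts) * n)
    ll_bin = sum(log_binom_pmf(k, n, p) for k in counts)
    best_theta, best_ll = None, -math.inf
    for theta in (t / 100 for t in range(1, 100)):  # crude grid search on theta
        ll = sum(log_betabinom_pmf(k, n, p, theta) for k in counts)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return p, best_theta, ll_bin, best_ll
```

On strongly overdispersed counts (some quadrats nearly all resistant, others nearly none) the beta-binomial likelihood clearly beats the binomial, mirroring the aggregation reported in the record.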

  8. Detecting the Land-Cover Changes Induced by Large-Physical Disturbances Using Landscape Metrics, Spatial Sampling, Simulation and Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Hone-Jay Chu

    2009-08-01

Full Text Available The objectives of the study are to integrate conditional Latin Hypercube Sampling (cLHS), sequential Gaussian simulation (SGS) and spatial analysis in remotely sensed images, to monitor the effects of large chronological disturbances on spatial characteristics of landscape changes including spatial heterogeneity and variability. The multiple NDVI images demonstrate that spatial patterns of disturbed landscapes were successfully delineated by spatial analyses such as the variogram, Moran's I and landscape metrics in the study area. The hybrid method delineates the spatial patterns and spatial variability of landscapes caused by these large disturbances. The cLHS approach is applied to select samples from Normalized Difference Vegetation Index (NDVI) images derived from SPOT HRV images in the Chenyulan watershed of Taiwan, and then SGS with sufficient samples is used to generate maps of NDVI images. Finally, the simulated NDVI maps are verified using indices such as the correlation coefficient and the mean absolute error (MAE). Therefore, the statistics and spatial structures of multiple NDVI images present a very robust behavior, which advocates the use of the index for the quantification of landscape spatial patterns and land cover change. In addition, the results, transferred via Open Geospatial techniques, can be accessed from web-based and end-user applications for watershed management.
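The variogram analysis mentioned above can be illustrated with a minimal empirical semivariogram. This is a generic sketch, not tied to the study's NDVI data; the lag bins, tolerance, and synthetic trend surface are illustrative.

```python
import math

def empirical_variogram(points, values, lags, tol):
    """Empirical semivariogram: for each lag bin, average half the squared
    value difference over all point pairs whose separation distance falls
    within tol of that lag."""
    sums = [0.0] * len(lags)
    counts = [0] * len(lags)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            h = math.dist(points[i], points[j])
            for b, lag in enumerate(lags):
                if abs(h - lag) <= tol:
                    sums[b] += 0.5 * (values[i] - values[j]) ** 2
                    counts[b] += 1
                    break
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]
```

On a synthetic 10-by-10 grid with a linear trend z = x + y, the semivariance grows with lag, which is the signature of spatial structure that variogram analysis detects.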

  9. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

...processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside...

  10. Spatial distribution of grape root borer (Lepidoptera: Sesiidae) infestations in Virginia vineyards and implications for sampling.

    Science.gov (United States)

    Rijal, J P; Brewster, C C; Bergh, J C

    2014-06-01

Grape root borer, Vitacea polistiformis (Harris) (Lepidoptera: Sesiidae) is a potentially destructive pest of grape vines, Vitis spp. in the eastern United States. After feeding on grape roots for ≈2 yr in Virginia, larvae pupate beneath the soil surface around the vine base. Adults emerge during July and August, leaving empty pupal exuviae on or protruding from the soil. Weekly collections of pupal exuviae from an ≈1-m-diameter weed-free zone around the base of a grid of sample vines in Virginia vineyards were conducted in July and August, 2008-2012, and their distribution was characterized using both nonspatial (dispersion) and spatial techniques. Taylor's power law showed a significant aggregation of pupal exuviae, based on data from 19 vineyard blocks. Combined use of geostatistical and Spatial Analysis by Distance IndicEs methods indicated evidence of an aggregated pupal exuviae distribution pattern in seven of the nine blocks used for those analyses. Grape root borer pupal exuviae exhibited spatial dependency within a mean distance of 8.8 m, based on the range values of best-fitted variograms. Interpolated and clustering index-based infestation distribution maps were developed to show the spatial pattern of the insect within the vineyard blocks. The temporal distribution of pupal exuviae showed that the majority of moths emerged during the 3-wk period spanning the third week of July and the first week of August. The spatial distribution of grape root borer pupal exuviae was used in combination with temporal moth emergence patterns to develop a quantitative and efficient sampling scheme to assess infestations.
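Taylor's power law, used above to test for aggregation, fits log(variance) against log(mean) across sampling blocks; a slope b greater than 1 indicates aggregation. A minimal least-squares sketch (function name and the synthetic block statistics are illustrative, not the study's data):

```python
import math

def taylor_power_law(block_stats):
    """Least-squares fit of Taylor's power law, log(s2) = log(a) + b*log(m),
    over per-block (mean, variance) pairs. A slope b > 1 indicates an
    aggregated spatial distribution; b = 1 is consistent with randomness."""
    xs = [math.log(m) for m, _ in block_stats]
    ys = [math.log(v) for _, v in block_stats]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b
```

For block statistics generated exactly as variance = 2·mean^1.5, the fit recovers a = 2 and b = 1.5.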

  11. Analysis of spatial patterns informs community assembly and sampling requirements for Collembola in forest soils

    Science.gov (United States)

    Dirilgen, Tara; Juceviča, Edite; Melecis, Viesturs; Querner, Pascal; Bolger, Thomas

    2018-01-01

The relative importance of niche separation, non-equilibrial and neutral models of community assembly has been a theme in community ecology for many decades with none appearing to be applicable under all circumstances. In this study, Collembola species abundances were recorded over eleven consecutive years in a spatially explicit grid and used to examine (i) whether observed beta diversity differed from that expected under conditions of neutrality, (ii) whether sampling points differed in their relative contributions to overall beta diversity, and (iii) the number of samples required to provide comparable estimates of species richness across three forest sites. Neutrality could not be rejected for 26 of the 33 forest-by-year combinations. However, there is a trend toward greater structure in the oldest forest, where beta diversity was greater than predicted by neutrality on five of the eleven sampling dates. The lack of difference in individual- and sample-based rarefaction curves also suggests randomness in the system at this particular scale of investigation. It seems that Collembola communities are not spatially aggregated and assembly is driven primarily by neutral processes, particularly in the two younger sites. Whether this finding is due to small sample size or unaccounted-for environmental variables cannot be determined. Variability between dates and sites illustrates the potential of drawing incorrect conclusions if data are collected at a single site and a single point in time.

  12. Effects of Spatial Sampling Interval on Roughness Parameters and Microwave Backscatter over Agricultural Soil Surfaces

    Directory of Open Access Journals (Sweden)

    Matías Ernesto Barber

    2016-06-01

Full Text Available The spatial sampling interval, as related to the ability to digitize a soil profile with a certain number of features per unit length, depends on the profiling technique itself. From a variety of profiling techniques, roughness parameters are estimated at different sampling intervals. Since soil profiles have continuous spectral components, it is clear that roughness parameters are influenced by the sampling interval of the measurement device employed. In this work, we address the question of which sampling interval profiles need to be measured at to accurately account for the microwave response of agricultural surfaces. For this purpose, a 2-D laser profiler was built and used to measure surface soil roughness at field scale over agricultural sites in Argentina. Sampling intervals ranged from large (50 mm) to small (1 mm), with several intermediate values. Large- and intermediate-sampling-interval profiles were synthetically derived from the nominal, 1 mm ones. With these data, the effect of sampling-interval-dependent roughness parameters on backscatter response was assessed using the theoretical backscatter model IEM2M. Simulations demonstrated that variations of roughness parameters depended on the working wavelength and were less important at L-band than at C- or X-band. In any case, an underestimation of the backscattering coefficient of about 1-4 dB was observed at larger sampling intervals. As a general rule, a sampling interval of 15 mm can be recommended for L-band and 5 mm for C-band.
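The roughness parameters discussed above (RMS height and correlation length) and the effect of coarsening the sampling interval can be sketched generically. The 1/e correlation-length criterion is one common convention, and the synthetic sinusoidal profile stands in for field data; nothing here reproduces the study's profiler or IEM2M simulations.

```python
import math

def roughness_params(profile, dx):
    """RMS height s and correlation length l of a height profile sampled at
    interval dx; l is taken as the first lag at which the normalised
    autocorrelation drops below 1/e (one common convention)."""
    n = len(profile)
    mean = sum(profile) / n
    z = [h - mean for h in profile]
    var = sum(v * v for v in z) / n
    s = math.sqrt(var)
    for lag in range(1, n):
        rho = sum(z[i] * z[i + lag] for i in range(n - lag)) / ((n - lag) * var)
        if rho < 1 / math.e:
            return s, lag * dx
    return s, n * dx

def subsample(profile, step):
    """Emulate a coarser sampling interval by keeping every step-th point."""
    return profile[::step]
```

On a smooth synthetic profile the RMS height is nearly unchanged by subsampling while the correlation length can only be resolved to the coarser lag spacing, illustrating why estimated parameters depend on the measurement interval.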

  13. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Directory of Open Access Journals (Sweden)

    Lauren Hund

Full Text Available Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  14. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    Science.gov (United States)

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.
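For the classic, non-clustered LQAS decision rule that both records take as their starting point, the two misclassification risks follow directly from the binomial distribution. A sketch (the n = 19, d = 13 design with 80%/50% coverage thresholds is a standard textbook example, not taken from this paper):

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_errors(n, d, p_high, p_low):
    """Classic (non-clustered) LQAS decision rule: accept the lot when more
    than d of the n sampled individuals are covered. Returns (alpha, beta):
    alpha is the risk of rejecting a lot whose true coverage is p_high;
    beta is the risk of accepting a lot whose true coverage is p_low."""
    alpha = binom_cdf(d, n, p_high)      # X <= d despite high coverage
    beta = 1 - binom_cdf(d, n, p_low)    # X > d despite low coverage
    return alpha, beta
```

For n = 19, d = 13 with 80%/50% thresholds this gives alpha of about 0.16 and beta of about 0.03; the cluster designs compared in the records replace this binomial model with ones that accommodate within-cluster correlation.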

  15. Within-otolith variability in chemical fingerprints: implications for sampling designs and possible environmental interpretation.

    Directory of Open Access Journals (Sweden)

    Antonio Di Franco

Full Text Available Largely used as a natural biological tag in studies of dispersal/connectivity of fish, otolith elemental fingerprinting is usually analyzed by laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). LA-ICP-MS produces an elemental fingerprint at a discrete time-point in the life of a fish and can generate data on within-otolith variability of that fingerprint. The presence of within-otolith variability has been previously acknowledged but not incorporated into experimental designs on the presumed, but untested, grounds of both its negligibility compared to among-otolith variability and of spatial autocorrelation among multiple ablations within an otolith. Here, using a hierarchical sampling design of spatial variation at multiple scales in otolith chemical fingerprints for two Mediterranean coastal fishes, we explore: (1) whether multiple ablations within an otolith can be used as independent replicates for significance tests among otoliths, and (2) the implications of incorporating within-otolith variability when assessing spatial variability in otolith chemistry at a hierarchy of spatial scales (different fish, from different sites, at different locations on the Apulian Adriatic coast). We find that multiple ablations along the same daily rings do not necessarily exhibit spatial dependency within the otolith and can be used to estimate residual variability in a hierarchical sampling design. Inclusion of within-otolith measurements reveals that individuals at the same site can show significant variability in elemental uptake. Within-otolith variability examined across the spatial hierarchy identifies differences between the two fish species investigated, and this finding leads to discussion of the potential for within-otolith variability to be used as a marker for fish exposure to stressful conditions. We also demonstrate that a 'cost'-optimal allocation of sampling effort should typically include some level of within

  16. Sample design considerations of indoor air exposure surveys

    International Nuclear Information System (INIS)

    Cox, B.G.; Mage, D.T.; Immerman, F.W.

    1988-01-01

Concern about the potential for indoor air pollution has prompted recent surveys of radon and NO2 concentrations in homes and personal exposure studies of volatile organics, carbon monoxide and pesticides, to name a few. The statistical problems in designing sample surveys that measure the physical environment are diverse and more complicated than those encountered in traditional surveys of human attitudes and attributes. This paper addresses issues encountered when designing indoor air quality (IAQ) studies. General statistical concepts related to target population definition, frame creation, and sample selection for area household surveys and telephone surveys are presented. The implications of different measurement approaches are discussed, and response rate considerations are described.

  17. Mechanical design and simulation of an automatized sample exchanger

    International Nuclear Information System (INIS)

    Lopez, Yon; Gora, Jimmy; Bedregal, Patricia; Hernandez, Yuri; Baltuano, Oscar; Gago, Javier

    2013-01-01

A turntable-type sample exchanger for irradiation, with a capacity of up to 20 capsules, was designed. Its function is to automatically send samples contained in polyethylene capsules to the grid position of the reactor core for irradiation, using a pneumatic system, with subsequent analysis by neutron activation. This study presents the structural design analysis and the calculations used to select the motors and actuators. This development will improve analytical efficiency, reducing manual handling by the workers and also their radiation exposure time. (authors).

  18. Characterization of spatial distribution of Tetranychus urticae in peppermint in California and implication for improving sampling plan.

    Science.gov (United States)

    Rijal, Jhalendra P; Wilson, Rob; Godfrey, Larry D

    2016-02-01

Twospotted spider mite, Tetranychus urticae Koch, is an important pest of peppermint in California, USA. Spider mite feeding on peppermint leaves causes physiological changes in the plant, which, coupled with favorable environmental conditions, can lead to increased mite infestations. Significant yield loss can occur in the absence of pest monitoring and timely management. Understanding the within-field spatial distribution of T. urticae is critical for the development of a reliable sampling plan. The study reported here aims to characterize the spatial distribution of mite infestation in four commercial peppermint fields in northern California using two spatial techniques, the variogram and Spatial Analysis by Distance IndicEs (SADIE). Variogram analysis revealed strong evidence for a spatially dependent (aggregated) mite population on 13 of 17 sampling dates, and the physical distance of the aggregation reached a maximum of 7 m in peppermint fields. Using SADIE, 11 of 17 sampling dates showed an aggregated distribution pattern of mite infestation. Combining the results of the variogram and SADIE analyses, the spatial aggregation of T. urticae was evident in all four fields for all 17 sampling dates evaluated. Comparing spatial association using SADIE, ca. 62% of the total sampling pairs showed a positive association of mite spatial distribution patterns between two consecutive sampling dates, which indicates strong spatial and temporal stability of mite infestation in peppermint fields. These results are discussed in relation to the behavior of spider mite distribution within fields, and their implications for improving sampling guidelines that are essential for effective pest monitoring and management.

  19. Spatial distribution and sequential sampling plans for Tuta absoluta (Lepidoptera: Gelechiidae) in greenhouse tomato crops.

    Science.gov (United States)

    Cocco, Arturo; Serra, Giuseppe; Lentini, Andrea; Deliperi, Salvatore; Delrio, Gavino

    2015-09-01

The within- and between-plant distribution of the tomato leafminer, Tuta absoluta (Meyrick), was investigated in order to define action thresholds based on leaf infestation and to propose enumerative and binomial sequential sampling plans for pest management applications in protected crops. The pest spatial distribution was aggregated between plants, and median leaves were the most suitable sample to evaluate the pest density. Action thresholds of 36 and 48%, 43 and 56% and 60 and 73% infested leaves, corresponding to economic thresholds of 1 and 3% damaged fruits, were defined for tomato cultivars with big, medium and small fruits respectively. Green's method was a more suitable enumerative sampling plan as it required a lower sampling effort. Binomial sampling plans needed lower average sample sizes than enumerative plans to make a treatment decision, with probabilities of error of …. The enumerative sampling plan required 87 or 343 leaves to estimate the population density in extensive or intensive ecological studies respectively. Binomial plans would be more practical and efficient for control purposes, needing average sample sizes of 17, 20 and 14 leaves to take a pest management decision in order to avoid fruit damage higher than 1% in cultivars with big, medium and small fruits respectively. © 2014 Society of Chemical Industry.

  20. Spatially explicit population estimates for black bears based on cluster sampling

    Science.gov (United States)

    Humm, J.; McCown, J. Walter; Scheick, B.K.; Clark, Joseph D.

    2017-01-01

    We estimated abundance and density of the 5 major black bear (Ursus americanus) subpopulations (i.e., Eglin, Apalachicola, Osceola, Ocala-St. Johns, Big Cypress) in Florida, USA with spatially explicit capture-mark-recapture (SCR) by extracting DNA from hair samples collected at barbed-wire hair sampling sites. We employed a clustered sampling configuration with sampling sites arranged in 3 × 3 clusters spaced 2 km apart within each cluster and cluster centers spaced 16 km apart (center to center). We surveyed all 5 subpopulations encompassing 38,960 km2 during 2014 and 2015. Several landscape variables, most associated with forest cover, helped refine density estimates for the 5 subpopulations we sampled. Detection probabilities were affected by site-specific behavioral responses coupled with individual capture heterogeneity associated with sex. Model-averaged bear population estimates ranged from 120 (95% CI = 59–276) bears or a mean 0.025 bears/km2 (95% CI = 0.011–0.44) for the Eglin subpopulation to 1,198 bears (95% CI = 949–1,537) or 0.127 bears/km2 (95% CI = 0.101–0.163) for the Ocala-St. Johns subpopulation. The total population estimate for our 5 study areas was 3,916 bears (95% CI = 2,914–5,451). The clustered sampling method coupled with information on land cover was efficient and allowed us to estimate abundance across extensive areas that would not have been possible otherwise. Clustered sampling combined with spatially explicit capture-recapture methods has the potential to provide rigorous population estimates for a wide array of species that are extensive and heterogeneous in their distribution.

  1. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

Bi-variable and multi-variable binary logistic regression models with a complex sampling design were fitted. Data were entered into STATA-12 and analyzed using SPSS-21.

  2. Optimizing incomplete sample designs for item response model parameters

    NARCIS (Netherlands)

    van der Linden, Willem J.

    Several models for optimizing incomplete sample designs with respect to information on the item parameters are presented. The following cases are considered: (1) known ability parameters; (2) unknown ability parameters; (3) item sets with multiple ability scales; and (4) response models with

  3. Insights into a spatially embedded social network from a large-scale snowball sample

    Science.gov (United States)

    Illenberger, J.; Kowald, M.; Axhausen, K. W.; Nagel, K.

    2011-12-01

Much research has been conducted to obtain insights into the basic laws governing human travel behaviour. While the traditional travel survey has long been the main source of travel data, recent approaches using GPS data, mobile phone data, or the circulation of bank notes as proxies for human travel behaviour are promising. The present study proposes a further source of such proxy data: the social network. We collect data using an innovative snowball sampling technique to obtain details on the structure of a leisure-contacts network. We analyse the network with respect to its topology, the individuals' characteristics, and its spatial structure. We further show that multiplying the functions describing the spatial distribution of leisure contacts and the frequency of physical contacts results in a trip distribution that is consistent with data from the Swiss travel survey.

  4. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  5. The Effects of Computer-Aided Design Software on Engineering Students' Spatial Visualisation Skills

    Science.gov (United States)

    Kösa, Temel; Karakus, Fatih

    2018-01-01

    The purpose of this study was to determine the influence of computer-aided design (CAD) software-based instruction on the spatial visualisation skills of freshman engineering students in a computer-aided engineering drawing course. A quasi-experimental design was applied, using the Purdue Spatial Visualization Test-Visualization of Rotations…

  6. Visualisation and research strategy for computational spatial and structural design interaction

    NARCIS (Netherlands)

    Peeten, D.; Hofmeyer, H.; Thabet, W

    2010-01-01

    A research engine is under development for studying the interaction of spatial and structural design processes. The design processes are being implemented as two separate configurable transformation steps; a conversion step and an optimisation step. A significant part of the spatial-to-structural

  7. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Science.gov (United States)

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

Geostatistical stochastic simulation is usually combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running times of spatially explicit forest models, a result of their complexity, it is often infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
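Plain Latin hypercube sampling, the building block named in this record's title, can be sketched in a few lines. This is the generic stratified-permutation construction, not the conditioned (cLHS) variant or the authors' coupling with geostatistical simulation.

```python
import random

def latin_hypercube(n, dims, rng=random):
    """n Latin hypercube samples in [0, 1)^dims: each axis is split into n
    equal strata, and an independent random permutation assigns exactly one
    stratum per axis to each sample, guaranteeing marginal coverage."""
    strata = []
    for _ in range(dims):
        perm = list(range(n))
        rng.shuffle(perm)
        strata.append(perm)
    return [[(strata[d][i] + rng.random()) / n for d in range(dims)]
            for i in range(n)]
```

Because every stratum of every axis is used exactly once, far fewer samples are needed to cover the parameter space than with simple random sampling, which is why LHS is attractive when each model run is expensive.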

  8. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    Science.gov (United States)

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  9. A census-weighted, spatially-stratified household sampling strategy for urban malaria epidemiology

    Directory of Open Access Journals (Sweden)

    Slutsker Laurence

    2008-02-01

Full Text Available Abstract Background: Urban malaria is likely to become increasingly important as a consequence of the growing proportion of Africans living in cities. A novel sampling strategy was developed for urban areas to generate a sample simultaneously representative of population and inhabited environments. Such a strategy should facilitate analysis of important epidemiological relationships in this ecological context. Methods: Census maps and summary data for Kisumu, Kenya, were used to create a pseudo-sampling frame using the geographic coordinates of census-sampled structures. For every enumeration area (EA) designated as urban by the census (n = 535), a sample of structures equal to one-tenth the number of households was selected. In EAs designated as rural (n = 32), a geographically random sample totalling one-tenth the number of households was selected from a grid of points at 100 m intervals. The selected samples were cross-referenced to a geographic information system, and coordinates transferred to handheld global positioning units. Interviewers found the closest eligible household to the sampling point and interviewed the caregiver of a child aged … Results: 4,336 interviews were completed in 473 of the 567 study area EAs from June 2002 through February 2003. EAs without completed interviews were randomly distributed, and non-response was approximately 2%. Mean distance from the assigned sampling point to the completed interview was 74.6 m, and was significantly less in urban than rural EAs, even when controlling for number of households. The selected sample had significantly more children and females of childbearing age than the general population, and fewer older individuals. Conclusion: This method selected a sample that was simultaneously population-representative and inclusive of important environmental variation. The use of a pseudo-sampling frame and pre-programmed handheld GPS units is more efficient and may yield a more complete sample than

  10. Sampling-Based Motion Planning Algorithms for Replanning and Spatial Load Balancing

    Energy Technology Data Exchange (ETDEWEB)

    Boardman, Beth Leigh [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-12

    The common theme of this dissertation is sampling-based motion planning, with the two key contributions being in the areas of replanning and spatial load balancing for robotic systems. Here, we begin by recalling two sampling-based motion planners: the asymptotically optimal rapidly-exploring random tree (RRT*), and the asymptotically optimal probabilistic roadmap (PRM*). We also provide a brief background on collision cones and the Distributed Reactive Collision Avoidance (DRCA) algorithm. The next four chapters detail novel contributions for motion replanning in environments with unexpected static obstacles, for multi-agent collision avoidance, and for spatial load balancing. First, we show improved performance of the RRT* when using the proposed Grandparent-Connection (GP) or Focused-Refinement (FR) algorithms. Next, the Goal Tree algorithm for replanning with unexpected static obstacles is detailed and proven to be asymptotically optimal. A multi-agent collision avoidance problem in obstacle environments is approached via the RRT*, leading to the novel Sampling-Based Collision Avoidance (SBCA) algorithm. The SBCA algorithm is proven to guarantee collision-free trajectories for all of the agents, even when subject to uncertainties in the knowledge of the other agents' positions and velocities. Given that a solution exists, we prove that livelocks and deadlocks will lead to the cost to the goal being decreased. We introduce a new deconfliction maneuver that decreases the cost-to-come at each step. This new maneuver removes the possibility of livelocks and allows a result to be formed that proves convergence to the goal configurations. Finally, we present a limited-range Graph-based Spatial Load Balancing (GSLB) algorithm which fairly divides a non-convex space among multiple agents that are subject to differential constraints and have a limited travel distance. The GSLB is proven to converge to a solution when maximizing the area covered by the agents. The analysis
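
    The sampling-based planners recalled above share a common core: sample a random configuration, extend the nearest tree node toward it, and keep the new node if it is collision-free. A minimal RRT sketch follows (a generic illustration, not the dissertation's code; the 2D workspace, circular obstacles and point-wise collision check are assumptions). RRT* would additionally rewire each new node's neighborhood to lower path cost, a step omitted here for brevity.

    ```python
    import math
    import random

    def rrt(start, goal, obstacles, n_iter=2000, step=0.5, goal_tol=0.5):
        """Grow a rapidly-exploring random tree from start toward goal.

        obstacles: list of (cx, cy, r) circles (a hypothetical collision model;
        only the new node is checked, edge checking is omitted).
        Returns the path as a list of points, or None if no path was found.
        """
        nodes = [start]
        parent = {0: None}

        def collision_free(p):
            return all(math.dist(p, (cx, cy)) > r for cx, cy, r in obstacles)

        for _ in range(n_iter):
            q_rand = (random.uniform(0, 10), random.uniform(0, 10))
            # Nearest node in the tree (linear scan; kd-trees are used in practice).
            i_near = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], q_rand))
            q_near = nodes[i_near]
            d = math.dist(q_near, q_rand)
            q_new = (q_near[0] + step * (q_rand[0] - q_near[0]) / d,
                     q_near[1] + step * (q_rand[1] - q_near[1]) / d) if d > step else q_rand
            if not collision_free(q_new):
                continue
            nodes.append(q_new)
            parent[len(nodes) - 1] = i_near
            if math.dist(q_new, goal) < goal_tol:
                # Walk parents back to the root to recover the path.
                path, i = [], len(nodes) - 1
                while i is not None:
                    path.append(nodes[i])
                    i = parent[i]
                return path[::-1]
        return None
    ```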

  11. Study and design of a very high spatial resolution beta imaging system

    International Nuclear Information System (INIS)

    Donnard, J.

    2008-01-01

    Beta autoradiography is a technique widely used in pharmacology and biology. It can locate, in two dimensions, molecules labeled with beta emitters. The development of a gaseous detector incorporating a micro-mesh, called PIM, in the Subatech laboratory led to the construction of a very high spatial resolution apparatus dedicated to beta imaging. This device is devoted to small analysis surfaces of half a microscope slide, in particular for 3H or 14C, and the measured spatial resolution is 20 μm FWHM. The recent development of a new reconstruction method allows the field of investigation to be enlarged to high-energy beta emitters such as 131I, 18F or 46Sc. A new device with a large active area of 18 × 18 cm2 has been built with a user-friendly design. This allows 10 microscope slides to be imaged simultaneously. Thanks to a multi-modality solution, it retains the good spatial resolution characteristics obtained previously on a small surface. Moreover, different kinds of samples, such as microscope slides or adhesive tapes, can be analysed. The simulation and experimentation work achieved during this thesis led to an optimal disposition of the inner structure of the detector. These results and characterization show that the PIM structure should be considered for a next generation of beta imagers. (author)

  12. Development of spatial scaling technique of forest health sample point information

    Science.gov (United States)

    Lee, J.; Ryu, J.; Choi, Y. Y.; Chung, H. I.; Kim, S. H.; Jeon, S. W.

    2017-12-01

    Most forest health assessments are limited to monitoring at sampling sites. Forest health monitoring in Britain was carried out mainly on five species (Norway spruce, Sitka spruce, Scots pine, oak and beech), with a database constructed using the Oracle database program. The forest health assessment in Great Bay in the United States was conducted to identify the characteristics of the ecosystem populations of each area, based on the evaluation of forest health by tree species, diameter at breast height, water pipe and density in the summer and fall of 200. In the case of Korea, in the first evaluation report on forest health vitality, 1,000 sample points were placed in the forests using a systematic method of arranging points at regular 4 km × 4 km intervals, and 29 items in four categories (tree health, vegetation, soil, and atmosphere) were surveyed. As mentioned above, existing research has been carried out through the monitoring of survey sample points, and it is difficult to collect information to support policies customized to regional survey sites. In the case of special forests, such as urban forests and major forests, policy and management appropriate to the forest characteristics are needed. Therefore, it is necessary to expand the survey sites for diagnosis and evaluation of customized forest health. For this reason, we constructed a method of spatial scaling through spatial interpolation according to the characteristics of each of the 29 indices in the diagnosis and evaluation sections of the first forest health vitality report. PCA and correlation analyses are conducted to select significant indicators, weights are then assigned to each index, and forest health is evaluated through statistical grading.
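
    Spatial interpolation of point-based indices, as described above, can be illustrated with inverse-distance weighting (IDW), one common interpolator for upscaling sample-point values to unsampled locations. This is a generic sketch under that assumption; the report does not specify which interpolation method was used.

    ```python
    def idw(points, values, query, power=2.0):
        """Inverse-distance-weighted estimate at `query` from sample-point data.

        points: list of (x, y) sample-point coordinates
        values: index value measured at each sample point
        power:  distance-decay exponent (2 is the usual default)
        """
        num = den = 0.0
        for (x, y), v in zip(points, values):
            d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
            if d2 == 0.0:
                return v  # query coincides with a sample point
            w = 1.0 / d2 ** (power / 2.0)  # weight = 1 / distance**power
            num += w * v
            den += w
        return num / den
    ```

    With a 4 km grid of sample points, calling `idw` over a fine raster of query locations yields a continuous index surface that can then be graded statistically.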

  13. ACS sampling system: design, implementation, and performance evaluation

    Science.gov (United States)

    Di Marcantonio, Paolo; Cirami, Roberto; Chiozzi, Gianluca

    2004-09-01

    By means of the ACS (ALMA Common Software) framework we designed and implemented a sampling system which allows sampling of every Characteristic Component Property with a specific, user-defined, sustained frequency limited only by the hardware. Collected data are sent to various clients (one or more Java plotting widgets, a dedicated GUI or a COTS application) using the ACS/CORBA Notification Channel. The data transport is optimized: samples are cached locally and sent in packets at a lower, user-defined frequency to keep the network load under control. Simultaneous sampling of the Properties of different Components is also possible. Together with the design and implementation issues, we present the performance of the sampling system evaluated on two different platforms: on a VME-based system using the VxWorks RTOS (currently adopted by ALMA) and on a PC/104+ embedded platform using the Red Hat 9 Linux operating system. The PC/104+ solution offers, as an alternative, a low-cost PC-compatible hardware environment with a free and open operating system.
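
    The transport optimization described (cache samples locally, forward them in packets at a lower, user-defined rate) can be sketched as a simple buffering class. This is a generic illustration of the idea, not the ACS API; all names are invented.

    ```python
    class SampleBuffer:
        """Cache samples locally and emit them in packets of `packet_every` samples,
        so a high sustained sampling rate maps to a lower network packet rate."""

        def __init__(self, packet_every, send):
            self.packet_every = packet_every  # samples per network packet
            self.send = send                  # callback that delivers one packet
            self._cache = []

        def add_sample(self, timestamp, value):
            self._cache.append((timestamp, value))
            if len(self._cache) >= self.packet_every:
                self.flush()

        def flush(self):
            if self._cache:
                self.send(list(self._cache))
                self._cache.clear()

    # Usage: sample at 1 kHz, ship packets of 100 samples (a 10 Hz network rate).
    packets = []
    buf = SampleBuffer(100, packets.append)
    for t in range(250):
        buf.add_sample(t, t * 0.1)
    buf.flush()  # deliver the 50 leftover samples
    ```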

  14. The Study on Mental Health at Work: Design and sampling.

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
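
    The two-stage cluster draw described (municipalities first, then addresses within the selected municipalities) can be sketched as follows. The register, municipality counts and address lists here are fabricated toy data, not the S-MGA employment register.

    ```python
    import random

    def two_stage_sample(clusters, n_clusters, n_units, rng):
        """Stage 1: randomly select clusters; stage 2: sample units within each.

        clusters: dict mapping cluster id -> list of unit ids (e.g. addresses)
        Returns a list of (cluster id, unit id) pairs.
        """
        chosen = rng.sample(sorted(clusters), n_clusters)
        sample = []
        for c in chosen:
            units = clusters[c]
            k = min(n_units, len(units))
            sample.extend((c, u) for u in rng.sample(units, k))
        return sample

    rng = random.Random(42)
    # Toy register: 10 "municipalities" with 20-80 addresses each (hypothetical).
    register = {m: [f"addr{m}_{i}" for i in range(rng.randint(20, 80))]
                for m in range(10)}
    drawn = two_stage_sample(register, n_clusters=3, n_units=5, rng=rng)
    ```

    Clustering the draw this way concentrates fieldwork in a manageable number of municipalities, at the cost of a design effect that the analysis must account for.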

  15. The Study on Mental Health at Work: Design and sampling

    Science.gov (United States)

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-01-01

    Aims: The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. Methods: S-MGA is a representative study of German employees aged 31–60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing population, gross sample and respondents. Results: In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. Conclusions: There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment. PMID:28673202

  16. Towards spatial isolation design in a multi-core real-time kernel targeting safety-critical applications

    DEFF Research Database (Denmark)

    Li, Gang; Top, Søren

    2013-01-01

    Partitioning can prevent fault propagation among mixed-criticality applications, if spatial and temporal isolation are adequately ensured. This paper focuses on the solution of spatial isolation in the HARTEX kernel on a multi-core platform in terms of memory, communication between applications and I/O sharing. According to the formulated isolation requirements, a simple partitioning multi-core hardware architecture is proposed using SoC and memory protection units, and the kernel is extended to support spatial isolation between the kernel and applications as well as between applications. A combined design of hardware and software can easily achieve this isolation. Finally, the spatial isolation is evaluated using a statistical sampling method and its performance is tested in terms of task switch, system call and footprint.

  17. Virtual Reality As A Spatial Experience For Architecture Design: A Study of Effectiveness for Architecture Students

    Directory of Open Access Journals (Sweden)

    Sapto Pamungkas Luhur

    2018-01-01

    Full Text Available Studios. This ability is gained through visual design thinking. The spatial experience is honed by three-dimensional thinking across a diversity of media, and is learned through room layout, proportion, and composition. This research used an experimental method, and the primary data were obtained by a Likert-scale questionnaire. The respondents were 50 students of the Architectural Design Studio. The analysis focuses on VR for spatial experience. The result is a descriptive explanation of the effectiveness of Virtual Reality for the spatial experience of architecture students at the Technology University of Yogyakarta.

  18. Spatial Variation of Soil Lead in an Urban Community Garden: Implications for Risk-Based Sampling.

    Science.gov (United States)

    Bugdalski, Lauren; Lemke, Lawrence D; McElmurry, Shawn P

    2014-01-01

    Soil lead pollution is a recalcitrant problem in urban areas resulting from a combination of historical residential, industrial, and transportation practices. The emergence of urban gardening movements in postindustrial cities necessitates accurate assessment of soil lead levels to ensure safe gardening. In this study, we examined small-scale spatial variability of soil lead within a 15 × 30 m urban garden plot established on two adjacent residential lots located in Detroit, Michigan, USA. Eighty samples collected using a variably spaced sampling grid were analyzed for total, fine fraction (less than 250 μm), and bioaccessible soil lead. Measured concentrations varied at sampling scales of 1-10 m and a hot spot exceeding 400 ppm total soil lead was identified in the northwest portion of the site. An interpolated map of total lead was treated as an exhaustive data set, and random sampling was simulated to generate Monte Carlo distributions and evaluate alternative sampling strategies intended to estimate the average soil lead concentration or detect hot spots. Increasing the number of individual samples decreases the probability of overlooking the hot spot (type II error). However, the practice of compositing and averaging samples decreased the probability of overestimating the mean concentration (type I error) at the expense of increasing the chance for type II error. The results reported here suggest a need to reconsider U.S. Environmental Protection Agency sampling objectives and consequent guidelines for reclaimed city lots where soil lead distributions are expected to be nonuniform. © 2013 Society for Risk Analysis.
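
    The trade-off described above (more individual samples reduce the chance of missing the hot spot) can be reproduced with a toy Monte Carlo simulation of random grid sampling. The grid size, hot-spot extent and sample sizes below are illustrative assumptions, not the Detroit data.

    ```python
    import random

    def prob_hit_hotspot(n_cells, hot_cells, n_samples, trials, rng):
        """Estimate the probability that at least one of n_samples randomly
        chosen grid cells falls inside a hot spot occupying hot_cells cells."""
        hits = 0
        for _ in range(trials):
            picked = rng.sample(range(n_cells), n_samples)
            if any(c < hot_cells for c in picked):  # cells 0..hot_cells-1 are "hot"
                hits += 1
        return hits / trials

    rng = random.Random(0)
    # 100-cell plot with a 5-cell hot spot: compare 5 vs 20 discrete samples.
    p5 = prob_hit_hotspot(100, 5, 5, 5000, rng)
    p20 = prob_hit_hotspot(100, 5, 20, 5000, rng)
    ```

    Increasing the sample count sharply raises the hit probability, which is the type II error argument made in the abstract; compositing would average the picked cells instead, diluting the hot-spot signal.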

  19. A cautionary note on substituting spatial subunits for repeated temporal sampling in studies of site occupancy

    Science.gov (United States)

    Kendall, William L.; White, Gary C.

    2009-01-01

    1. Assessing the probability that a given site is occupied by a species of interest is important to resource managers, as well as metapopulation or landscape ecologists. Managers require accurate estimates of the state of the system, in order to make informed decisions. Models that yield estimates of occupancy, while accounting for imperfect detection, have proven useful by removing a potentially important source of bias. To account for detection probability, multiple independent searches per site for the species are required, under the assumption that the species is available for detection during each search of an occupied site. 2. We demonstrate that when multiple samples per site are defined by searching different locations within a site, absence of the species from a subset of these spatial subunits induces estimation bias when locations are exhaustively assessed or sampled without replacement. 3. We further demonstrate that this bias can be removed by choosing sampling locations with replacement, or if the species is highly mobile over a short period of time. 4. Resampling an existing data set does not mitigate bias due to exhaustive assessment of locations or sampling without replacement. 5. Synthesis and applications. Selecting sampling locations for presence/absence surveys with replacement is practical in most cases. Such an adjustment to field methods will prevent one source of bias, and therefore produce more robust statistical inferences about species occupancy. This will in turn permit managers to make resource decisions based on better knowledge of the state of the system.
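
    The bias mechanism in points 2-3 can be made concrete with a small simulation: when the subunits searched are drawn without replacement, the conditional detection probability changes from visit to visit, violating the constant-availability assumption of the occupancy model. The numbers below are illustrative, and perfect detection within an occupied subunit is assumed.

    ```python
    import random

    def p_detect_second_given_first_miss(S, k, with_replacement, trials, rng):
        """P(detect on visit 2 | miss on visit 1) at an occupied site where
        k of S spatial subunits actually contain the species."""
        miss1 = det2 = 0
        for _ in range(trials):
            occupied = set(rng.sample(range(S), k))
            first = rng.randrange(S)
            if first in occupied:
                continue  # condition on a first-visit miss
            miss1 += 1
            if with_replacement:
                second = rng.randrange(S)
            else:
                second = rng.choice([u for u in range(S) if u != first])
            if second in occupied:
                det2 += 1
        return det2 / miss1

    rng = random.Random(7)
    p_wr = p_detect_second_given_first_miss(10, 3, True, 20000, rng)   # stays near k/S
    p_wor = p_detect_second_given_first_miss(10, 3, False, 20000, rng)  # rises to k/(S-1)
    ```

    With replacement the per-visit probability stays constant at k/S, as the occupancy model assumes; without replacement it drifts upward after each miss, which is the source of the estimation bias the paper demonstrates.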

  20. A new formulation of the linear sampling method: spatial resolution and post-processing

    International Nuclear Information System (INIS)

    Piana, M; Aramini, R; Brignone, M; Coyle, J

    2008-01-01

    A new formulation of the linear sampling method is described, which requires the regularized solution of a single functional equation set in a direct sum of L² spaces. This new approach presents the following notable advantages: it is computationally more effective than the traditional implementation, since time-consuming samplings of the Tikhonov minimum problem and of the generalized discrepancy equation are avoided; it allows a quantitative estimate of the spatial resolution achievable by the method; and it facilitates a post-processing procedure for the optimal selection of the scatterer profile by means of edge detection techniques. The formulation is described in a two-dimensional framework and in the case of obstacle scattering, although generalizations to three dimensions and penetrable inhomogeneities are straightforward
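
    The Tikhonov regularization named above can be illustrated in a discretized setting: the regularized solution minimizes a residual norm plus a penalty on the solution norm. The toy ill-conditioned matrix below is an assumption for illustration, not the authors' functional-equation setting.

    ```python
    import numpy as np

    def tikhonov_solve(A, b, alpha):
        """Minimize ||A x - b||^2 + alpha ||x||^2 via the normal equations
        (A^T A + alpha I) x = A^T b."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

    # Ill-conditioned toy problem: column scales span 8 orders of magnitude,
    # so the unregularized least-squares solution is unstable under noise.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20)) @ np.diag(10.0 ** -np.linspace(0, 8, 20))
    x_true = np.ones(20)
    b = A @ x_true + 1e-6 * rng.standard_normal(50)
    x_reg = tikhonov_solve(A, b, alpha=1e-8)
    ```

    The regularization parameter alpha trades data fit against solution stability; in the traditional linear sampling method this solve is repeated per sampling point, which is the cost the new formulation avoids.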

  1. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.
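
    The dependence of precision on density, spatial clustering and sample size can be reproduced with a toy simulation of simple random quadrat sampling. The populations below are fabricated for illustration; the study's adaptive and two-stage designs are not implemented in this sketch.

    ```python
    import random
    import statistics

    def simulate_cv(densities, n_quadrats, n_reps, rng):
        """CV of the estimated mean density under simple random quadrat sampling.

        densities: per-quadrat counts for the whole site (the simulated 'truth').
        """
        estimates = []
        for _ in range(n_reps):
            sample = rng.sample(densities, n_quadrats)
            estimates.append(sum(sample) / n_quadrats)
        return statistics.stdev(estimates) / statistics.mean(estimates)

    rng = random.Random(3)
    # Clustered population: most quadrats empty, a few dense patches.
    clustered = [0] * 900 + [rng.randint(5, 30) for _ in range(100)]
    # Uniform population of similar overall abundance.
    uniform = [rng.randint(1, 3) for _ in range(1000)]
    cv_clustered = simulate_cv(clustered, 50, 500, rng)
    cv_uniform = simulate_cv(uniform, 50, 500, rng)
    ```

    For the same sample size, the clustered population yields a much larger CV, which is why the paper's adaptive designs, which preferentially sample near occupied quadrats, pay off in clustered mussel beds.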

  2. High spatial sampling global mode structure measurements via multichannel reflectometry in NSTX

    Energy Technology Data Exchange (ETDEWEB)

    Crocker, N A; Peebles, W A; Kubota, S; Zhang, J [Department of Physics and Astronomy, University of California-Los Angeles, Los Angeles, CA 90095-7099 (United States); Bell, R E; Fredrickson, E D; Gorelenkov, N N; LeBlanc, B P; Menard, J E; Podesta, M [Princeton Plasma Physics Laboratory, PO Box 451, Princeton, NJ 08543-0451 (United States); Sabbagh, S A [Department of Applied Physics and Applied Mathematics, Columbia University, New York, NY 10027 (United States); Tritz, K [Johns Hopkins University, Baltimore, MD 21218 (United States); Yuh, H [Nova Photonics, Princeton, NJ 08540 (United States)

    2011-10-15

    Global modes, including kinks and tearing modes (f ≲ 50 kHz), toroidicity-induced Alfven eigenmodes (TAE; f ≈ 50-250 kHz) and global and compressional Alfven eigenmodes (GAE and CAE; f ≳ 400 kHz), play critical roles in many aspects of plasma performance. Their investigation on NSTX is aided by an array of fixed-frequency quadrature reflectometers used to determine their radial density perturbation structure. The array has been recently upgraded to 16 channels spanning 30-75 GHz (n_cutoff = (1.1-6.9) × 10^19 m^-3 in O-mode), improving spatial sampling and access to the core of H-mode plasmas. The upgrade has yielded significant new results that advance the understanding of global modes in NSTX. The GAE and CAE structures have been measured for the first time in the core of an NSTX high-power (6 MW) beam-heated H-mode plasma. The CAE structure is strongly core-localized, which has important implications for electron thermal transport. The TAE structure has been measured with greatly improved spatial sampling, and measurements of the TAE phase, the first in NSTX, show strong radial variation near the midplane, indicating radial propagation caused by non-ideal MHD effects. Finally, the tearing mode structure measurements provide unambiguous evidence of coupling to an external kink.

  3. The effects of environmental variability and spatial sampling on the three-dimensional inversion problem.

    Science.gov (United States)

    Bender, Christopher M; Ballard, Megan S; Wilson, Preston S

    2014-06-01

    The overall goal of this work is to quantify the effects of environmental variability and spatial sampling on the accuracy and uncertainty of estimates of the three-dimensional ocean sound-speed field. In this work, ocean sound speed estimates are obtained with acoustic data measured by a sparse autonomous observing system using a perturbative inversion scheme [Rajan, Lynch, and Frisk, J. Acoust. Soc. Am. 82, 998-1017 (1987)]. The vertical and horizontal resolution of the solution depends on the bandwidth of acoustic data and on the quantity of sources and receivers, respectively. Thus, for a simple, range-independent ocean sound speed profile, a single source-receiver pair is sufficient to estimate the water-column sound-speed field. On the other hand, an environment with significant variability may not be fully characterized by a large number of sources and receivers, resulting in uncertainty in the solution. This work explores the interrelated effects of environmental variability and spatial sampling on the accuracy and uncertainty of the inversion solution through a set of case studies. Synthetic data representative of the ocean variability on the New Jersey shelf are used.

  4. Patchiness of Ciliate Communities Sampled at Varying Spatial Scales along the New England Shelf.

    Directory of Open Access Journals (Sweden)

    Jean-David Grattepanche

    Full Text Available Although protists (microbial eukaryotes) provide an important link between bacteria and Metazoa in food webs, we do not yet have a clear understanding of the spatial scales on which protist diversity varies. Here, we use a combination of DNA fingerprinting (denaturing gradient gel electrophoresis, or DGGE) and high-throughput sequencing (HTS) to assess the ciliate community in the class Spirotrichea at varying scales of 1-3 km, sampled in three locations separated by at least 25 km (offshore, midshelf and inshore) along the New England shelf. Analyses of both the abundant community members (DGGE) and the total community (HTS) reveal that: (1) ciliate communities are patchily distributed inshore (i.e. the middle station of a transect is distinct from its two neighboring stations), whereas communities are more homogeneous among samples within the midshelf and offshore stations; (2) a ciliate closely related to Pelagostrobilidium paraepacrum 'blooms' inshore; and (3) environmental factors may differentially impact the distributions of individual ciliates (i.e. OTUs) rather than the community as a whole, as OTUs tend to show distinct biogeographies (e.g. some OTUs are restricted to the offshore locations, some to the surface, etc.). Together, these data show the complexity underlying the spatial distributions of marine protists, and suggest that biogeography may be a property of ciliate species rather than communities.

  5. Seasonal phenology, spatial distribution, and sampling plan for the invasive mealybug Phenacoccus peruvianus (Hemiptera: Pseudococcidae).

    Science.gov (United States)

    Beltrá, A; Garcia-Marí, F; Soto, A

    2013-06-01

    Phenacoccus peruvianus Granara de Willink (Hemiptera: Pseudococcidae) is an invasive mealybug of Neotropical origin. In recent years it has invaded the Mediterranean Basin, causing significant damage to bougainvillea and other ornamental plants. This article examines its phenology, location on the plant and spatial distribution, and presents a sampling plan to determine P. peruvianus population density for the management of this mealybug in southern Europe. Six urban green spaces with bougainvillea plants were surveyed periodically between March 2008 and September 2010 in eastern Spain, sampling bracts, leaves, and twigs. Our results show that P. peruvianus abundance was high in spring and summer, declining to almost undetectable levels in autumn and winter. The mealybugs showed a preference for settling on bracts, and there were no significant migrations between plant organs. P. peruvianus showed a highly aggregated distribution on bracts, leaves, and twigs. We recommend binomial sampling of 200 leaves and an action threshold of 55% infested leaves for integrated pest management purposes in urban landscapes, and enumerative sampling for ornamental nursery management and additional biological studies.
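
    The recommended binomial plan (classify each of 200 sampled leaves as infested or not, act at 55% infested leaves) reduces to a simple decision rule. The sample size and threshold come from the abstract; the function name and interface are assumptions for illustration.

    ```python
    def needs_treatment(infested_leaves, leaves_sampled=200, action_threshold=0.55):
        """Binomial sampling decision for P. peruvianus on urban bougainvillea:
        treat when the proportion of infested leaves reaches the action threshold."""
        proportion = infested_leaves / leaves_sampled
        return proportion >= action_threshold

    # Example: 120 of 200 leaves infested gives 60%, above the 55% threshold.
    ```

    Binomial (presence/absence) counting is far faster in the field than the enumerative counts recommended for nursery management, which is why it suits urban IPM.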

  6. DESIGN FOR CONNECTING SPATIAL DATA INFRASTRUCTURES WITH SENSOR WEB (SENSDI

    Directory of Open Access Journals (Sweden)

    D. Bhattacharya

    2016-06-01

    Full Text Available Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is research to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between Sensor Web and SDI, and carry out case studies such as hazard applications, urban applications, etc. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web, the "Geospatial Web" or "Geosemantic Web", by setting up a one-to-one correspondence between WMS, WFS, WCS and Metadata on the one hand and, on the other, the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS). Hence, in conclusion, it is of importance to geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.

  7. INVESTIGATING THE EFFECT OF EMPLOYING IMMERSIVE VIRTUAL ENVIRONMENT ON ENHANCING SPATIAL PERCEPTION WITHIN DESIGN PROCESS

    Directory of Open Access Journals (Sweden)

    Rawan Taisser Abu Alatta

    2017-07-01

    Full Text Available The recent developments in Information Technology (IT and digital media have introduced new opportunities to design studio and new dimensions to design and architecture. The current research studies how the immersion of Virtual Reality (VR in architectural design studio affects spatial perception through the design process. The aim of this study is to investigate the effect of using such environments on changing the way how to design for human experience: how it will improve students' spatial understanding of Three Dimensions (3D volumes, and how it will enhance their imagination, enrich their creativity and promote their ability to experience their design's sensations. This study hypothesizes that using an immersive virtual environment in design studio will empower students' imaginations and give them the ability to understand and experience their ideas. It will give them the opportunity to check their design's validity with greater 3D exploration, understanding and comprehension of spatial volumes.  Within a framework of an experimental design research, a series of experiments was conducted to evaluate what had been assumed.  The research used teaching, monitoring, explanatory observation and evaluation methods. The results showed that VR can not only enhance spatial perception and improve the design, but also it can affect the design process and make changes in the architectural design way of thinking. It can help designers to incorporate human experience within the design process.

  8. Sampling and energy evaluation challenges in ligand binding protein design.

    Science.gov (United States)

    Dou, Jiayi; Doyle, Lindsey; Jr Greisen, Per; Schena, Alberto; Park, Hahnbeom; Johnsson, Kai; Stoddard, Barry L; Baker, David

    2017-12-01

    The steroid hormone 17α-hydroxyprogesterone (17-OHP) is a biomarker for congenital adrenal hyperplasia and hence there is considerable interest in the development of sensors for this compound. We used computational protein design to generate protein models with binding sites for 17-OHP containing an extended, nonpolar, shape-complementary binding pocket for the four-ring core of the compound, and hydrogen bonding residues at the base of the pocket to interact with carbonyl and hydroxyl groups at the more polar end of the ligand. Eight of 16 designed proteins experimentally tested bind 17-OHP with micromolar affinity. A co-crystal structure of one of the designs revealed that 17-OHP is rotated 180° around a pseudo-two-fold axis in the compound and displays multiple binding modes within the pocket, while still interacting with all of the designed residues in the engineered site. Subsequent rounds of mutagenesis and binding selection improved the ligand affinity to the nanomolar range, while appearing to constrain the ligand to a single bound conformation that maintains the same "flipped" orientation relative to the original design. We trace the discrepancy in the design calculations to two sources: first, a failure to model subtle backbone changes which alter the distribution of sidechain rotameric states and second, an underestimation of the energetic cost of desolvating the carbonyl and hydroxyl groups of the ligand. The difference between design model and crystal structure thus arises from both sampling limitations and energy function inaccuracies that are exacerbated by the near two-fold symmetry of the molecule. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  9. PROPOSAL OF SPATIAL OPTIMIZATION OF PRODUCTION PROCESS IN PROCESS DESIGNER

    Directory of Open Access Journals (Sweden)

    Peter Malega

    2015-03-01

    Full Text Available This contribution is focused on optimizing the use of space in the production process using the Process Designer software. The aim of this contribution is to suggest possible improvements to the existing layout of the selected production process. The production process was analysed in terms of inputs, outputs and course of actions. Nowadays there are many software solutions aimed at optimizing the use of space. One of these software products is Process Designer, which belongs to the Tecnomatix product line. This software is primarily aimed at production planning. With Process Designer it is possible to design the layout of production and subsequently to analyse the production, or to change it according to the current needs of the company.

  10. 3D-CAD Effects on Creative Design Performance of Different Spatial Abilities Students

    Science.gov (United States)

    Chang, Y.

    2014-01-01

    Students' creativity is an important focus globally and is interrelated with students' spatial abilities. Additionally, three-dimensional computer-assisted drawing (3D-CAD) overcomes barriers to spatial expression during the creative design process. Does 3D-CAD affect students' creative abilities? The purpose of this study was to explore the…

  11. Visual-Spatial Art and Design Literacy as a Prelude to Aesthetic Growth

    Science.gov (United States)

    Lerner, Fern

    2018-01-01

    In bridging ideas from the forum of visual-spatial learning with those of art and design learning, inspiration is taken from Piaget who explained that the evolution of spatial cognition occurs through perception, as well as through thought and imagination. Insights are embraced from interdisciplinary educational theorists, intertwining and…

  12. Spatial and Social Comparison of the Traditional Neighbourhood and the Modern Gated Community: Eskisehir Sample

    Science.gov (United States)

    Koca, Güler; Kayılıoğlu, Begüm

    2017-10-01

    People’s expectations of the city have changed with the transformation of urban life. Urban space is not merely the place where structures are built; it also consists of a combination of public, semi-public, and private spaces. As social and cultural phenomena, social events occur and people communicate with each other in these spaces. Therefore, streets and neighbourhoods composed of houses are not only physical spaces; they also have important social and cultural dimensions. Modern life has brought a plethora of changes that have affected cities, and in today's rapid transformation process urban space forms are also designed differently. Historically, the space organization of Turkish cities was based on the semi-public life of the street, but in recent years it has been transformed into mass-housing and housing-estate-style living. This transformation has been expressed differently in urban life not only physically, but also socially and culturally. The street, regarded as a public space, was once a place where people communicated and social events happened; today, however, streets are associated with security problems and evoke the image of a mere corridor bordered by buildings. Spatial separation has emerged as the middle and upper classes isolate themselves from the street and move towards gated communities, mainly for security reasons. This social and spatial separation has begun to lead to various problems in cities. Eskisehir is an important Anatolian city located between Ankara, the capital of Turkey, and Istanbul. This research was conducted in two research sites in Eskisehir: one is a gated community where middle- and upper-income groups reside, and the other is a residential neighbourhood where middle-income groups live. These groups were studied through a survey, and the spatial preferences of the residents in the two areas and their relation with the neighbourhood are examined.

  13. Development of Spatial Scaling Technique of Forest Health Sample Point Information

    Science.gov (United States)

    Lee, J. H.; Ryu, J. E.; Chung, H. I.; Choi, Y. Y.; Jeon, S. W.; Kim, S. H.

    2018-04-01

    Forests provide many goods, ecosystem services, and resources to humans, such as recreation, air purification, and water protection functions. In recent years, the factors that threaten forest health have increased, such as global warming due to climate change and environmental pollution, and together with the growing interest in forests, efforts are being made in various countries toward forest management. However, the existing forest ecosystem survey method monitors discrete sampling points, and it is difficult to use the results for forest management because Korea surveys only a small part of its forest area, which occupies 63.7 % of the country (Ministry of Land, Infrastructure and Transport Korea, 2016). Therefore, in order to manage large forests, a method of interpolating and spatializing the data is needed. In this study, the 1st Korea Forest Health Management biodiversity Shannon's index data (National Institute of Forest Science, 2015) were used for spatial interpolation. Two widely used interpolation methods, the Kriging method and the IDW (Inverse Distance Weighted) method, were used to interpolate the biodiversity index. The vegetation indices SAVI, NDVI, LAI, and SR were used. As a result, the Kriging method was the most accurate.
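
    As a rough illustration of one of the two interpolation methods compared above, the sketch below implements plain inverse-distance weighting for point data such as a Shannon diversity index. The coordinates, index values, and power parameter are hypothetical, not taken from the Korean survey.

    ```python
    import math

    def idw_interpolate(points, values, query, power=2.0):
        """Inverse Distance Weighted estimate at a query location.

        points : list of (x, y) sample-point coordinates
        values : index values observed at those points (e.g. Shannon diversity)
        query  : (x, y) location to estimate
        power  : distance-decay exponent (2 is a common default)
        """
        num = den = 0.0
        for (x, y), v in zip(points, values):
            d = math.hypot(query[0] - x, query[1] - y)
            if d == 0.0:            # exact hit: return the observed value
                return v
            w = 1.0 / d ** power    # closer points get larger weights
            num += w * v
            den += w
        return num / den

    # Hypothetical sample points with a diversity index observed at each
    pts = [(0, 0), (10, 0), (0, 10), (10, 10)]
    vals = [1.2, 1.8, 1.4, 2.0]
    print(idw_interpolate(pts, vals, (5, 5)))  # equidistant, so the plain mean
    ```

    Kriging differs in that the weights come from a fitted semivariogram rather than a fixed distance power, which is why it can outperform IDW when the spatial correlation structure is well estimated.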

  14. DEVELOPMENT OF SPATIAL SCALING TECHNIQUE OF FOREST HEALTH SAMPLE POINT INFORMATION

    Directory of Open Access Journals (Sweden)

    J. H. Lee

    2018-04-01

    Full Text Available Forests provide many goods, ecosystem services, and resources to humans, such as recreation, air purification, and water protection functions. In recent years, the factors that threaten forest health have increased, such as global warming due to climate change and environmental pollution, and together with the growing interest in forests, efforts are being made in various countries toward forest management. However, the existing forest ecosystem survey method monitors discrete sampling points, and it is difficult to use the results for forest management because Korea surveys only a small part of its forest area, which occupies 63.7 % of the country (Ministry of Land, Infrastructure and Transport Korea, 2016). Therefore, in order to manage large forests, a method of interpolating and spatializing the data is needed. In this study, the 1st Korea Forest Health Management biodiversity Shannon's index data (National Institute of Forest Science, 2015) were used for spatial interpolation. Two widely used interpolation methods, the Kriging method and the IDW (Inverse Distance Weighted) method, were used to interpolate the biodiversity index. The vegetation indices SAVI, NDVI, LAI, and SR were used. As a result, the Kriging method was the most accurate.

  15. A novel sampling design to explore gene-longevity associations

    DEFF Research Database (Denmark)

    De Rango, Francesco; Dato, Serena; Bellizzi, Dina

    2008-01-01

    To investigate the genetic contribution to familial similarity in longevity, we set up a novel experimental design where cousin-pairs born from siblings who were concordant or discordant for the longevity trait were analyzed. To check this design, two chromosomal regions already known to encompass...... from concordant and discordant siblings. In addition, we analyzed haplotype transmission from centenarians to offspring, and a statistically significant Transmission Ratio Distortion (TRD) was observed for both chromosomal regions in the discordant families (P=0.007 for 6p21.3 and P=0.015 for 11p15.5). In concordant families, a marginally significant TRD was observed at 6p21.3 only (P=0.06). Although no significant difference emerged between the two groups of cousin-pairs, our study gave new insights on the hindrances to recruiting a suitable sample to obtain significant IBD data on longevity...
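
    The TRD analysis described above asks whether one parental haplotype is transmitted more often than the Mendelian expectation of 50%. A minimal sketch of such a test, written here as a two-sided exact binomial test with hypothetical transmission counts (the study's actual statistical procedure is not specified in the abstract and may differ):

    ```python
    from math import comb

    def binomial_trd_test(transmitted, total, p=0.5):
        """Two-sided exact binomial test for Transmission Ratio Distortion.

        Under Mendelian expectation each parental haplotype is transmitted
        with probability p = 0.5; a small P-value signals distortion.
        """
        def pmf(k):
            return comb(total, k) * p**k * (1 - p)**(total - k)
        observed = pmf(transmitted)
        # sum the probabilities of all outcomes at least as extreme as observed
        return sum(pmf(k) for k in range(total + 1) if pmf(k) <= observed + 1e-12)

    # Hypothetical counts: 38 of 50 informative meioses transmit one haplotype
    print(binomial_trd_test(38, 50))
    ```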

  16. Evaluation of design flood estimates with respect to sample size

    Science.gov (United States)

    Kobierska, Florian; Engeland, Kolbjorn

    2016-04-01

    Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimates give recommendations on which data, probability distribution, and method to use depending on the length of the local record. If fewer than 30 years of local data are available, an index flood approach is recommended where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a 2-parameter distribution is recommended, and for more than 50 years of data, a 3-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation, and recommended distributions include the log Pearson III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary and linear moments, maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Does the answer to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing the stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not excessively depend on the data sample. The reliability indices describe to which degree design flood predictions can be trusted.
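
    As a simplified illustration of estimating a T-year design flood from a short local record, the sketch below fits a Gumbel (EV1) distribution by the method of moments. The discharge series is hypothetical, and the guidelines discussed above recommend other distributions and estimators depending on record length.

    ```python
    import math

    def gumbel_design_flood(annual_maxima, return_period):
        """T-year design flood from a Gumbel (EV1) fit.

        Method-of-moments parameter estimates:
          beta = s * sqrt(6) / pi,  mu = mean - 0.5772 * beta
        and the T-year quantile is  mu - beta * ln(-ln(1 - 1/T)).
        """
        n = len(annual_maxima)
        mean = sum(annual_maxima) / n
        s = math.sqrt(sum((x - mean) ** 2 for x in annual_maxima) / (n - 1))
        beta = s * math.sqrt(6) / math.pi      # scale parameter
        mu = mean - 0.5772 * beta              # location (0.5772 = Euler's constant)
        return mu - beta * math.log(-math.log(1 - 1 / return_period))

    # Hypothetical annual maximum discharges (m^3/s) from a short local record
    flows = [410, 520, 390, 610, 480, 550, 430, 700, 460, 590]
    print(round(gumbel_design_flood(flows, 100), 1))  # 100-year design flood
    ```

    With only 10 years of data such an estimate is unstable, which is exactly why the guidelines fall back on regional growth curves for short records.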

  17. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang; Hansen, Charles

    2013-01-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.
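
    A one-dimensional sketch of the KDE step described above: the density estimated from the user-probed sample values can serve as a transfer-function weight, so that voxels whose data values resemble the probed feature receive high opacity. The sample values and bandwidth here are hypothetical, and the paper's HDTFs operate in higher dimensions.

    ```python
    import math

    def kde_weight(x, samples, bandwidth):
        """Gaussian kernel density estimate at x from user-selected samples.

        Values close to the probed samples get a high density (rendered
        opaque); values far away get near-zero density (rendered transparent).
        """
        norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
        return norm * sum(
            math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples
        )

    # Hypothetical scalar values queried by probing a feature on a slice
    probed = [0.42, 0.45, 0.44, 0.47, 0.43]
    inside = kde_weight(0.44, probed, bandwidth=0.02)   # near the samples
    outside = kde_weight(0.90, probed, bandwidth=0.02)  # far from the samples
    print(inside > outside)  # opacity concentrates on the probed feature
    ```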

  18. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang

    2013-02-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.

  19. Analysis of Regularly and Irregularly Sampled Spatial, Multivariate, and Multi-temporal Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    1994-01-01

    This thesis describes different methods that are useful in the analysis of multivariate data. Some methods focus on spatial data (sampled regularly or irregularly), others focus on multitemporal data or data from multiple sources. The thesis covers selected and not all aspects of relevant data......-variograms are described. As a new way of setting up a well-balanced kriging support the Delaunay triangulation is suggested. Two case studies show the usefulness of 2-D semivariograms of geochemical data from areas in central Spain (with a geologist's comment) and South Greenland, and kriging/cokriging of an undersampled...... are considered as repetitions. Three case studies show the strength of the methods; one uses SPOT High Resolution Visible (HRV) multispectral (XS) data covering economically important pineapple and coffee plantations near Thika, Kiambu District, Kenya, the other two use Landsat Thematic Mapper (TM) data covering...
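
    The semivariograms mentioned above can be computed with the classical Matheron estimator, gamma(h) = (1 / 2N(h)) * sum of squared value differences over point pairs separated by roughly lag h. A minimal sketch with hypothetical sample coordinates and values (the thesis works with full 2-D semivariograms; this reduces to isotropic lag bins):

    ```python
    import math

    def empirical_semivariogram(coords, values, lag_edges):
        """Classical (Matheron) semivariogram estimator with lag binning."""
        sums = [0.0] * (len(lag_edges) - 1)
        counts = [0] * (len(lag_edges) - 1)
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                d = math.dist(coords[i], coords[j])
                for b in range(len(lag_edges) - 1):
                    if lag_edges[b] <= d < lag_edges[b + 1]:
                        sums[b] += (values[i] - values[j]) ** 2
                        counts[b] += 1
                        break
        # gamma(h) per bin; None where a bin received no pairs
        return [s / (2 * c) if c else None for s, c in zip(sums, counts)]

    # Hypothetical geochemical samples on a small irregular grid
    pts = [(0, 0), (1, 0), (0, 1), (2, 2), (3, 1)]
    z = [1.0, 1.1, 0.9, 1.6, 1.8]
    print(empirical_semivariogram(pts, z, [0, 1.5, 3.0, 4.5]))
    ```

    A rising gamma with lag, as in this toy example, is the signature of spatial correlation that kriging then exploits through its weights.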

  20. Spatial heavy metals Zn and Cr distribution in soil samples taken from Tatra Mountains

    International Nuclear Information System (INIS)

    Stobinski, M.; Misiak, R.; Kubica, B.

    2008-03-01

    The basic issue of presented report is showing the spatial heavy metals (Zn and Cr) distribution in soil samples taken from High Mts area. The expertise was done using two analytical techniques: AAS (atomic absorption spectroscopy) and micro-PIXIE (proton induced X-ray emission).Given heavy metals concentration were originated either from soil surface (10 cm depth) or from the whole soil profile. Our evaluation indicates that the Zn and Cr levels measured for mountains region were comparable to the data presented by other authors. Furthermore, the amount of heavy metals is strongly correlated with its natural concentration in parental rock.We also observed that zinc was prone to accumulate in surface, rich in organic matter, soil levels. (author)

  1. Asymptotic analysis of the role of spatial sampling for covariance parameter estimation of Gaussian processes

    International Nuclear Information System (INIS)

    Bachoc, Francois

    2014-01-01

    Covariance parameter estimation of Gaussian processes is analyzed in an asymptotic framework. The spatial sampling is a randomly perturbed regular grid and its deviation from the perfect regular grid is controlled by a single scalar regularity parameter. Consistency and asymptotic normality are proved for the Maximum Likelihood and Cross Validation estimators of the covariance parameters. The asymptotic covariance matrices of the covariance parameter estimators are deterministic functions of the regularity parameter. By means of an exhaustive study of the asymptotic covariance matrices, it is shown that the estimation is improved when the regular grid is strongly perturbed. Hence, an asymptotic confirmation is given to the commonly admitted fact that using groups of observation points with small spacing is beneficial to covariance function estimation. Finally, the prediction error, using a consistent estimator of the covariance parameters, is analyzed in detail. (authors)
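
    The randomly perturbed regular grid studied above can be generated from a single scalar parameter. The sketch below is one plausible parameterization (uniform per-coordinate jitter of width epsilon around each grid node), chosen for illustration; the paper's exact construction may differ.

    ```python
    import random

    def perturbed_grid(n, epsilon, seed=0):
        """Randomly perturbed n-by-n regular grid with unit spacing.

        Each node (i, j) is shifted by a uniform offset in
        [-epsilon/2, epsilon/2] per coordinate: epsilon = 0 gives the
        perfect regular grid, larger epsilon a more irregular design.
        """
        rng = random.Random(seed)
        pts = []
        for i in range(n):
            for j in range(n):
                pts.append((i + rng.uniform(-epsilon / 2, epsilon / 2),
                            j + rng.uniform(-epsilon / 2, epsilon / 2)))
        return pts

    regular = perturbed_grid(5, 0.0)    # perfect grid
    irregular = perturbed_grid(5, 0.8)  # strongly perturbed grid
    print(irregular[:3])
    ```

    The paper's finding is that designs like `irregular`, which mix small and large point spacings, improve covariance parameter estimation relative to `regular`.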

  2. The Field Trip as Part of Spatial (Architectural) Design Art Classes

    Science.gov (United States)

    Batic, Janja

    2011-01-01

    Spatial (architectural) design is one of five fields introduced to pupils as part of art education. In planning architectural design tasks, one should take into consideration the particularities of the architectural design process and enable pupils to experience space and relationships within space through their own movement. Furthermore, pupils…

  3. The backbone of a City Information Model (CIM) : Implementing a spatial data model for urban design

    NARCIS (Netherlands)

    Gil, J.A.; Almeida, J.; Duarte, J.P.

    2011-01-01

    We have been witnessing an increased interest in a more holistic approach to urban design practice and education. In this paper we present a spatial data model for urban design that proposes the combination of urban environment feature classes with design process feature classes. This data model is

  4. The SDSS-IV MaNGA Sample: Design, Optimization, and Usage Considerations

    Science.gov (United States)

    Wake, David A.; Bundy, Kevin; Diamond-Stanic, Aleksandar M.; Yan, Renbin; Blanton, Michael R.; Bershady, Matthew A.; Sánchez-Gallego, José R.; Drory, Niv; Jones, Amy; Kauffmann, Guinevere; Law, David R.; Li, Cheng; MacDonald, Nicholas; Masters, Karen; Thomas, Daniel; Tinker, Jeremy; Weijmans, Anne-Marie; Brownstein, Joel R.

    2017-09-01

    We describe the sample design for the SDSS-IV MaNGA survey and present the final properties of the main samples along with important considerations for using these samples for science. Our target selection criteria were developed while simultaneously optimizing the size distribution of the MaNGA integral field units (IFUs), the IFU allocation strategy, and the target density to produce a survey defined in terms of maximizing signal-to-noise ratio, spatial resolution, and sample size. Our selection strategy makes use of redshift limits that only depend on I-band absolute magnitude (M_I), or, for a small subset of our sample, M_I and color (NUV − I). Such a strategy ensures that all galaxies span the same range in angular size irrespective of luminosity and are therefore covered evenly by the adopted range of IFU sizes. We define three samples: the Primary and Secondary samples are selected to have a flat number density with respect to M_I and are targeted to have spectroscopic coverage to 1.5 and 2.5 effective radii (R_e), respectively. The Color-Enhanced supplement increases the number of galaxies in the low-density regions of color-magnitude space by extending the redshift limits of the Primary sample in the appropriate color bins. The samples cover the stellar mass range 5 × 10^8 ≤ M* ≤ 3 × 10^11 M_⊙ h^-2 and are sampled at median physical resolutions of 1.37 and 2.5 kpc for the Primary and Secondary samples, respectively. We provide weights that will statistically correct for our luminosity- and color-dependent selection function and IFU allocation strategy, thus correcting the observed sample to a volume-limited sample.

  5. The Transformative Potential of Game Spatiality in Service Design

    NARCIS (Netherlands)

    van Amstel, Frederick; Garde, Julia Anne

    2016-01-01

    Background: Services have tangible and intangible aspects. Services are organized as a system of conceptual ideas (space of possibilities) and are enacted through social and physical arrangements (possibilities of space). Games are employed in service design to expand the space of possibilities with

  6. Visual Sample Plan (VSP) Software: Designs and Data Analyses for Sampling Contaminated Buildings

    International Nuclear Information System (INIS)

    Pulsipher, Brent A.; Wilson, John E.; Gilbert, Richard O.; Nuffer, Lisa L.; Hassig, Nancy L.

    2005-01-01

    A new module of the Visual Sample Plan (VSP) software has been developed to provide sampling designs and data analyses for potentially contaminated buildings. An important application is assessing levels of contamination in buildings after a terrorist attack. This new module, funded by DHS through the Combating Terrorism Technology Support Office, Technical Support Working Group, was developed to provide a tailored, user-friendly and visually-orientated buildings module within the existing VSP software toolkit, the latest version of which can be downloaded from http://dqo.pnl.gov/vsp. In case of, or when planning against, a chemical, biological, or radionuclide release within a building, the VSP module can be used to quickly and easily develop and visualize technically defensible sampling schemes for walls, floors, ceilings, and other surfaces to statistically determine if contamination is present, its magnitude and extent throughout the building, and whether decontamination has been effective. This paper demonstrates the features of this new VSP buildings module, which include: the ability to import building floor plans or to easily draw, manipulate, and view rooms in several ways; being able to insert doors, windows and annotations into a room; and 3-D graphic room views with surfaces labeled and floor plans that show building zones that have separate air handling units. The paper also discusses the statistical design and data analysis options available in the buildings module. Design objectives supported include comparing an average to a threshold when the data distribution is normal or unknown, and comparing measurements to a threshold to detect hotspots or to ensure most of the area is uncontaminated when the data distribution is normal or unknown.
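
    For the design objective of comparing a surface average to a threshold under a normal data assumption, a classical sample-size formula can be sketched as follows. VSP's exact design equations are not reproduced here and may differ; the planning values are hypothetical.

    ```python
    from math import ceil
    from statistics import NormalDist

    def samples_for_mean_vs_threshold(sigma, delta, alpha=0.05, beta=0.10):
        """Samples needed to compare a surface-wide mean to a threshold.

        Classical normal-theory formula for a one-sided test:
          n = ((z_{1-alpha} + z_{1-beta}) * sigma / delta)^2
        sigma : expected standard deviation of the measurements
        delta : smallest mean-vs-threshold difference to detect
        alpha : false-positive rate, beta : false-negative rate
        """
        z = NormalDist().inv_cdf
        n = ((z(1 - alpha) + z(1 - beta)) * sigma / delta) ** 2
        return ceil(n)

    # Hypothetical planning values for swipe samples on one wall surface
    print(samples_for_mean_vs_threshold(sigma=2.0, delta=1.0))  # -> 35
    ```

    Doubling the measurement variability roughly quadruples the required number of samples, which is why tools like VSP let planners trade off detection limits against sampling effort interactively.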

  7. Design and realization of tourism spatial decision support system based on GIS

    Science.gov (United States)

    Ma, Zhangbao; Qi, Qingwen; Xu, Li

    2008-10-01

    In this paper, the existing problems of current tourism management information systems are analyzed. GIS, tourism, and spatial decision support systems are introduced, and the application of geographic information system technology and spatial decision support systems to tourism management, together with the establishment of a tourism spatial decision support system based on GIS, is proposed. The overall system structure, the hardware and software environment, the database design, and the module design of the system are described. Finally, the realization of the system's core functions is elaborated.

  8. Improving neutron multiplicity counting for the spatial dependence of multiplication: Results for spherical plutonium samples

    Energy Technology Data Exchange (ETDEWEB)

    Göttsche, Malte, E-mail: malte.goettsche@physik.uni-hamburg.de; Kirchner, Gerald

    2015-10-21

    The fissile mass deduced from a neutron multiplicity counting measurement of high mass dense items is underestimated if the spatial dependence of the multiplication is not taken into account. It is shown that an appropriate physics-based correction successfully removes the bias. It depends on four correction coefficients which can only be exactly determined if the sample geometry and composition are known. In some cases, for example in warhead authentication, available information on the sample will be very limited. MCNPX-PoliMi simulations have been performed to obtain the correction coefficients for a range of spherical plutonium metal geometries, with and without polyethylene reflection placed around the spheres. For hollow spheres, the analysis shows that the correction coefficients can be approximated with high accuracy as a function of the sphere's thickness depending only slightly on the radius. If the thickness remains unknown, less accurate estimates of the correction coefficients can be obtained from the neutron multiplication. The influence of isotopic composition is limited. The correction coefficients become somewhat smaller when reflection is present.

  9. Effects of Spatial Experiences & Cognitive Styles in the Solution Process of Space-Based Design Problems in the First Year of Architectural Design Education

    Science.gov (United States)

    Erkan Yazici, Yasemin

    2013-01-01

    There are many factors that influence designers in the architectural design process. Cognitive style, which varies according to the cognitive structure of persons, and spatial experience, which is created with spatial data acquired during life are two of these factors. Designers usually refer to their spatial experiences in order to find solutions…

  10. A framework for inference about carnivore density from unstructured spatial sampling of scat using detector dogs

    Science.gov (United States)

    Thompson, Craig M.; Royle, J. Andrew; Garner, James D.

    2012-01-01

    Wildlife management often hinges upon an accurate assessment of population density. Although undeniably useful, many of the traditional approaches to density estimation such as visual counts, livetrapping, or mark–recapture suffer from a suite of methodological and analytical weaknesses. Rare, secretive, or highly mobile species exacerbate these problems through the reality of small sample sizes and movement on and off study sites. In response to these difficulties, there is growing interest in the use of non-invasive survey techniques, which provide the opportunity to collect larger samples with minimal increases in effort, as well as the application of analytical frameworks that are not reliant on large sample size arguments. One promising survey technique, the use of scat detecting dogs, offers a greatly enhanced probability of detection while at the same time generating new difficulties with respect to non-standard survey routes, variable search intensity, and the lack of a fixed survey point for characterizing non-detection. In order to account for these issues, we modified an existing spatially explicit, capture–recapture model for camera trap data to account for variable search intensity and the lack of fixed, georeferenced trap locations. We applied this modified model to a fisher (Martes pennanti) dataset from the Sierra National Forest, California, and compared the results (12.3 fishers/100 km2) to more traditional density estimates. We then evaluated model performance using simulations at 3 levels of population density. Simulation results indicated that estimates based on the posterior mode were relatively unbiased. We believe that this approach provides a flexible analytical framework for reconciling the inconsistencies between detector dog survey data and density estimation procedures.
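
    Spatially explicit capture-recapture models like the one adapted above typically relate detection probability to the distance between a detector (or, here, a searched location) and an animal's activity centre, commonly via a half-normal detection function. The sketch below shows that function with hypothetical parameter values; the paper's modified model additionally accounts for variable search intensity.

    ```python
    import math

    def detection_prob(distance, g0, sigma):
        """Half-normal detection function used in spatially explicit
        capture-recapture.

        g0    : detection probability at distance zero
        sigma : spatial scale of movement around the activity centre
        """
        return g0 * math.exp(-distance**2 / (2 * sigma**2))

    # Hypothetical values: detection falls off with distance from the centre
    for d in (0, 500, 1000, 2000):  # metres
        print(d, round(detection_prob(d, g0=0.05, sigma=800), 4))
    ```

    Fitting g0 and sigma jointly with density is what lets SECR separate "few animals" from "animals that are hard to detect", the confound that plagues unstructured scat surveys.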

  11. Mars Rover Sample Return aerocapture configuration design and packaging constraints

    Science.gov (United States)

    Lawson, Shelby J.

    1989-01-01

    This paper discusses the aerodynamics requirements and the volume and mass constraints that lead to a biconic aeroshell vehicle design that protects the Mars Rover Sample Return (MRSR) mission elements from launch to Mars landing. The aerodynamic requirements for Mars aerocapture and entry and the packaging constraints for the MRSR elements result in a symmetric biconic aeroshell that develops an L/D of 1.0 at a 27.0 deg angle of attack. A significant problem in the study is obtaining a center of gravity (cg) that provides adequate aerodynamic stability and performance within the mission-imposed constraints. Packaging methods that relieve the cg problems include forward placement of aeroshell propellant tanks and incorporating aeroshell structure as lander structure. The MRSR missions developed during the pre-phase A study are discussed with dimensional and mass data included. Further study is needed for some missions to minimize MRSR element volume so that launch mass constraints can be met.

  12. Constant Flux of Spatial Niche Partitioning through High-Resolution Sampling of Magnetotactic Bacteria.

    Science.gov (United States)

    He, Kuang; Gilder, Stuart A; Orsi, William D; Zhao, Xiangyu; Petersen, Nikolai

    2017-10-15

    Magnetotactic bacteria (MTB) swim along magnetic field lines in water. They are found in aquatic habitats throughout the world, yet knowledge of their spatial and temporal distribution remains limited. To help remedy this, we took MTB-bearing sediment from a natural pond, mixed the thoroughly homogenized sediment into two replicate aquaria, and then counted three dominant MTB morphotypes (coccus, spirillum, and rod-shaped MTB cells) at a high spatiotemporal sampling resolution: 36 discrete points in replicate aquaria were sampled every ∼30 days over 198 days. Population centers of the MTB coccus and MTB spirillum morphotypes moved in continual flux, yet they consistently inhabited separate locations, displaying significant anticorrelation. Rod-shaped MTB were initially concentrated toward the northern end of the aquaria, but at the end of the experiment, they were most densely populated toward the south. The finding that the total number of MTB cells increased over time during the experiment argues that population reorganization arose from relative changes in cell division and death and not from migration. The maximum net growth rates were 10, 3, and 1 doublings day-1 and average net growth rates were 0.24, 0.11, and 0.02 doublings day-1 for MTB cocci, MTB spirilla, and rod-shaped MTB, respectively; minimum growth rates for all three morphotypes were -0.03 doublings day-1. Our results suggest that MTB cocci and MTB spirilla occupy distinctly different niches: their horizontal positioning in sediment is anticorrelated and under constant flux. IMPORTANCE Little is known about the horizontal distribution of magnetotactic bacteria in sediment or how the distribution changes over time. We therefore measured three dominant magnetotactic bacterium morphotypes at 36 places in two replicate aquaria each month for 7 months. We found that the spatial positioning of population centers changed over time and that the two most abundant morphotypes (MTB cocci and MTB spirilla
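
    The growth rates reported above are expressed in doublings per day, which follow directly from cell counts at two sampling dates; the counts below are hypothetical.

    ```python
    import math

    def net_growth_rate(n_start, n_end, days):
        """Net growth rate in doublings per day:
        rate = log2(N_end / N_start) / days (negative when cells decline)."""
        return math.log2(n_end / n_start) / days

    # Hypothetical cell counts at one sampling point, ~30 days apart:
    # 3 doublings over 30 days gives 0.1 doublings per day
    print(net_growth_rate(1000, 8000, 30))
    ```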

  13. SAMPLING ADAPTIVE STRATEGY AND SPATIAL ORGANISATION ESTIMATION OF SOIL ANIMAL COMMUNITIES AT VARIOUS HIERARCHICAL LEVELS OF URBANISED TERRITORIES

    Directory of Open Access Journals (Sweden)

    Baljuk J.A.

    2014-12-01

    Full Text Available This work presents an algorithm for an adaptive strategy of optimal spatial sampling for studying the spatial organisation of soil animal communities under urbanisation. The principal components obtained from the analysis of field data on soil penetration resistance, soil electrical conductivity, and forest stand density, collected on a quasi-regular grid, were used as operating variables. The locations of the experimental polygons were determined by means of the program ESAP. Sampling was carried out on a regular grid within the experimental polygons. The biogeocoenological estimation of the experimental polygons was made on the basis of A. L. Belgard's ecomorphic analysis. The spatial configuration of biogeocoenosis types was established from remote sensing data and the analysis of a digital elevation model. The suggested algorithm reveals the spatial organisation of soil animal communities at the level of the sampling point, the biogeocoenosis, and the landscape.

  14. Spatial design and control of graphene flake motion

    Science.gov (United States)

    Ghorbanfekr-Kalashami, H.; Peeters, F. M.; Novoselov, K. S.; Neek-Amal, M.

    2017-08-01

    The force between a sharp scanning probe tip and a surface can drive a graphene flake over crystalline substrates. The recent design of particular patterns of structural defects on a graphene surface allows us to propose an alternative approach for controlling the motion of a graphene flake over a graphene substrate. The thermally induced motion of a graphene flake is controlled by engineering topological defects in the substrate. Such defect regions lead to an inhomogeneous energy landscape and are energetically unfavorable for the motion of the flake, and will invert and scatter graphene flakes when they are moving toward the defect line. Engineering the distribution of these energy barriers results in a controllable trajectory for the thermal motion of the flake without using any external force. We predict superlubricity of the graphene flake for motion along and between particular defect lines. This Rapid Communication provides insights into the frictional forces of interfaces and opens a route to the engineering of the stochastic motion of a graphene flake over any crystalline substrate.

  15. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: the biological sample survey of commercial landings (BSCL), the experimental fishing sample survey (EFSS), and the commercial landings and effort sample survey (CLES).
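
    One classical tool for optimizing such multi-stratum surveys is Neyman allocation, which distributes a fixed total sample size across strata in proportion to stratum size times stratum variability. Whether the paper uses exactly this scheme is not stated in the abstract, and the strata below are hypothetical.

    ```python
    def neyman_allocation(strata, total_n):
        """Neyman (optimum) allocation of a fixed total sample size.

        n_h is proportional to N_h * S_h, which minimises the variance of
        the stratified mean estimator for a given total_n.

        strata : list of (N_h, S_h) = (stratum size, stratum std. deviation)
        """
        weights = [N * S for N, S in strata]
        total = sum(weights)
        return [round(total_n * w / total) for w in weights]

    # Hypothetical landing-port strata: (number of vessels, catch variability)
    ports = [(200, 5.0), (500, 2.0), (300, 8.0)]
    print(neyman_allocation(ports, 120))  # -> [27, 27, 65]
    ```

    Note that rounding can leave the allocated total one or two samples off `total_n`; in practice the remainder is assigned to the largest strata.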

  16. Estimating black bear density in New Mexico using noninvasive genetic sampling coupled with spatially explicit capture-recapture methods

    Science.gov (United States)

    Gould, Matthew J.; Cain, James W.; Roemer, Gary W.; Gould, William R.

    2016-01-01

    samples. We identified 725 (367 M, 358 F) individuals; the sex ratio for each study area was approximately equal. Our density estimates varied within and among mountain ranges, with an estimated density of 21.86 bears/100 km2 (95% CI: 17.83 – 26.80) for the NSC, 19.74 bears/100 km2 (95% CI: 13.77 – 28.30) in the SSC, 25.75 bears/100 km2 (95% CI: 13.22 – 50.14) in the Sandias, 21.86 bears/100 km2 (95% CI: 17.83 – 26.80) in the NSacs, and 16.55 bears/100 km2 (95% CI: 11.64 – 23.53) in the SSacs. Overall detection probability for hair traps and bear rubs combined was low across all study areas, ranging from 0.00001 to 0.02. We speculate that detection probabilities were affected by the failure of some hair samples to produce a complete genotype due to UV degradation of DNA, and by our inability to set and check some sampling devices due to wildfires in the SSC. Ultraviolet radiation levels are particularly high in New Mexico compared to other states where NGS methods have been used because New Mexico receives substantial amounts of sunshine, is relatively high in elevation (1,200 m – 4,000 m), and is at a lower latitude. Despite these sampling difficulties, we were able to produce density estimates for New Mexico black bear populations with levels of precision comparable to black bear density estimates made elsewhere in the U.S. Our ability to generate reliable black bear density estimates for 3 New Mexico mountain ranges is attributable to our use of a statistically robust study design and analytical method. There are multiple factors that need to be considered when developing future SECR-based density estimation projects. First, the spatial extent of the population of interest and the smallest average home range size must be determined; these will dictate the size of the trapping array and the spacing necessary between hair traps. The number of technicians needed and access to the study areas will also influence the configuration of the trapping array. We believe shorter

  17. Filter Bank Regularized Common Spatial Pattern Ensemble for Small Sample Motor Imagery Classification.

    Science.gov (United States)

    Park, Sang-Hoon; Lee, David; Lee, Sang-Goog

    2018-02-01

    For the last few years, many feature extraction methods have been proposed based on biological signals. Among these, brain signals have the advantage that they can be obtained even from people with peripheral nervous system damage. Motor imagery electroencephalograms (EEG) are inexpensive to measure, offer a high temporal resolution, and are intuitive. They have therefore received a significant amount of attention in various fields, including signal processing, cognitive science, and medicine. The common spatial pattern (CSP) algorithm is a useful method for feature extraction from motor imagery EEG. However, performance degrades in a small-sample setting (SSS), because CSP depends on sample-based covariance estimates. Since the active frequency range differs for each subject, it is also inconvenient to set the frequency range anew every time. In this paper, we propose a feature extraction method based on a filter bank to solve these problems. The proposed method consists of five steps. First, the motor imagery EEG is divided using a filter bank. Second, the regularized CSP (R-CSP) is applied to the divided EEG. Third, we select features according to mutual information based on the individual feature algorithm. Fourth, parameter sets are selected for the ensemble. Finally, we classify using an ensemble based on the selected features. The brain-computer interface competition III data set IVa is used to evaluate the performance of the proposed method. The proposed method improves the mean classification accuracy by 12.34%, 11.57%, 9%, 4.95%, and 4.47% compared with CSP, SR-CSP, R-CSP, filter bank CSP (FBCSP), and SR-FBCSP, respectively. Compared with the filter bank R-CSP ( , ), which is a parameter selection version of the proposed method, the classification accuracy is improved by 3.49%. In particular, the proposed method shows a large improvement in performance in the SSS.
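As a rough sketch of the R-CSP building block described in this record (step two above), spatial filters can be computed from shrinkage-regularized class covariances via a generalized eigendecomposition. This is a generic illustration under assumed conventions, not the paper's implementation; the function names and the shrinkage parameter `alpha` are illustrative:

```python
import numpy as np
from scipy.linalg import eigh

def regularized_csp(trials_a, trials_b, alpha=0.1, n_filters=3):
    """Shrinkage-regularized CSP sketch.

    trials_* have shape (n_trials, n_channels, n_samples). alpha shrinks each
    class covariance toward the identity, which stabilizes the estimate in
    small-sample settings. Returns (n_channels, 2*n_filters) spatial filters.
    """
    def avg_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))   # trace-normalize each trial
        return np.mean(covs, axis=0)

    n_ch = trials_a.shape[1]
    ca, cb = avg_cov(trials_a), avg_cov(trials_b)
    # Shrinkage regularization toward the identity
    ca = (1 - alpha) * ca + alpha * np.eye(n_ch) / n_ch
    cb = (1 - alpha) * cb + alpha * np.eye(n_ch) / n_ch
    # Generalized eigenproblem: Ca w = lambda (Ca + Cb) w
    vals, vecs = eigh(ca, ca + cb)
    order = np.argsort(vals)               # ascending eigenvalues
    # Filters from both ends of the spectrum discriminate the two classes
    picks = np.r_[order[:n_filters], order[-n_filters:]]
    return vecs[:, picks]

def log_var_features(trials, w):
    """Standard log-variance features of spatially filtered trials."""
    feats = []
    for x in trials:
        z = w.T @ x
        v = np.var(z, axis=1)
        feats.append(np.log(v / v.sum()))
    return np.array(feats)
```

In the paper's pipeline this step would be applied once per filter-bank sub-band, with the resulting features pooled for the mutual-information selection step.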

  18. Spatial Characterization of Polycyclic Aromatic Hydrocarbons in 2008 TC3 Samples

    Science.gov (United States)

    Sabbah, Hassan; Morrow, A.; Zare, R. N.; Jenniskens, P.

    2009-09-01

    Hassan Sabbah1, Amy L. Morrow1, Richard N. Zare1 and Petrus Jenniskens2; 1Stanford University, Stanford, California 94305; 2SETI Institute, Carl Sagan Center, 515 North Whisman Road, Mountain View, California 94043, USA. On October 6, 2008, a small asteroid (2-3 meters) was observed in outer space. On October 7, 2008, it entered the Earth's atmosphere, creating a fireball over Northern Sudan. Some 280 meteorites were collected by the University of Khartoum. In order to explore the existence of organic materials, specifically polycyclic aromatic hydrocarbons (PAHs), we applied two-step laser desorption laser ionization mass spectrometry (L2MS) to selected fragments. This technique desorbs the solid material into the gas phase with a pulsed infrared laser beam, with no fragmentation, followed by resonance-enhanced multiphoton ionization to analyze the PAH content. L2MS has previously been applied to an array of extraterrestrial objects, including interplanetary dust particles (IDPs), carbonaceous chondrites, and comet coma particles. Moreover, spatially resolved measurement of PAHs in 2008 TC3 samples was achieved to explore the heterogeneity within individual fragments. The results of these studies and their contribution to understanding the formation of this asteroid will be discussed.

  19. Modeling Alveolar Epithelial Cell Behavior In Spatially Designed Hydrogel Microenvironments

    Science.gov (United States)

    Lewis, Katherine Jean Reeder

    The alveolar epithelium consists of two cell phenotypes, elongated alveolar type I cells (AT1) and rounded alveolar type II cells (ATII), and exists in a complex three-dimensional environment as a polarized cell layer attached to a thin basement membrane and enclosing a roughly spherical lumen. Closely surrounding the alveolar cysts are capillary endothelial cells as well as interstitial pulmonary fibroblasts. Many factors are thought to influence alveolar epithelial cell differentiation during lung development and wound repair, including physical and biochemical signals from the extracellular matrix (ECM), and paracrine signals from the surrounding mesenchyme. In particular, disrupted signaling between the alveolar epithelium and local fibroblasts has been implicated in the progression of several pulmonary diseases. However, given the complexity of alveolar tissue architecture and the multitude of signaling pathways involved, designing appropriate experimental platforms for this biological system has been difficult. In order to isolate key factors regulating cellular behavior, the researcher ideally should have control over biophysical properties of the ECM, as well as the ability to organize multiple cell types within the scaffold. This thesis aimed to develop a 3D synthetic hydrogel platform to control alveolar epithelial cyst formation, which could then be used to explore how extracellular cues influence cell behavior in a tissue-relevant cellular arrangement. To accomplish this, a poly(ethylene glycol) (PEG) hydrogel network containing enzymatically-degradable crosslinks and bioadhesive pendant peptides was employed as a base material for encapsulating primary alveolar epithelial cells. 
First, an array of microwells of various cross-sectional shapes was photopatterned into a PEG gel containing photo-labile crosslinks, and primary ATII cells were seeded into the wells to examine the role of geometric confinement on differentiation and multicellular arrangement

  20. Assessment of long-term gas sampling design at two commercial manure-belt layer barns.

    Science.gov (United States)

    Chai, Li-Long; Ni, Ji-Qin; Chen, Yan; Diehl, Claude A; Heber, Albert J; Lim, Teng T

    2010-06-01

    Understanding temporal and spatial variations of aerial pollutant concentrations is important for designing air quality monitoring systems. In long-term and continuous air quality monitoring in large livestock and poultry barns, these systems usually use location-shared analyzers and sensors and can only sample air at a limited number of locations. To assess the validity of the gas sampling design at a commercial layer farm, a new methodology was developed to map pollutant gas concentrations using portable sensors under steady-state or quasi-steady-state barn conditions. Three assessment tests were conducted from December 2008 to February 2009 in two manure-belt layer barns. Each barn was 140.2 m long and 19.5 m wide and had 250,000 birds. Each test included four measurements of ammonia and carbon dioxide concentrations at 20 locations that covered all operating fans, including the six fans used in the long-term sampling that represented three zones along the lengths of the barns, to generate data for complete-barn monitoring. To simulate the long-term monitoring, gas concentrations from the six long-term sampling locations were extracted from the 20 assessment locations. Statistical analyses were performed to test the variances (F-test) and sample means (t-test) between the 6- and 20-sample data. The study clearly demonstrated ammonia and carbon dioxide concentration gradients, characterized by increasing concentrations from the west to east ends of the barns, following the under-cage manure-belt travel direction. Mean concentrations increased from 7.1 to 47.7 parts per million (ppm) for ammonia and from 2303 to 3454 ppm for carbon dioxide from the west to the east of the barns. Variations of mean gas concentrations were much less apparent between the south and north sides of the barns: means were 21.2 and 20.9 ppm for ammonia and 2979 and 2951 ppm for carbon dioxide, respectively.
The null hypotheses that the variances and means between the 6- and 20
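The variance (F-test) and mean (t-test) comparisons between the reduced and complete sampling sets described above can be sketched generically; the concentration values below are hypothetical, and the paper's exact test details may differ:

```python
import numpy as np
from scipy import stats

def compare_sampling_designs(full, subset):
    """Compare gas concentrations from a reduced set of sampling locations
    against a complete-barn survey: two-sided F-test on variances and a
    Welch t-test on means. Illustrative only."""
    # Two-sided F-test for equality of variances
    f = np.var(subset, ddof=1) / np.var(full, ddof=1)
    dfn, dfd = len(subset) - 1, len(full) - 1
    p_var = 2 * min(stats.f.cdf(f, dfn, dfd), stats.f.sf(f, dfn, dfd))
    # t-test for equality of means (Welch; no equal-variance assumption)
    t, p_mean = stats.ttest_ind(subset, full, equal_var=False)
    return p_var, p_mean

# Hypothetical ammonia concentrations (ppm) at 20 assessment locations,
# and the 6 of them that stand in for the long-term monitoring points
full = np.array([7.1, 9.4, 12.0, 14.8, 16.2, 18.5, 20.1, 21.3, 23.0, 25.4,
                 27.8, 29.9, 31.5, 33.2, 35.7, 38.1, 40.6, 43.2, 45.5, 47.7])
subset = full[[0, 4, 8, 11, 15, 19]]
p_var, p_mean = compare_sampling_designs(full, subset)
```

Large p-values would indicate that the six long-term locations adequately represent the complete-barn variance and mean, which is the question the assessment tests address.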

  1. Sampling design and procedures for fixed surface-water sites in the Georgia-Florida coastal plain study unit, 1993

    Science.gov (United States)

    Hatzell, H.H.; Oaksford, E.T.; Asbury, C.E.

    1995-01-01

    The implementation of design guidelines for the National Water-Quality Assessment (NAWQA) Program has resulted in the development of new sampling procedures and the modification of existing procedures commonly used in the Water Resources Division of the U.S. Geological Survey. The Georgia-Florida Coastal Plain (GAFL) study unit began the intensive data collection phase of the program in October 1992. This report documents the implementation of the NAWQA guidelines by describing the sampling design and procedures for collecting surface-water samples in the GAFL study unit in 1993. This documentation is provided for agencies that use water-quality data and for future study units that will be entering the intensive phase of data collection. The sampling design is intended to account for large- and small-scale spatial variations, and temporal variations, in water quality for the study area. Nine fixed sites were selected in drainage basins of different sizes and different land-use characteristics located in different land-resource provinces. Each of the nine fixed sites was sampled regularly for a combination of six constituent groups composed of physical and chemical constituents: field measurements, major ions and metals, nutrients, organic carbon, pesticides, and suspended sediments. Some sites were also sampled during high-flow conditions and storm events. Discussion of the sampling procedure is divided into three phases: sample collection, sample splitting, and sample processing. At the four fixed sites that were sampled intensively, a cone splitter was used to split approximately nine liters of collected stream water for the analysis of all constituent groups except organic carbon. An example of the sample splitting schemes designed to provide the sample volumes required for each constituent group is described in detail.
Information about onsite sample processing has been organized into a flowchart that describes a pathway for each of

  2. Investigating the tradeoffs between spatial resolution and diffusion sampling for brain mapping with diffusion tractography: time well spent?

    Science.gov (United States)

    Calabrese, Evan; Badea, Alexandra; Coe, Christopher L; Lubach, Gabriele R; Styner, Martin A; Johnson, G Allan

    2014-11-01

    Interest in mapping white matter pathways in the brain has peaked with the recognition that altered brain connectivity may contribute to a variety of neurologic and psychiatric diseases. Diffusion tractography has emerged as a popular method for postmortem brain mapping initiatives, including the ex-vivo component of the human connectome project, yet it remains unclear to what extent computer-generated tracks fully reflect the actual underlying anatomy. Of particular concern is the fact that diffusion tractography results vary widely depending on the choice of acquisition protocol. The two major acquisition variables that consume scan time, spatial resolution, and diffusion sampling, can each have profound effects on the resulting tractography. In this analysis, we determined the effects of the temporal tradeoff between spatial resolution and diffusion sampling on tractography in the ex-vivo rhesus macaque brain, a close primate model for the human brain. We used the wealth of autoradiography-based connectivity data available for the rhesus macaque brain to assess the anatomic accuracy of six time-matched diffusion acquisition protocols with varying balance between spatial and diffusion sampling. We show that tractography results vary greatly, even when the subject and the total acquisition time are held constant. Further, we found that focusing on either spatial resolution or diffusion sampling at the expense of the other is counterproductive. A balanced consideration of both sampling domains produces the most anatomically accurate and consistent results. Copyright © 2014 Wiley Periodicals, Inc.

  3. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional simulations with LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
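A minimal unconditional sketch of the LULHS idea: Latin Hypercube Sampling of standard normals across realizations, correlated by the lower-triangular (Cholesky/LU) factor of a spatial covariance matrix. The exponential covariance model and all names here are illustrative assumptions, not taken from the abstract:

```python
import numpy as np
from scipy.special import erfinv

def lulhs_unconditional(coords, corr_length, n_real, rng=None):
    """Unconditional LULHS sketch.

    coords: (n, 2) grid-node coordinates. For each node, n_real values are
    drawn by Latin Hypercube stratification across realizations, mapped to
    standard normals, then spatially correlated via the Cholesky factor of
    an assumed exponential covariance model. Returns (n, n_real)."""
    rng = np.random.default_rng(rng)
    n = len(coords)
    # Exponential covariance from pairwise distances
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    cov = np.exp(-d / corr_length)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # lower-triangular factor
    # Latin hypercube: each node gets one stratified uniform per realization
    strata = np.tile(np.arange(n_real), (n, 1))
    u = (rng.permuted(strata, axis=1) + rng.uniform(size=(n, n_real))) / n_real
    z = np.sqrt(2) * erfinv(2 * u - 1)               # uniform -> std normal
    return L @ z                                     # correlated field values
```

The stratification guarantees that, for every node, the marginal sample spreads evenly over the distribution, which is what reduces the number of realizations needed relative to simple random sampling.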

  4. Software Architecture Design for Spatially-Indexed Media in Smart Environments

    Directory of Open Access Journals (Sweden)

    SCHIPOR, O.-A.

    2017-05-01

    Full Text Available We introduce in this work a new software architecture design, based on well-established web communication protocols and scripting languages, for implementing spatially-indexed media in smart environments. Our approach follows specific design guidelines. Our concept of spatially-indexed media enables users to readily instantiate mappings between digital content and specific regions of the physical space. We present an implementation of the architecture using a motion capture system, a large visualization display, and several smart devices. We also present an experimental evaluation of the new software architecture, reporting response times as a function of the complexity of the physical-digital environment.

  5. A General-Purpose Spatial Survey Design for Collaborative Science and Monitoring of Global Environmental Change: The Global Grid

    Directory of Open Access Journals (Sweden)

    David M. Theobald

    2016-09-01

    Full Text Available Recent guidance on environmental modeling and global land-cover validation stresses the need for a probability-based design. Additionally, spatial balance has been recommended because it ensures more efficient sampling, which is particularly relevant for understanding land-use change. In this paper I describe a global sample design and database called the Global Grid (GG) that has both of these statistical characteristics, as well as being flexible, multi-scale, and globally comprehensive. The GG is intended to facilitate collaborative science and monitoring of land changes among local, regional, and national groups of scientists and citizens, and it is provided in a variety of open-source formats to promote collaborative and citizen science. Since the GG sample grid is provided at multiple scales and is globally comprehensive, it provides a universal, readily-available sample. It also supports unequal-probability sample designs through filtering sample locations by user-defined strata. The GG is not appropriate for use at latitudes above ±85° because the shape and topological distortion of quadrants becomes extreme near the poles. Additionally, the file sizes of the GG datasets are very large at fine scales (resolution ~600 m × 600 m) and require a 64-bit integer representation.

  6. A two-phase sampling design for increasing detections of rare species in occupancy surveys

    Science.gov (United States)

    Pacifici, Krishna; Dorazio, Robert M.; Dorazio, Michael J.

    2012-01-01

    1. Occupancy estimation is a commonly used tool in ecological studies owing to the ease with which data can be collected and the large spatial extent that can be covered. One major obstacle to using an occupancy-based approach is the complications associated with designing and implementing an efficient survey. These logistical challenges become magnified when working with rare species, when effort can be wasted in areas with few or no individuals. 2. Here, we develop a two-phase sampling approach that mitigates these problems by using a design that places more effort in areas with higher predicted probability of occurrence. We compare our new sampling design to traditional single-season occupancy estimation under a range of conditions and population characteristics. We develop an intuitive measure of predictive error to compare the two approaches and use simulations to assess the relative accuracy of each approach. 3. Our two-phase approach exhibited lower predictive error rates than the traditional single-season approach in highly spatially correlated environments. The difference was greatest when detection probability was high (0·75), regardless of the habitat or sample size. When the true occupancy rate was below 0·4 (0·05-0·4), we found that allocating 25% of the sample to the first phase resulted in the lowest error rates. 4. In the majority of scenarios, the two-phase approach showed lower error rates than the traditional single-season approach, suggesting our new approach is fairly robust to a broad range of conditions and design factors and merits use in a wide variety of settings. 5. Synthesis and applications. Conservation and management of rare species are a challenging task facing natural resource managers. It is critical for studies involving rare species to efficiently allocate effort and resources as they are usually of a finite nature. We believe our approach provides a framework for optimal allocation of effort while
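The gist of the two-phase design, spending a fraction of the effort on an initial random sample and directing the remainder toward sites with high predicted occurrence, can be caricatured in a short simulation. The detection probability, the logistic "truth", and the crude two-stratum occurrence model below are all illustrative assumptions, not the authors' estimator:

```python
import numpy as np

def two_phase_allocation(covariate, total_effort, phase1_frac=0.25, rng=None):
    """Simulation sketch of two-phase survey allocation.

    Phase 1 surveys a random subset of sites; a crude occurrence model
    (detection rate in low/high covariate strata) then steers phase 2
    toward sites with higher predicted occupancy."""
    rng = np.random.default_rng(rng)
    n_sites = len(covariate)
    true_psi = 1 / (1 + np.exp(-(covariate - covariate.mean())))  # simulated truth
    occupied = rng.uniform(size=n_sites) < true_psi

    # Phase 1: simple random sample with an assumed detection probability of 0.75
    n1 = int(phase1_frac * total_effort)
    phase1 = rng.choice(n_sites, size=n1, replace=False)
    detected = occupied[phase1] & (rng.uniform(size=n1) < 0.75)

    # Crude occurrence model: detection rate in low/high covariate halves
    hi = covariate[phase1] > np.median(covariate)
    p_hi = detected[hi].mean() if hi.any() else 0.0
    p_lo = detected[~hi].mean() if (~hi).any() else 0.0

    # Phase 2: remaining effort goes to sites with higher predicted occurrence
    remaining = np.setdiff1d(np.arange(n_sites), phase1)
    pred = np.where(covariate[remaining] > np.median(covariate), p_hi, p_lo)
    phase2 = remaining[np.argsort(-pred)][: total_effort - n1]
    return phase1, phase2
```

With `phase1_frac=0.25` this mirrors the allocation the authors found optimal for low true occupancy rates, though their predictive model is far richer than the two-stratum caricature used here.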

  7. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics, which assumes that data are independently and identically distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  8. Spatial filter lens design for the main laser of the National Ignition Facility

    International Nuclear Information System (INIS)

    Korniski, R.J.

    1998-01-01

    The National Ignition Facility (NIF), being designed and constructed at Lawrence Livermore National Laboratory (LLNL), comprises 192 laser beams. The lasing medium is neodymium in phosphate glass with a fundamental frequency (1ω) of 1.053 μm. Sum frequency generation in a pair of conversion crystals (KDP/KD*P) will produce 1.8 megajoules of third-harmonic light (3ω, or λ = 0.351 μm) at the target. The purpose of this paper is to provide the lens design community with the current lens design details of the large optics in the Main Laser. This paper describes the lens design configuration and design considerations of the Main Laser. The Main Laser is 123 meters long and includes two spatial filters, one 13.5 meters and one 60 meters long. These spatial filters perform crucial beam filtering and relaying functions. We shall describe the significant lens design aspects of these spatial filter lenses which allow them to successfully deliver the appropriate beam characteristics onto the target. For an overview of NIF, please see ''Optical system design of the National Ignition Facility,'' by R. Edward English et al., also found in this volume.

  9. Generating Improved Experimental Designs with Spatially and Genetically Correlated Observations Using Mixed Models

    Directory of Open Access Journals (Sweden)

    Lazarus K. Mramba

    2018-03-01

    Full Text Available The aim of this study was to generate and evaluate the efficiency of improved field experiments while simultaneously accounting for spatial correlations and different levels of genetic relatedness using a mixed models framework for orthogonal and non-orthogonal designs. Optimality criteria and a search algorithm were implemented to generate randomized complete block (RCB), incomplete block (IB), augmented block (AB) and unequally replicated (UR) designs. Several conditions were evaluated, including the size of the experiment, levels of heritability, and optimality criteria. For RCB designs with half-sib or full-sib families, the optimization procedure yielded important improvements in the presence of mild to strong spatial correlation levels and relatively low heritability values. Also, for these designs, improvements in terms of overall design efficiency (ODE%) reached values of up to 8.7%, but these gains varied depending on the evaluated conditions. In general, for all evaluated designs, higher ODE% values were achieved with genetically unrelated individuals compared to experiments with half-sib and full-sib families. As expected, accuracy of prediction of genetic values improved as levels of heritability and spatial correlation increased. This study has demonstrated that important improvements in design efficiency and prediction accuracy can be achieved by optimizing how the levels of a treatment are assigned to the experimental units.

  10. Spatial Distribution and Sampling Plans With Fixed Level of Precision for Citrus Aphids (Hom., Aphididae) on Two Orange Species.

    Science.gov (United States)

    Kafeshani, Farzaneh Alizadeh; Rajabpour, Ali; Aghajanzadeh, Sirous; Gholamian, Esmaeil; Farkhari, Mohammad

    2018-04-02

    Aphis spiraecola Patch, Aphis gossypii Glover, and Toxoptera aurantii Boyer de Fonscolombe are three important aphid pests of citrus orchards. In this study, the spatial distributions of these aphids on two orange species, Satsuma mandarin and Thomson navel, were evaluated using Taylor's power law and Iwao's patchiness regression. In addition, a fixed-precision sequential sampling plan was developed for each aphid species on each host plant using Green's model at precision levels of 0.25 and 0.1. The results revealed that the spatial distribution parameters, and therefore the sampling plan, differed significantly with aphid and host plant species. Taylor's power law provided a better fit to the data than Iwao's patchiness regression. Except for T. aurantii on Thomson navel orange, the spatial distribution patterns of the aphids were aggregated on both citrus species; T. aurantii had a regular dispersion pattern on Thomson navel orange. The optimum sample size of the aphids varied from 30-2061 shoots on Satsuma mandarin and 1-1622 shoots on Thomson navel orange, depending on aphid species and the desired precision level. Calculated stop lines of the aphid species on Satsuma mandarin and Thomson navel orange ranged from 0.48 to 19 and from 0.19 to 80.4 aphids per 24 shoots, respectively, according to aphid species and desired precision level. The performance of the sampling plan was validated by resampling analysis using the Resampling for Validation of Sampling Plans (RVSP) software. This sampling program is useful for IPM programs targeting these aphids in citrus orchards.
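The two ingredients named above can be sketched generically: fitting Taylor's power law (variance = a·mean^b) by log-log regression, and computing Green's fixed-precision stop line from the fitted coefficients. These are the standard textbook forms; the aphid-specific coefficients from the paper are not reproduced here:

```python
import numpy as np

def taylor_power_law(means, variances):
    """Fit Taylor's power law s^2 = a * m^b by log-log least squares.
    means/variances are per-sampling-occasion statistics."""
    b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
    return np.exp(log_a), b

def green_stop_line(n, a, b, precision=0.25):
    """Green's fixed-precision stop line: the cumulative count T_n at which
    sampling may stop after n units (e.g. shoots). Derived from the target
    precision D (SE/mean): D^2 = a * m^(b-2) / n with m = T_n / n."""
    return (precision**2 / a) ** (1 / (b - 2)) * n ** ((b - 1) / (b - 2))
```

In a field program, one would plot the cumulative aphid count against the number of shoots inspected and stop as soon as the running total crosses the stop line for the chosen precision level (0.25 or 0.1 in the paper).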

  11. Design unbiased estimation in line intersect sampling using segmented transects

    Science.gov (United States)

    David L.R. Affleck; Timothy G. Gregoire; Harry T. Valentine; Harry T. Valentine

    2005-01-01

    In many applications of line intersect sampling, transects consist of multiple, connected segments in a prescribed configuration. The relationship between the transect configuration and the selection probability of a population element is illustrated, and a consistent sampling protocol, applicable to populations composed of arbitrarily shaped elements, is proposed. It...

  12. Spatial interpolation

    NARCIS (Netherlands)

    Stein, A.

    1991-01-01

    The theory and practical application of techniques of statistical interpolation are studied in this thesis, and new developments in multivariate spatial interpolation and the design of sampling plans are discussed. Several applications to studies in soil science are

  13. Sampling effects on the identification of roadkill hotspots: Implications for survey design.

    Science.gov (United States)

    Santos, Sara M; Marques, J Tiago; Lourenço, André; Medinas, Denis; Barbosa, A Márcia; Beja, Pedro; Mira, António

    2015-10-01

    Although locating wildlife roadkill hotspots is essential to mitigate road impacts, the influence of study design on hotspot identification remains uncertain. We evaluated how sampling frequency affects the accuracy of hotspot identification, using a dataset of vertebrate roadkills (n = 4427) recorded over a year of daily surveys along 37 km of roads. "True" hotspots were identified using this baseline dataset, as the 500-m segments where the number of road-killed vertebrates exceeded the upper 95% confidence limit of the mean, assuming a Poisson distribution of roadkills per segment. "Estimated" hotspots were identified likewise, using datasets representing progressively lower sampling frequencies, which were produced by extracting data from the baseline dataset at appropriate time intervals (1-30 days). Overall, 24.3% of segments were "true" hotspots, concentrating 40.4% of roadkills. For different groups, "true" hotspots accounted for 6.8% (bats) to 29.7% (small birds) of road segments, concentrating up to 60% (lizards, lagomorphs, carnivores) of roadkills. Spatial congruence between "true" and "estimated" hotspots declined rapidly with increasing time interval between surveys, due primarily to increasing false negatives (i.e., missing "true" hotspots). There were also false positives (i.e., wrong "estimated" hotspots), particularly at low sampling frequencies. The decay in spatial accuracy with increasing time interval between surveys was greater for smaller-bodied (amphibians, reptiles, small birds, small mammals) than for larger-bodied species (birds of prey, hedgehogs, lagomorphs, carnivores). Results suggest that widely used surveys at weekly or longer intervals may produce poor estimates of roadkill hotspots, particularly for small-bodied species. Surveying daily or at two-day intervals may be required to achieve high accuracy in hotspot identification for multiple species. Copyright © 2015 Elsevier Ltd. All rights reserved.
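The hotspot rule used above (a segment's count exceeding the upper 95% confidence limit of the mean, under a Poisson assumption) can be sketched directly; the per-segment counts below are hypothetical:

```python
import numpy as np
from scipy import stats

def poisson_hotspots(counts, alpha=0.05):
    """Flag road segments as hotspots when the roadkill count exceeds the
    upper (1 - alpha) limit of a Poisson distribution with the observed
    mean count per segment."""
    mean = np.mean(counts)
    threshold = stats.poisson.ppf(1 - alpha, mean)  # upper 95% limit
    return counts > threshold

# Hypothetical counts of roadkills per 500-m segment
counts = np.array([0, 1, 2, 0, 3, 1, 0, 12, 2, 1, 9, 0])
hot = poisson_hotspots(counts)
```

Running the same rule on thinned datasets (every 7th day, every 14th day, ...) and comparing the flagged segments against the daily-survey baseline reproduces the congruence analysis the paper describes.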

  14. A neural network-based optimal spatial filter design method for motor imagery classification.

    Directory of Open Access Journals (Sweden)

    Ayhan Yuksel

    Full Text Available In this study, a novel spatial filter design method is introduced. Spatial filtering is an important processing step for feature extraction in motor imagery-based brain-computer interfaces. This paper introduces a new motor imagery signal classification method combined with spatial filter optimization. We simultaneously train the spatial filter and the classifier using a neural network approach. The proposed spatial filter network (SFN is composed of two layers: a spatial filtering layer and a classifier layer. These two layers are linked to each other with non-linear mapping functions. The proposed method addresses two shortcomings of the common spatial patterns (CSP algorithm. First, CSP aims to maximize the between-classes variance while ignoring the minimization of within-classes variances. Consequently, the features obtained using the CSP method may have large within-classes variances. Second, the maximizing optimization function of CSP increases the classification accuracy indirectly because an independent classifier is used after the CSP method. With SFN, we aimed to maximize the between-classes variance while minimizing within-classes variances and simultaneously optimizing the spatial filter and the classifier. To classify motor imagery EEG signals, we modified the well-known feed-forward structure and derived forward and backward equations that correspond to the proposed structure. We tested our algorithm on simple toy data. Then, we compared the SFN with conventional CSP and its multi-class version, called one-versus-rest CSP, on two data sets from BCI competition III. The evaluation results demonstrate that SFN is a good alternative for classifying motor imagery EEG signals with increased classification accuracy.

  15. Using remote sensing images to design optimal field sampling schemes

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-08-01

    Full Text Available Case studies of optimized field sampling schemes include representing the overall distribution of a particular mineral and deriving optimal exploration target zones. Continuum removal is applied for vegetation [13, 27, 46]. The convex hull transform is a method of normalizing spectra [16, 41]. The convex hull technique is analogous to fitting a rubber band over a spectrum to form a continuum. Figure 5 shows the concept of the convex hull transform. The difference between the hull and the original spectrum...
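The rubber-band idea can be sketched with a small upper-convex-hull routine: fit the hull over the reflectance spectrum and divide by it, so that absorption features show up as dips below 1.0. This is a generic illustration, not the implementation used in the cited work:

```python
import numpy as np

def continuum_removal(wavelengths, reflectance):
    """Continuum removal: build the upper convex hull over the spectrum
    (the 'rubber band'), interpolate it at every wavelength, and divide
    the spectrum by it."""
    n = len(wavelengths)
    hull = [0]                         # indices of upper-hull vertices
    for i in range(1, n):
        # Pop vertices that fall below the chord to the new point
        while len(hull) >= 2:
            x1, y1 = wavelengths[hull[-2]], reflectance[hull[-2]]
            x2, y2 = wavelengths[hull[-1]], reflectance[hull[-1]]
            cross = (x2 - x1) * (reflectance[i] - y1) \
                  - (wavelengths[i] - x1) * (y2 - y1)
            if cross >= 0:
                hull.pop()
            else:
                break
        hull.append(i)
    continuum = np.interp(wavelengths, wavelengths[hull], reflectance[hull])
    return reflectance / continuum
```

Because the hull always touches the spectrum from above, the continuum-removed values lie in (0, 1], which is what makes absorption-band depths comparable across spectra with different overall brightness.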

  16. A stratified two-stage sampling design for digital soil mapping in a Mediterranean basin

    Science.gov (United States)

    Blaschek, Michael; Duttmann, Rainer

    2015-04-01

    The quality of environmental modelling results often depends on reliable soil information. In order to obtain soil data in an efficient manner, several sampling strategies are at hand, depending on the level of prior knowledge and the overall objective of the planned survey. This study focuses on the collection of soil samples considering available continuous secondary information in an undulating, 16 km² river catchment near Ussana in southern Sardinia (Italy). A design-based, stratified, two-stage sampling design has been applied, aiming at the spatial prediction of soil property values at individual locations. The stratification was based on quantiles of the density functions of two land-surface parameters, the topographic wetness index and potential incoming solar radiation, derived from a digital elevation model. Combined with four main geological units, the applied procedure led to 30 different classes in the given test site. Up to six polygons of each available class were selected randomly, excluding areas smaller than 1 ha to avoid incorrect location of the points in the field. Further exclusion rules were applied before polygon selection, masking out roads and buildings using a 20 m buffer. The selection procedure was repeated ten times, and the set of polygons with the best geographical spread was chosen. Finally, exact point locations were selected randomly from inside the chosen polygon features. A second selection based on the same stratification and following the same methodology (selecting one polygon instead of six) was made in order to create an appropriate validation set. Supplementary samples were obtained during a second survey focusing on polygons that had either not been considered during the first phase at all or were not adequately represented with respect to feature size. In total, both field campaigns produced an interpolation set of 156 samples and a validation set of 41 points.
The selection of sample point locations has been done using
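The quantile-based stratification described in this record can be sketched as follows. This is a minimal illustration with synthetic terrain attributes; the variable names, the choice of three quantile classes per parameter, and the synthetic values are assumptions for the example, not the study's Ussana dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic land-surface parameters for 10 000 raster cells
# (illustrative values only, not the Ussana catchment data)
twi = rng.gamma(shape=2.0, scale=3.0, size=10_000)            # topographic wetness index
radiation = rng.normal(loc=1500.0, scale=200.0, size=10_000)  # potential solar radiation
geology = rng.integers(0, 4, size=10_000)                     # four main geological units

def quantile_bins(x, n_bins):
    """Assign each value to one of n_bins equal-frequency classes."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, edges)

# Combine quantile classes of both parameters with the geological units
twi_cls = quantile_bins(twi, 3)
rad_cls = quantile_bins(radiation, 3)
stratum = geology * 9 + twi_cls * 3 + rad_cls   # unique id per combination

n_strata = len(np.unique(stratum))
print(n_strata)   # up to 3 x 3 x 4 = 36 candidate strata
```

In the study itself the combination of quantile classes with four geological units yielded 30 classes; sampling polygons were then drawn at random within each stratum after excluding small (< 1 ha) and buffered areas.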

  17. Accuracy assessment of the National Forest Inventory map of Mexico: sampling designs and the fuzzy characterization of landscapes

    Directory of Open Access Journals (Sweden)

    Stéphane Couturier

    2009-10-01

    Full Text Available There is no record so far in the literature of a comprehensive method to assess the accuracy of regional-scale Land Cover/Land Use (LCLU) maps in the sub-tropical belt. The elevated biodiversity and the presence of highly fragmented classes hamper the use of sampling designs commonly employed in previous assessments, mainly of temperate zones. A sampling design for assessing the accuracy of the Mexican National Forest Inventory (NFI) map at community level is presented. A pilot study was conducted on the Cuitzeo Lake watershed region, covering 400 000 ha of the 2000 Landsat-derived map. Various sampling designs were tested in order to find a trade-off between operational costs, a good spatial distribution of the sample, and the inclusion of all scarcely distributed classes (‘rare classes’). A two-stage sampling design, where the selection of Primary Sampling Units (PSU) was done under separate schemes for commonly and scarcely distributed classes, showed the best characteristics. A total of 2023 point secondary sampling units were verified against their NFI map label. Issues regarding the assessment strategy and trends of class confusions are discussed.

  18. Designing waveforms for temporal encoding using a frequency sampling method

    DEFF Research Database (Denmark)

    Gran, Fredrik; Jensen, Jørgen Arendt

    2007-01-01

    In this paper a method for designing waveforms for temporal encoding in medical ultrasound imaging is described. The method is based on least squares optimization and is used to design nonlinear frequency modulated signals for synthetic transmit aperture imaging. The proposed waveform was compared to a linear frequency modulated signal with amplitude tapering, previously used in clinical studies for synthetic transmit aperture imaging. The latter had a relatively flat spectrum, which implied that the waveform tried to excite all frequencies, including ones with low amplification. The proposed waveform, on the other hand, was designed so that only frequencies where the transducer had a large amplification were excited. Hereby, unnecessary heating of the transducer could be avoided and the signal-to-noise ratio could be increased. The experimental ultrasound scanner RASMUS was used to evaluate...

  19. Passive Sampling to Capture the Spatial Variability of Coarse Particles by Composition in Cleveland, OH

    Science.gov (United States)

    Passive samplers deployed at 25 sites for three week-long intervals were used to characterize spatial variability in the mass and composition of coarse particulate matter (PM10-2.5) in Cleveland, OH in summer 2008. The size and composition of individual particles deter...

  20. Simplified methods for spatial sampling: application to first-phase data of the Italian National Forest Inventory (INFC) in Sicily

    Directory of Open Access Journals (Sweden)

    Cullotta S

    2006-01-01

    Full Text Available Methodological approaches able to integrate data from sample plots with cartographic processes are widely applied. Based on mathematical-statistical techniques, spatial analysis allows the exploration and spatialization of geographic data. Starting from the point information on land use types obtained from the dataset of the first phase of the ongoing new Italian NFI (INFC), a spatialization of land cover classes was carried out using the Inverse Distance Weighting (IDW) method. In order to validate the obtained results, an overlay with other vector land use data was carried out. In particular, the overlay compared data at different scales, evaluating differences in terms of the degree of correspondence between the interpolated and reference land cover.
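The IDW spatialization named in this record can be sketched generically. This is the standard inverse-distance-weighting estimator applied to point values (for land cover classes it would typically be applied to indicator values of each class); the coordinates and values below are invented for illustration:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting: predict z at each query point as a
    distance-weighted average of the known sample values."""
    xy_known = np.asarray(xy_known, dtype=float)
    z_known = np.asarray(z_known, dtype=float)
    preds = []
    for q in np.atleast_2d(np.asarray(xy_query, dtype=float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                       # query coincides with a sample point
            preds.append(z_known[d.argmin()])
            continue
        w = 1.0 / d**power                      # weights decay with distance
        preds.append(np.sum(w * z_known) / np.sum(w))
    return np.array(preds)

# Toy example: three sample plots with a coded land-cover attribute
pts = [(0, 0), (10, 0), (0, 10)]
vals = [1.0, 3.0, 5.0]
print(idw(pts, vals, [(0, 0), (5, 5)]))  # → [1. 3.]
```

At a sampled location IDW reproduces the observed value exactly; elsewhere the prediction is a convex combination of the samples, so it never extrapolates outside their range.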

  1. Implications of Clinical Trial Design on Sample Size Requirements

    OpenAIRE

    Leon, Andrew C.

    2008-01-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two par...

  2. Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.

    Science.gov (United States)

    Ojeda, Mario Miguel; Sahai, Hardeo

    2002-01-01

    Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…

  3. An instrument design and sample strategy for measuring soil respiration in the coastal temperate rain forest

    Science.gov (United States)

    Nay, S. M.; D'Amore, D. V.

    2009-12-01

    The coastal temperate rainforest (CTR) along the northwest coast of North America is a large and complex mosaic of forests and wetlands located on an undulating terrain ranging from sea level to thousands of meters in elevation. This biome stores a dynamic portion of the total carbon stock of North America. The fate of the terrestrial carbon stock is of concern due to the potential for mobilization and export of this store to both the atmosphere as carbon respiration flux and the ocean as dissolved organic and inorganic carbon flux. Soil respiration is the largest export vector in the system and must be accurately measured to gain any comprehensive understanding of how carbon moves through this system. Suitable monitoring tools capable of measuring carbon fluxes at small spatial scales are essential for our understanding of carbon dynamics at larger spatial scales within this complex assemblage of ecosystems. We have adapted instrumentation and developed a sampling strategy for optimizing replication of soil respiration measurements to quantify differences among spatially complex landscape units of the CTR. We start with the design of the instrument to ease the technological, ergonomic and financial barriers that technicians encounter in monitoring the efflux of CO2 from the soil. Our sampling strategy optimizes the physical efforts of the field work and manages for the high variation of flux measurements encountered in this difficult environment of rough terrain, dense vegetation and wet climate. Our soil respirometer incorporates an infra-red gas analyzer (LiCor Inc. LI-820) and an 8300 cm3 soil respiration chamber; the device is durable, lightweight, easy to operate and can be built for under $5000 per unit. The modest unit price allows for a multiple-unit fleet to be deployed and operated in an intensive field monitoring campaign. 
We use a large 346 cm2 collar to accommodate as much micro-spatial variation as feasible and to facilitate repeated measures for tracking

  4. Optimal Spatial Design of Capacity and Quantity of Rainwater Harvesting Systems for Urban Flood Mitigation

    Directory of Open Access Journals (Sweden)

    Chien-Lin Huang

    2015-09-01

    Full Text Available This study adopts rainwater harvesting systems (RWHS) into a stormwater runoff management model (SWMM) for the spatial design of capacities and quantities of rain barrels for urban flood mitigation. A simulation-optimization model is proposed for effectively identifying the optimal design. First, we classified the characteristic zonal subregions for spatial design by using fuzzy C-means clustering with the investigated data on urban roofs, land use, and the drainage system. In the simulation method, a series of regular spatial arrangement specifications is designed by using statistical quartile analysis of rooftop area and rainfall frequency analysis; accordingly, the corresponding reduced flooding circumstances can be simulated by SWMM. Moreover, the most effective solution of the simulation method is identified from the calculated net benefit, which is equivalent to the subtraction of the facility cost from the decreased inundation loss. It serves as the initially identified solution for the optimization model. In the optimization method, a backpropagation neural network (BPNN) is first applied to develop a water level simulation model of urban drainage systems to substitute for SWMM, so as to conform to the newly considered interdisciplinary multi-objective optimization model, and a tabu search-based algorithm is used with the embedded BPNN-based SWMM to optimize the planning solution. The developed method is applied to the Zhong-He District, Taiwan. Results demonstrate that the application of tabu search and the BPNN-based simulation model in the optimization model can effectively, accurately, and quickly search for the optimal design considering economic net benefit. Furthermore, the optimized spatial rain barrel design could reduce inundation losses by 72% according to the simulated flood events.
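The fuzzy C-means step used above to delineate characteristic zonal subregions can be sketched with a generic implementation. This is the textbook algorithm in NumPy, not the authors' code, and the two-cluster toy data are synthetic:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Textbook fuzzy C-means: returns cluster centres and the fuzzy
    membership matrix U (n_samples x c, each row sums to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=X.shape[0])     # random initial memberships
    for _ in range(n_iter):
        W = U ** m                                     # fuzzified weights
        centres = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted cluster means
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                       # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)       # updated memberships
    return centres, U

# Two well-separated synthetic "subregions"
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.5, (20, 2)), rng.normal(10.0, 0.5, (20, 2))])
centres, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)   # hard assignment from the fuzzy memberships
```

In the study the features would be the investigated roof, land-use, and drainage attributes rather than 2-D points; the fuzzifier m = 2 is the common default.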

  5. Cost-effective sampling of 137Cs-derived net soil redistribution: part 1 – estimating the spatial mean across scales of variation

    International Nuclear Information System (INIS)

    Li, Y.; Chappell, A.; Nyamdavaa, B.; Yu, H.; Davaasuren, D.; Zoljargal, K.

    2015-01-01

    The 137Cs technique for estimating net time-integrated soil redistribution is valuable for understanding the factors controlling soil redistribution by all processes. The literature on this technique is dominated by studies of individual fields and describes its typically time-consuming nature. We contend that the community making these studies has inappropriately assumed that many 137Cs measurements are required and hence estimates of net soil redistribution can only be made at the field scale. Here, we support future studies of 137Cs-derived net soil redistribution to apply their often limited resources across scales of variation (field, catchment, region etc.) without compromising the quality of the estimates at any scale. We describe a hybrid, design-based and model-based, stratified random sampling design with composites to estimate the sampling variance and a cost model for fieldwork and laboratory measurements. Geostatistical mapping of net (1954–2012) soil redistribution as a case study on the Chinese Loess Plateau is compared with estimates for several other sampling designs popular in the literature. We demonstrate the cost-effectiveness of the hybrid design for spatial estimation of net soil redistribution. To demonstrate the limitations of current sampling approaches to cut across scales of variation, we extrapolate our estimate of net soil redistribution across the region, show that for the same resources, estimates from many fields could have been provided and would elucidate the cause of differences within and between regional estimates. We recommend that future studies evaluate carefully the sampling design to consider the opportunity to investigate 137Cs-derived net soil redistribution across scales of variation. - Highlights: • The 137Cs technique estimates net time-integrated soil redistribution by all processes. • It is time-consuming and dominated by studies of individual fields. • We use limited resources to estimate soil

  6. Conditional estimation of exponential random graph models from snowball sampling designs

    NARCIS (Netherlands)

    Pattison, Philippa E.; Robins, Garry L.; Snijders, Tom A. B.; Wang, Peng

    2013-01-01

    A complete survey of a network in a large population may be prohibitively difficult and costly. So it is important to estimate models for networks using data from various network sampling designs, such as link-tracing designs. We focus here on snowball sampling designs, designs in which the members

  7. Implications of clinical trial design on sample size requirements.

    Science.gov (United States)

    Leon, Andrew C

    2008-07-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two particularly relevant aspects of such a design often receive insufficient attention in an RCT. Multiple outcomes inflate type I error, and an unreliable assessment process introduces bias and reduces statistical power. Here we describe how both unreliability and multiple outcomes can increase the study costs and duration and reduce the feasibility of the study. The objective of this article is to consider strategies that overcome the problems of unreliability and multiplicity.

  8. Chemical and Metallurgy Research (CMR) Sample Tracking System Design Document

    International Nuclear Information System (INIS)

    Bargelski, C. J.; Berrett, D. E.

    1998-01-01

    The purpose of this document is to describe the system architecture of the Chemical and Metallurgy Research (CMR) Sample Tracking System at Los Alamos National Laboratory. During the course of the document observations are made concerning the objectives, constraints and limitations, technical approaches, and the technical deliverables

  9. Effects-Driven Participatory Design: Learning from Sampling Interruptions

    DEFF Research Database (Denmark)

    Brandrup, Morten; Østergaard, Kija Lin; Hertzum, Morten

    2017-01-01

    a sustained focus on pursued effects and uses the experience sampling method (ESM) to collect real-use feedback. To illustrate the use of the method we analyze a case that involves the organizational implementation of electronic whiteboards at a Danish hospital to support the clinicians’ intra...

  10. Interface Design Implications for Recalling the Spatial Configuration of Virtual Auditory Environments

    Science.gov (United States)

    McMullen, Kyla A.

    study also found that the presence of visual reference frames significantly increased recall accuracy. Additionally, the incorporation of drastic attenuation significantly improved environment recall accuracy. Through investigating the aforementioned concerns, the present study made initial footsteps guiding the design of virtual auditory environments that support spatial configuration recall.

  11. Concepts: Integrating population survey data from different spatial scales, sampling methods, and species

    Science.gov (United States)

    Dorazio, Robert; Delampady, Mohan; Dey, Soumen; Gopalaswamy, Arjun M.; Karanth, K. Ullas; Nichols, James D.

    2017-01-01

    Conservationists and managers are continually under pressure from the public, the media, and political policy makers to provide “tiger numbers,” not just for protected reserves, but also for large spatial scales, including landscapes, regions, states, nations, and even globally. Estimating the abundance of tigers within relatively small areas (e.g., protected reserves) is becoming increasingly tractable (see Chaps. 9 and 10), but doing so for larger spatial scales still presents a formidable challenge. Those who seek “tiger numbers” are often not satisfied by estimates of tiger occupancy alone, regardless of the reliability of the estimates (see Chaps. 4 and 5). As a result, wherever tiger conservation efforts are underway, either substantially or nominally, scientists and managers are frequently asked to provide putative large-scale tiger numbers based either on a total count or on an extrapolation of some sort (see Chaps. 1 and 2).

  12. Green Infrastructure Design Based on Spatial Conservation Prioritization and Modeling of Biodiversity Features and Ecosystem Services.

    Science.gov (United States)

    Snäll, Tord; Lehtomäki, Joona; Arponen, Anni; Elith, Jane; Moilanen, Atte

    2016-02-01

    There is high-level political support for the use of green infrastructure (GI) across Europe, to maintain viable populations and to provide ecosystem services (ES). Even though GI is inherently a spatial concept, modern tools for spatial planning have not been widely recognized, for example in the recent European Environment Agency (EEA) report. We outline a toolbox of methods useful for GI design that explicitly accounts for biodiversity and ES. Data on species occurrence, habitats, and environmental variables are increasingly available via open-access internet platforms. Such data can be synthesized by statistical species distribution modeling, producing maps of biodiversity features. These, together with maps of ES, can form the basis for GI design. We argue that spatial conservation prioritization (SCP) methods are effective tools for GI design, as the overall SCP goal is cost-effective allocation of conservation efforts. Corridors are currently promoted by the EEA as the means for implementing GI design, but they typically target the needs of only a subset of the regional species pool. SCP methods would help to ensure that GI provides a balanced solution for the requirements of many biodiversity features (e.g., species, habitat types) and ES simultaneously in a cost-effective manner. Such tools are necessary to make GI into an operational concept for combating biodiversity loss and promoting ES.

  13. Mobile platform sampling for designing environmental sensor networks.

    Science.gov (United States)

    Budi, Setia; de Souza, Paulo; Timms, Greg; Susanto, Ferry; Malhotra, Vishv; Turner, Paul

    2018-02-09

    This paper proposes a method to design the deployment of sensor nodes in a new region where historical data is not available. A number of mobile platforms are simulated to build initial knowledge of the region. Further, an evolutionary algorithm is employed to find the optimum placement of a given number of sensor nodes that best represents the region of interest.

  14. Planning spatial sampling of the soil from an uncertain reconnaissance variogram

    Science.gov (United States)

    Lark, R. Murray; Hamilton, Elliott M.; Kaninga, Belinda; Maseka, Kakoma K.; Mutondo, Moola; Sakala, Godfrey M.; Watts, Michael J.

    2017-12-01

    An estimated variogram of a soil property can be used to support a rational choice of sampling intensity for geostatistical mapping. However, it is known that estimated variograms are subject to uncertainty. In this paper we address two practical questions. First, how can we make a robust decision on sampling intensity, given the uncertainty in the variogram? Second, what are the costs incurred in terms of oversampling because of uncertainty in the variogram model used to plan sampling? To achieve this we show how samples of the posterior distribution of variogram parameters, from a computational Bayesian analysis, can be used to characterize the effects of variogram parameter uncertainty on sampling decisions. We show how one can select a sample intensity so that a target value of the kriging variance is not exceeded with some specified probability. This will lead to oversampling, relative to the sampling intensity that would be specified if there were no uncertainty in the variogram parameters. One can estimate the magnitude of this oversampling by treating the tolerable grid spacing for the final sample as a random variable, given the target kriging variance and the posterior sample values. We illustrate these concepts with some data on total uranium content in a relatively sparse sample of soil from agricultural land near mine tailings in the Copperbelt Province of Zambia.
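The core idea of this record, choosing a grid spacing so that a target kriging variance is not exceeded with a specified probability over posterior variogram samples, can be sketched as follows. The sketch assumes an exponential variogram and evaluates the ordinary-kriging variance at the centre of a square grid cell from its four surrounding nodes; the posterior draws and target values are invented for illustration:

```python
import numpy as np

def gamma_exp(d, nugget, psill, vrange):
    """Exponential semivariogram (zero at lag zero, nugget just above it)."""
    return np.where(d > 0, nugget + psill * (1.0 - np.exp(-d / vrange)), 0.0)

def ok_variance_at_centre(h, nugget, psill, vrange):
    """Ordinary-kriging variance at the centre of a square grid cell of
    side h, predicting from the 4 surrounding grid nodes."""
    pts = np.array([[0, 0], [h, 0], [0, h], [h, h]], dtype=float)
    target = np.array([h / 2.0, h / 2.0])
    n = len(pts)
    A = np.ones((n + 1, n + 1))                  # OK system with Lagrange row/col
    A[-1, -1] = 0.0
    D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    A[:n, :n] = gamma_exp(D, nugget, psill, vrange)
    b = np.ones(n + 1)
    b[:n] = gamma_exp(np.linalg.norm(pts - target, axis=1), nugget, psill, vrange)
    sol = np.linalg.solve(A, b)                  # weights and Lagrange multiplier
    return float(sol[:n] @ b[:n] + sol[-1])      # kriging variance

def max_spacing(posterior, target_var, prob=0.9, spacings=np.arange(10, 500, 10)):
    """Largest candidate spacing whose kriging variance stays below
    target_var with probability >= prob over the posterior draws.
    (Assumes the smallest candidate spacing meets the target.)"""
    best = spacings[0]
    for h in spacings:
        ok = np.mean([ok_variance_at_centre(h, *p) <= target_var for p in posterior])
        if ok >= prob:
            best = h
        else:
            break
    return best

# Invented posterior draws of (nugget, partial sill, range in metres)
posterior = [(0.1, 1.0, 100.0), (0.2, 0.8, 80.0), (0.05, 1.2, 120.0)]
h_star = max_spacing(posterior, target_var=0.5)
print(h_star)
```

Because the kriging variance grows with spacing, accounting for parameter uncertainty pushes the chosen spacing below what any single "best" variogram would suggest; the gap between the two spacings is the oversampling cost the paper quantifies.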

  15. Determination of the complex refractive index segments of turbid sample with multispectral spatially modulated structured light and models approximation

    Science.gov (United States)

    Meitav, Omri; Shaul, Oren; Abookasis, David

    2017-09-01

    Spectral data enabling the derivation of a biological tissue sample's complex refractive index (CRI) can provide a range of valuable information in the clinical and research contexts. Specifically, changes in the CRI reflect alterations in tissue morphology and chemical composition, enabling its use as an optical marker during diagnosis and treatment. In the present work, we report a method for estimating the real and imaginary parts of the CRI of a biological sample using Kramers-Kronig (KK) relations in the spatial frequency domain. In this method, phase-shifted sinusoidal patterns at a single high spatial frequency are serially projected onto the sample surface at different near-infrared wavelengths while a camera mounted normal to the sample surface acquires the reflected diffuse light. In the offline analysis pipeline, recorded images at each wavelength are converted to spatial phase maps using KK analysis and are then calibrated against phase models derived from the diffusion approximation. The amplitude of the reflected light, together with the phase data, is then introduced into the Fresnel equations to resolve both the real and imaginary parts of the CRI at each wavelength. The technique was validated in tissue-mimicking phantoms with known optical parameters and in mouse models of ischemic injury and heat stress. The experimental data obtained indicate variations in the CRI in brain tissue suffering from injury. CRI fluctuations correlated with alterations in the scattering and absorption coefficients of the injured tissue are demonstrated. This technique for deriving dynamic changes in the CRI of tissue may be further developed as a clinical diagnostic tool and for biomedical research applications. To the best of our knowledge, this is the first report of the estimation of the spectral CRI of a mouse head following injury obtained in the spatial frequency domain.

  16. The effects of computer-aided design software on engineering students' spatial visualisation skills

    Science.gov (United States)

    Kösa, Temel; Karakuş, Fatih

    2018-03-01

    The purpose of this study was to determine the influence of computer-aided design (CAD) software-based instruction on the spatial visualisation skills of freshman engineering students in a computer-aided engineering drawing course. A quasi-experimental design was applied, using the Purdue Spatial Visualization Test-Visualization of Rotations (PSVT:R) for both the pre- and the post-test. The participants were 116 freshman students in the first year of their undergraduate programme in the Department of Mechanical Engineering at a university in Turkey. A total of 72 students comprised the experimental group; they were instructed with CAD-based activities in an engineering drawing course. The control group consisted of 44 students who did not attend this course. The results of the study showed that a CAD-based engineering drawing course had a positive effect on developing engineering students' spatial visualisation skills. Additionally, the results of the study showed that spatial visualisation skills can be a predictor for success in a computer-aided engineering drawing course.

  17. Spatial and temporal variations in cadmium concentrations and burdens in the Pacific oyster (Crassostrea gigas) sampled from the Pacific north-west.

    Science.gov (United States)

    Bendell, Leah I; Feng, Cindy

    2009-08-01

    Oysters from the north-west coast of Canada contain high levels of cadmium, a toxic metal, in amounts that exceed food safety guidelines for international markets. A first required step in determining the sources of cadmium is to identify possible spatial and temporal trends in the accumulation of cadmium by the oyster. To meet this objective, rather than sample wild and cultured oysters of unknown age and origin, an oyster "grow-out" experiment was initiated. Cultured oyster seed was suspended in the water column to a depth of up to 7 m, and the oyster seed was allowed to mature for a period of 3 years until market size. Oysters were sampled bimonthly, and at the time of sampling, temperature, chlorophyll-a, turbidity and salinity were measured. Oyster total shell length, dry tissue weights, cadmium concentrations (µg g-1) and burdens (µg of cadmium oyster-1) were determined. Oyster cadmium concentrations and burdens were then interpreted with respect to the spatial and temporal sampling design as well as to the measured physico-chemical and biotic variables. When expressed as a concentration, there was a marked seasonality, with concentrations being greater in winter than in summer; however, no spatial trend was evident. When expressed as a burden, which corrects for differences in tissue mass, there was no seasonality; however, oyster cadmium burdens increased from south to north. Comparison of cadmium accumulation rates per oyster among sites indicated three locations, Webster Island, on the west side of Vancouver Island, and two within Desolation Sound, Teakerne Arm and Redonda Bay, where point sources of cadmium, not present at the other sampling locations, may be contributing to overall oyster cadmium burdens. Of the four physico-chemical factors measured, only temperature and turbidity correlated weakly with tissue cadmium concentrations (r2 = -0.13; p < 0.05). 
By expressing oyster cadmium both as concentration and burden, regional and temporal patterns were
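The distinction this record draws between concentration and burden is simple arithmetic: burden (µg of Cd per oyster) is concentration (µg per g dry tissue) multiplied by dry tissue mass (g). A hypothetical example shows how a seasonal signal in concentration can disappear in burden when tissue mass moves in the opposite direction (all numbers invented):

```python
# Hypothetical oysters: concentration halves in summer while dry tissue
# mass doubles, so the per-oyster burden is unchanged (invented numbers).
winter = {"conc_ug_per_g": 4.0, "dry_wt_g": 1.0}
summer = {"conc_ug_per_g": 2.0, "dry_wt_g": 2.0}

def burden(oyster):
    """Cadmium burden (ug per oyster) = concentration (ug/g) x dry mass (g)."""
    return oyster["conc_ug_per_g"] * oyster["dry_wt_g"]

print(burden(winter), burden(summer))  # → 4.0 4.0
```

This is why the study reports both quantities: seasonality in concentration can be a dilution effect of tissue growth rather than a change in cadmium uptake.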

  18. Design of a groundwater sampling network for Minnesota

    International Nuclear Information System (INIS)

    Kanivetsky, R.

    1977-01-01

    This folio was compiled to facilitate the use of groundwater as a sampling medium to aid in exploration for hitherto undiscovered deposits of uranium in the subsurface rocks of Minnesota. The report consists of the following sheets of the hydrogeologic map of Minnesota: (1) map of bedrock hydrogeology, (2) generalized cross sections of the hydrogeologic map of Minnesota, showing both Quaternary deposits and bedrock, (3) map of waterwells that penetrate Precambrian rocks in Minnesota. A list of these wells, showing locations, names of owners, type of Precambrian aquifers penetrated, lithologic material of the aquifers, and well depths is provided in the appendix to this report. Structural settings, locations, and composition of the bedrock aquifers, movement of groundwater, and preliminary suggestions for a sampling program are discussed below under the heading Bedrock Hydrogeology of Minnesota. The map sheet showing Quaternary hydrogeology is not included in this report because the chemistry of groundwater in these deposits is not directly related to bedrock mineralization

  19. Measuring Radionuclides in the environment: radiological quantities and sampling designs

    International Nuclear Information System (INIS)

    Voigt, G.

    1998-10-01

    One aim of the workshop was to support and provide an ICRU (International Commission on Radiation Units and Measurements) report committee with up-to-date information on techniques, data and knowledge of modern radioecology when radionuclides are to be measured in the environment. It has been increasingly recognised that some studies in radioecology, especially those involving both field sampling and laboratory measurements, have not paid adequate attention to the problem of obtaining representative, unbiased samples. This can greatly affect the quality of scientific interpretation and the ability to manage the environment. Further, as the discipline of radioecology has developed, it has seen a growth in the number of quantities and units used, some of which are ill-defined and non-standardised. (orig.)

  20. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds

    Science.gov (United States)

    Zhang, Zhenming; Zhou, Yunchao; Wang, Shijie

    2018-01-01

    Karst areas are typical ecologically fragile areas, and stony desertification has become one of the most serious ecological and economic problems in these areas worldwide, as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km2, 4.50 km2, and 1.87 km2, respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content. PMID:29652811

  1. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds

    Directory of Open Access Journals (Sweden)

    Zhenming Zhang

    2018-04-01

    Full Text Available Karst areas are typical ecologically fragile areas, and stony desertification has become one of the most serious ecological and economic problems in these areas worldwide, as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km2, 4.50 km2, and 1.87 km2, respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content.
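The scale comparison described in this record, subsampling a fine grid at coarser spacings and tracking the mean and coefficient of variability, can be sketched as follows. The rock-exposure surface here is synthetic white noise standing in for the Houzhai River Basin data, so it only illustrates the mechanics (subsample means fluctuating around the full-grid mean), not the study's systematic trend in the coefficient of variability:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical 100 x 100 grid of rock exposure (%) on a fine lattice
fine = np.clip(rng.normal(30.0, 10.0, size=(100, 100)), 0.0, 100.0)

def subsample_stats(grid, step):
    """Mean and coefficient of variation when keeping every `step`-th
    grid node in each direction (i.e. a coarser sampling scale)."""
    sub = grid[::step, ::step]
    return sub.mean(), sub.std() / sub.mean()

for step in (1, 2, 3, 4, 5, 6):
    m, cv = subsample_stats(fine, step)
    print(f"spacing x{step}: mean={m:.1f} CV={cv:.3f}")
```

On a spatially structured field such as real rock exposure, the statistics would change systematically with spacing, as the record reports; on this unstructured surface they only show sampling fluctuation.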

  2. Use of spatially distributed time-integrated sediment sampling networks and distributed fine sediment modelling to inform catchment management.

    Science.gov (United States)

    Perks, M T; Warburton, J; Bracken, L J; Reaney, S M; Emery, S B; Hirst, S

    2017-11-01

Under the EU Water Framework Directive, suspended sediment is omitted from environmental quality standards and compliance targets. This omission is partly explained by difficulties in assessing the complex dose-response of ecological communities, but it is equally hindered by a lack of spatially distributed estimates of suspended sediment variability across catchments. In this paper, we demonstrate that traditional, discrete sampling campaigns are unable to assess exposure to fine sediment. Sampling frequencies based on Environmental Quality Standard protocols, whilst reflecting typical manual sampling constraints, are unable to determine the magnitude of sediment exposure with an acceptable level of precision: deviations from actual concentrations range between -35 and +20% based on the interquartile range of simulations. As an alternative, we assess the value of low-cost, suspended sediment sampling networks for quantifying suspended sediment transfer (SST). In this study of the 362 km² upland Esk catchment, we observe that spatial patterns of sediment flux are consistent over the two-year monitoring period across a network of 17 monitoring sites. This enables the key contributing sub-catchments of Butter Beck (SST: 1141 t km⁻² yr⁻¹) and Glaisdale Beck (SST: 841 t km⁻² yr⁻¹) to be identified. The time-integrated samplers offer a feasible alternative to traditional infrequent and discrete sampling approaches for assessing spatio-temporal changes in contamination. In conjunction with a spatially distributed diffuse pollution model (SCIMAP), time-integrated sediment sampling is an effective means of identifying critical sediment source areas in the catchment, which can better inform sediment management strategies for pollution prevention and control. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Spatial Distribution of Stony Desertification and Key Influencing Factors on Different Sampling Scales in Small Karst Watersheds.

    Science.gov (United States)

    Zhang, Zhenming; Zhou, Yunchao; Wang, Shijie; Huang, Xianfei

    2018-04-13

Karst areas are typical ecologically fragile areas, and stony desertification has become one of the most serious ecological and economic problems in these areas worldwide, as well as a source of disasters and poverty. A reasonable sampling scale is of great importance for research on soil science in karst areas. In this paper, the spatial distribution of stony desertification characteristics and its influencing factors in karst areas are studied at different sampling scales using a grid sampling method based on geographic information system (GIS) technology and geo-statistics. The rock exposure obtained through sampling over a 150 m × 150 m grid in the Houzhai River Basin was utilized as the original data, and five grid scales (300 m × 300 m, 450 m × 450 m, 600 m × 600 m, 750 m × 750 m, and 900 m × 900 m) were used as the subsample sets. The results show that the rock exposure does not vary substantially from one sampling scale to another, while the average values of the five subsamples all fluctuate around the average value of the entire set. As the sampling scale increases, the maximum value and the average value of the rock exposure gradually decrease, and there is a gradual increase in the coefficient of variability. At the scale of 150 m × 150 m, the areas of minor stony desertification, medium stony desertification, and major stony desertification in the Houzhai River Basin are 7.81 km², 4.50 km², and 1.87 km², respectively. The spatial variability of stony desertification at small scales is influenced by many factors, and the variability at medium scales is jointly influenced by gradient, rock content, and rock exposure. At large scales, the spatial variability of stony desertification is mainly influenced by soil thickness and rock content.

  4. Design of the CERN MEDICIS Collection and Sample Extraction System

    CERN Document Server

    Brown, Alexander

MEDICIS is a new facility at CERN ISOLDE that aims to produce radio-isotopes for medical research. Possible designs for the collection and transport system for the radio-isotopes were investigated. A system using readily available equipment was devised with the aim of keeping costs to a minimum whilst maintaining the highest safety standards. FLUKA, a Monte Carlo radiation transport code, was used to simulate the radiation from the isotopes to be collected. By simulating the collection of all isotopes of interest to CERN’s MEDICIS facility, ⁴⁴Sc was found to give the largest dose. The simulations helped guide the amount of shielding used in the final design. Swiss regulations stipulating the allowed activity levels of individual isotopes were also considered within the body of the work.

  5. Design of a distributed system of exploration and spatial communication - PiBots

    International Nuclear Information System (INIS)

    Loria Gamboa, Monica; Castro Leiva, Luis; Garro Ruiz, Ricardo

    2012-01-01

The design and implementation of a prototype robot, PiBots, has been proposed in Costa Rica for exploration in an environment other than the terrestrial one. The PiBots design was developed considering the available resources and current technology for an initial exploratory prototype. A real model and an ideal model of a PiBot were prepared. The real prototype design allowed the testing of theoretical concepts and the collection of recommendations for the development of robots with parts and materials appropriate to a space mission. The ideal design resulted from the elaboration of a program that allowed the selection of components to assist in the design stage, with special emphasis on the size, autonomy and power of the PiBots, with a functional and energy-efficient power system [es

  6. Spatial statistics of hydrography and water chemistry in a eutrophic boreal lake based on sounding and water samples.

    Science.gov (United States)

    Leppäranta, Matti; Lewis, John E; Heini, Anniina; Arvola, Lauri

    2018-06-04

Spatial variability, an essential characteristic of lake ecosystems, has often been neglected in field research and monitoring. In this study, we apply spatial statistical methods to the key physics and chemistry variables and chlorophyll a over eight sampling dates in two consecutive years in a large (area 103 km²) eutrophic boreal lake in southern Finland. On the four summer sampling dates, the water body was vertically and horizontally heterogeneous except for color and DOC; on the two winter ice-covered dates, dissolved oxygen (DO) was vertically stratified, while on the two autumn dates, no significant spatial differences in any of the measured variables were found. Chlorophyll a concentration was one order of magnitude lower under the ice cover than in open water. The Moran statistic for spatial correlation was significant for chlorophyll a and NO₂+NO₃-N in all summer situations and for dissolved oxygen and pH in three cases. In summer, the mass centers of the chemicals were within 1.5 km of the geometric center of the lake, and the second-moment radius ranged from 3.7 to 4.1 km, compared with 3.9 km for the homogeneous situation. The lateral length scales of the studied variables were 1.5-2.5 km, about 1 km longer in the surface layer. The detected spatial "noise" strongly suggests that besides vertical variation, horizontal variation should also be considered when ecosystems of eutrophic lakes in particular are monitored.
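The Moran statistic referred to above is the standard global Moran's I. A minimal sketch with binary distance-band weights follows; the cutoff choice and synthetic values are illustrative assumptions, not the lake survey's actual weighting scheme or data.

```python
import numpy as np

def morans_i(values, coords, cutoff):
    """Global Moran's I with binary distance-band weights:
    w_ij = 1 when 0 < d_ij <= cutoff, else 0."""
    x = np.asarray(values, dtype=float)
    c = np.asarray(coords, dtype=float)
    n = x.size
    d = np.linalg.norm(c[:, None, :] - c[None, :, :], axis=-1)
    w = ((d > 0) & (d <= cutoff)).astype(float)
    z = x - x.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Spatially clustered values (low west, high east) give a clearly positive I;
# spatially random values hover near the null expectation -1/(n-1).
grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
clustered = np.where(grid[:, 1] >= 2, 10.0, 0.0)
print(morans_i(clustered, grid, cutoff=1.0))
```

A significant positive I, as reported here for chlorophyll a and the nitrogen species in summer, indicates that nearby stations tend to carry similar values.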

  7. Monitoring well design and sampling techniques at NAPL sites

    International Nuclear Information System (INIS)

    Collins, M.; Rohrman, W.R.; Drake, K.D.

    1992-01-01

    The existence of Non-Aqueous Phase Liquids (NAPLs) at many Superfund and RCRA hazardous waste sites has become a recognized problem in recent years. The large number of sites exhibiting this problem results from the fact that many of the most frequently used industrial solvents and petroleum products can exist as NAPLs. Hazardous waste constituents occurring as NAPLs possess a common characteristic that causes great concern during groundwater contamination evaluation: while solubility in water is generally very low, it is sufficient to cause groundwater to exceed Maximum Contamination Levels (MCLs). Thus, even a small quantity of NAPL within a groundwater regime can act as a point source with the ability to contaminate vast quantities of groundwater over time. This property makes it imperative that groundwater investigations focus heavily on characterizing the nature, extent, and migration pathways of NAPLs at sites where it exists. Two types of NAPLs may exist in a groundwater system. Water-immiscible liquid constituents having a specific gravity greater than one are termed Dense Non-Aqueous Phase Liquids, while those with a specific gravity less than one are considered Light Non-Aqueous Phase Liquids. For a groundwater investigation to properly characterize the two types of NAPLs, careful consideration must be given to the placement and sampling of groundwater monitoring wells. Unfortunately, technical reviewers at EPA Region VII and the Corps of Engineers find that many groundwater investigations fall short in characterizing NAPLs because several basic considerations were overlooked. Included among these are monitoring well location and screen placement with respect to the water table and significant confining units, and the ability of the well sampling method to obtain samples of NAPL. Depending on the specific gravity of the NAPL that occurs at a site, various considerations can substantially enhance adequate characterization of NAPL contaminants

  8. Estimating the spatial distribution of a plant disease epidemic from a sample

    Science.gov (United States)

    Sampling is of central importance in plant pathology. It facilitates our understanding of how epidemics develop in space and time and can also be used to inform disease management decisions. Making inferences from a sample is necessary because we rarely have the resources to conduct a complete censu...

  9. A stochastic optimisation method to estimate the spatial distribution of a pathogen from a sample.

    Science.gov (United States)

    Sampling is of central importance in plant pathology. It facilitates our understanding of how epidemics develop in space and time and can also be used to inform disease management decisions. Making inferences from a sample is necessary because we rarely have the resources to conduct a complete censu...

  10. Multicounter neutron detector for examination of content and spatial distribution of fissile materials in bulk samples

    International Nuclear Information System (INIS)

    Swiderska-Kowalczyk, M.; Starosta, W.; Zoltowski, T.

    1999-01-01

A new neutron coincidence well-counter is presented. This experimental device can be applied for passive assay of fissile materials, in particular plutonium-bearing materials. It consists of a set of ³He tubes placed inside a polyethylene moderator. Outputs from the tubes, first processed by preamplifier/amplifier/discriminator circuits, are then analysed using a correlator connected to a PC, with correlation techniques implemented in software. Such a neutron counter enables determination of the ²⁴⁰Pu effective mass in samples of small Pu content (i.e., where multiplication effects can be neglected) having a fairly big volume (up to 0.17 m³), provided the isotopic composition is known. For determination of the neutron source distribution inside a sample, a heuristic method based on hierarchical cluster analysis was applied. As input parameters, the amplitudes and phases of the two-dimensional Fourier transforms of the count-profile matrices were taken, both for known point-source distributions and for the examined samples. The count-profile matrices are collected by scanning the sample with the detection head. In the clustering process, count profiles of unknown samples are fitted into dendrograms employing the 'proximity' criterion between the examined sample profile and the standard sample profiles. The distribution of neutron sources in the examined sample is then evaluated on the basis of a comparison with standard source distributions. (author)
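As a rough illustration of the heuristic described — characterizing count-profile matrices by low-order 2-D Fourier terms and grouping them by hierarchical clustering — the following sketch uses SciPy's clustering routines. The point-source profiles, grid size, and noise model are made up for the demonstration, not the instrument's data.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def fft_features(profile):
    """Low-order terms of the 2-D Fourier transform of a count-profile
    matrix; real and imaginary parts carry the amplitude/phase content."""
    spec = np.fft.fft2(profile)[:3, :3]
    return np.concatenate([spec.real.ravel(), spec.imag.ravel()])

def cluster_profiles(profiles, n_clusters):
    """Group profiles by hierarchical (average-linkage) clustering."""
    feats = np.array([fft_features(p) for p in profiles])
    tree = linkage(feats, method="average")
    return fcluster(tree, t=n_clusters, criterion="maxclust")

rng = np.random.default_rng(3)

def spike(r, c):
    """A noisy count profile from a point-like source at cell (r, c)."""
    m = rng.normal(0.0, 1.0, size=(8, 8))
    m[r, c] += 100.0
    return m

labels = cluster_profiles([spike(1, 1), spike(1, 1),
                           spike(6, 6), spike(6, 6)], n_clusters=2)
```

Profiles from the same source location land in the same branch of the dendrogram, which is the "proximity" idea the abstract leans on.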

  11. Study the effects of varying interference upon the optical properties of turbid samples using NIR spatial light modulation

    Science.gov (United States)

    Shaul, Oren; Fanrazi-Kahana, Michal; Meitav, Omri; Pinhasi, Gad A.; Abookasis, David

    2018-03-01

Optical properties of biological tissues are valuable diagnostic parameters which can provide necessary information regarding tissue state during disease pathogenesis and therapy. However, different sources of interference, such as temperature changes, may modify these properties, introducing confounding factors and artifacts into data, consequently skewing their interpretation and misinforming clinical decision-making. In the current study, we apply spatial light modulation, a type of diffuse reflectance hyperspectral imaging technique, to monitor the variation in optical properties of highly scattering turbid media in the presence of varying levels of the following sources of interference: scattering concentration, temperature, and pressure. Spatial near-infrared (NIR) light modulation is an emerging wide-field, non-contact optical imaging platform capable of separating the effects of tissue scattering from those of absorption, thereby accurately estimating both parameters. With this technique, periodic NIR illumination patterns at alternately low and high spatial frequencies, at six discrete wavelengths between 690 and 970 nm, were sequentially projected upon the medium while a CCD camera collected the diffusely reflected light. Model-based data analysis is then performed off-line to recover the medium's optical properties. We conducted a series of experiments demonstrating the changes in the absorption and reduced scattering coefficients of commercially available fresh milk and chicken breast tissue under different interference conditions. In addition, the refractive index was studied under increased pressure. This work demonstrates the utility of NIR spatial light modulation in detecting the effects of varying sources of interference upon the optical properties of biological samples.
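The abstract does not spell out the demodulation step, but spatial light modulation (spatial frequency domain imaging) commonly projects three sinusoidal patterns shifted by 120° and recovers the modulated (AC) and planar (DC) reflectance amplitudes with the standard three-phase formula. The sketch below assumes that scheme; the pattern frequency and amplitudes are synthetic.

```python
import numpy as np

def demodulate(i1, i2, i3):
    """Standard three-phase demodulation: recover the AC and DC
    reflectance amplitudes from three 120-degree phase-shifted images."""
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)
    dc = (i1 + i2 + i3) / 3.0
    return ac, dc

# Synthetic check: a 1-D pattern with known AC/DC amplitudes.
x = np.linspace(0.0, 1.0, 256)
k = 2 * np.pi * 10.0  # spatial frequency of the projected pattern
frames = [0.5 + 0.2 * np.cos(k * x + a)
          for a in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]
ac, dc = demodulate(*frames)
```

Because the AC amplitude decays with spatial frequency according to the medium's scattering and absorption, fitting AC/DC pairs at low and high frequencies against a light-transport model is what lets the technique separate the two coefficients.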

  12. Reachable Distance Space: Efficient Sampling-Based Planning for Spatially Constrained Systems

    KAUST Repository

    Xinyu Tang,; Thomas, S.; Coleman, P.; Amato, N. M.

    2010-01-01

    reachable distance space (RD-space), in which all configurations lie in the set of constraint-satisfying subspaces. This enables us to directly sample the constrained subspaces with complexity linear in the number of the robot's degrees of freedom

  13. Fiber Bragg grating based spatially resolved characterization of flux-pinning induced strain of rectangular-shaped bulk YBCO samples

    International Nuclear Information System (INIS)

    Latka, Ines; Habisreuther, Tobias; Litzkendorf, Doris

    2011-01-01

Highlights: → Fiber Bragg gratings (FBG) act as strain sensors, also at cryogenic temperatures. → FBGs are not sensitive to magnetic fields. → Local, shape-dependent magnetostriction was detected on rectangular samples. → Magnetostrictive effects on the top surface and in a gap between two samples are different. - Abstract: We report on measurements of the spatially resolved characterization of flux-pinning-induced strain of rectangular-shaped bulk YBCO samples. The spatially resolved strain measurements are accomplished by the use of two fiber Bragg grating arrays, which are fixed to the surface with an included angle of 45°. In this paper, first attempts to confirm the shape distortions caused by the flux-pinning-induced strain, as predicted in earlier work, will be presented. Two sample setups, a single bulk and a 'mirror' arrangement, will be compared. This mirror setup represents a model configuration for a measurement inside the superconductor, where demagnetization effects can be neglected and the magnetic field merely has a z-component.

  14. Optimal Spatial Design of Capacity and Quantity of Rainwater Catchment Systems for Urban Flood Mitigation

    Science.gov (United States)

    Huang, C.; Hsu, N.

    2013-12-01

This study imports Low-Impact Development (LID) technology for rainwater catchment systems into the Storm Water Management Model (SWMM) to design the spatial capacity and quantity of rain barrels for urban flood mitigation, and proposes a simulation-optimization model for effectively searching for the optimal design. In the simulation method, we design a series of regular spatial distributions of the capacity and quantity of rainwater catchment facilities, so that the flood mitigation achieved by a variety of design forms can be simulated with SWMM. We then calculate the net benefit, equal to the reduction in inundation loss minus the facility cost, and the best solution of the simulation method serves as the initial solution of the optimization model. In the optimization method, we first apply the outcome of the simulation method and a Back-Propagation Neural Network (BPNN) to develop a water-level simulation model of the urban drainage system, replacing SWMM, whose graphical-user-interface operation is hard to couple with an optimization model and method. We then embed the BPNN-based simulation model into the developed optimization model, whose objective function minimizes the negative net benefit. Finally, we establish a tabu-search-based algorithm to optimize the planning solution. This study applies the developed method in Zhonghe Dist., Taiwan. Results showed that applying tabu search and the BPNN-based simulation model within the optimization model not only finds solutions 12.75% better than the simulation method but also resolves the limitations of previous studies. Furthermore, the optimized spatial rain barrel design can reduce inundation loss by 72% in historical flood events.
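The paper's tabu search runs against a BPNN surrogate of SWMM; as a toy stand-in for that loop, the sketch below applies a generic tabu search to binary rain-barrel placement decisions. The benefits, costs, budget, and neighborhood (single-site flips) are all hypothetical, not the study's actual model.

```python
import random

def tabu_search(benefit, cost, budget, iters=200, tenure=5, seed=1):
    """Minimal tabu search over binary site-selection vectors: flip one
    site per move, forbid re-flipping recently used sites for `tenure`
    iterations, and keep the best feasible solution seen."""
    rng = random.Random(seed)
    n = len(benefit)

    def net(sol):
        c = sum(cost[i] for i in range(n) if sol[i])
        if c > budget:
            return float("-inf")  # infeasible: over budget
        return sum(benefit[i] for i in range(n) if sol[i]) - c

    current = [False] * n
    best, best_val = current[:], net(current)
    tabu = {}
    for t in range(iters):
        moves = [i for i in range(n) if tabu.get(i, -1) < t]
        if not moves:
            continue
        # score every admissible single-flip neighbour, take the best
        scored = []
        for i in moves:
            cand = current[:]
            cand[i] = not cand[i]
            scored.append((net(cand), rng.random(), i, cand))
        val, _, i, cand = max(scored)
        current = cand
        tabu[i] = t + tenure  # forbid flipping site i again for a while
        if val > best_val:
            best, best_val = cand, val
    return best, best_val

best, val = tabu_search(benefit=[10, 4, 6, 1], cost=[5, 1, 2, 3], budget=7)
```

The tabu list is what lets the search escape local optima: it accepts non-improving moves while remembering the best feasible design found so far.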

  15. Spatial delayed nonmatching-to-sample performances in long-living Ames dwarf mice.

    Science.gov (United States)

    Derenne, Adam; Brown-Borg, Holly M; Martner, Sarah; Wolff, Wendy; Frerking, Morgan

    2014-01-17

Ames dwarf mice have an extended lifespan by comparison with normal mice. Behavioral testing has revealed that Ames dwarf mice sometimes evince superior performances relative to normal mice, but in other cases they do not. In this experiment, Ames dwarf and normal mice were compared on a T-maze test and on a delayed nonmatching-to-sample variant of a T-maze test. On the simple T-maze, Ames dwarf and normal mice committed comparable numbers of errors. On the nonmatching-to-sample task, normal mice mastered the discrimination by the end of the experiment while Ames dwarf mice did not. The apparatus, distances traveled, and session duration were equivalent between the two tasks. The poorer performance of Ames dwarf mice on the nonmatching-to-sample task suggests that they may not be as capable as normal mice of learning relatively cognitively complex tasks. © 2013.

  16. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
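A computer-simulated comparison of the kind described — simple random versus seasonally stratified sampling of daily impingement counts — can be sketched as follows. The synthetic counts, seasonal shape, and sample sizes are illustrative stand-ins, not the Indian Point data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily impingement counts with a strong seasonal signal.
days = np.arange(365)
true_counts = rng.poisson(200 + 150 * np.sin(2 * np.pi * days / 365))
annual_total = true_counts.sum()

def simple_random(counts, n):
    """Estimate the annual total from a simple random sample of n days."""
    sample = rng.choice(counts, size=n, replace=False)
    return sample.mean() * counts.size

def seasonally_stratified(counts, n_per_stratum, n_strata=4):
    """Sample each seasonal stratum separately and sum the expansions."""
    estimate = 0.0
    for stratum in np.array_split(counts, n_strata):
        sample = rng.choice(stratum, size=n_per_stratum, replace=False)
        estimate += sample.mean() * stratum.size
    return estimate

# Compare sampling error over many simulated survey years (~110 days each).
srs = [simple_random(true_counts, 110) for _ in range(2000)]
strat = [seasonally_stratified(true_counts, 110 // 4) for _ in range(2000)]
print(np.std(srs), np.std(strat))  # stratified spread should be clearly smaller
```

Because most of the variance in a seasonal series lies between strata rather than within them, the stratified estimator achieves the precision gain the study reports for roughly the same number of sampled days.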

  17. Design development of robotic system for on line sampling in fuel reprocessing

    International Nuclear Information System (INIS)

    Balasubramanian, G.R.; Venugopal, P.R.; Padmashali, G.K.

    1990-01-01

This presentation describes the design and development work being carried out on an automated sampling system for fast reactor fuel reprocessing plants. The plant proposes to use an integrated sampling system. The sample is taken across regular process streams from any intermediate hold-up pot. A robot system is planned to take the sample from the sample pot, transfer it to the sample bottle, cap the bottle, and transfer the bottle to a pneumatic conveying station. The system covers a large number of sample pots. Alternate automated systems are also examined (1). (author). 4 refs., 2 figs

  18. Sampling designs and methods for estimating fish-impingement losses at cooling-water intakes

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-01-01

Several systems for estimating fish impingement at power plant cooling-water intakes are compared to determine the most statistically efficient sampling designs and methods. Compared to a simple random sampling scheme, the stratified systematic random sampling scheme, the systematic random sampling scheme, and the stratified random sampling scheme yield higher efficiencies and better estimators for the parameters in two models of fish impingement as a time-series process. Mathematical results and illustrative examples of the applications of the sampling schemes to simulated and real data are given. Some sampling designs applicable to fish-impingement studies are presented in appendixes

  19. Estimating HIES Data through Ratio and Regression Methods for Different Sampling Designs

    Directory of Open Access Journals (Sweden)

    Faqir Muhammad

    2007-01-01

Full Text Available In this study, a comparison has been made of different sampling designs, using the HIES data of North West Frontier Province (NWFP) for 2001-02 and 1998-99 collected from the Federal Bureau of Statistics, Statistical Division, Government of Pakistan, Islamabad. The performance of the estimators has also been considered using the bootstrap and jackknife. A two-stage stratified random sample design is adopted by HIES: in the first stage, enumeration blocks and villages are treated as the first-stage Primary Sampling Units (PSUs) and are selected with probability proportional to size; Secondary Sampling Units (SSUs), i.e., households, are selected by systematic sampling with a random start. HIES used a single study variable. We have compared the HIES technique with some other designs: stratified simple random sampling, stratified systematic sampling, stratified ranked set sampling, and stratified two-phase sampling. Ratio and regression methods were applied with two study variables: income (y) and household size (x). Jackknife and bootstrap were used for variance estimation. Simple random sampling with sample sizes of 462 to 561 gave moderate variances by both jackknife and bootstrap. Applying systematic sampling, we obtained moderate variance with a sample size of 467. With the jackknife under systematic sampling, the variance of the regression estimator was greater than that of the ratio estimator for sample sizes of 467 to 631; at a sample size of 952, the variance of the ratio estimator became greater than that of the regression estimator. The most efficient design turns out to be ranked set sampling: with jackknife and bootstrap, it gives the minimum variance even at the smallest sample size (467). Two-phase sampling gave poor performance. The multi-stage sampling applied by HIES gave large variances, especially when used with a single study variable.
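The two estimator families compared in this study have simple closed forms for the population mean: the ratio estimator scales the sample mean of y by the known-to-sample ratio of x means, while the regression estimator adjusts it by a fitted slope. The household-size and income numbers below are toy values, not the HIES data.

```python
import numpy as np

def ratio_estimate(y, x, x_pop_mean):
    """Ratio estimator of the population mean of y, using an auxiliary
    variable x whose population mean is known."""
    return y.mean() * x_pop_mean / x.mean()

def regression_estimate(y, x, x_pop_mean):
    """Linear regression estimator of the population mean of y."""
    b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
    return y.mean() + b * (x_pop_mean - x.mean())

# Toy example: household size x, income y roughly proportional to x.
x_sample = np.array([2.0, 3.0, 5.0, 6.0])
y_sample = np.array([210.0, 290.0, 510.0, 590.0])
x_pop_mean = 4.2  # assumed known from a census frame
print(ratio_estimate(y_sample, x_sample, x_pop_mean))
print(regression_estimate(y_sample, x_sample, x_pop_mean))
```

The ratio estimator is exact when y is strictly proportional to x; the regression estimator is exact under any linear relationship, which is why their relative variances can switch order as the sample grows, as the abstract reports.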

  20. Sampling flies or sampling flaws? Experimental design and inference strength in forensic entomology.

    Science.gov (United States)

    Michaud, J-P; Schoenly, Kenneth G; Moreau, G

    2012-01-01

    Forensic entomology is an inferential science because postmortem interval estimates are based on the extrapolation of results obtained in field or laboratory settings. Although enormous gains in scientific understanding and methodological practice have been made in forensic entomology over the last few decades, a majority of the field studies we reviewed do not meet the standards for inference, which are 1) adequate replication, 2) independence of experimental units, and 3) experimental conditions that capture a representative range of natural variability. Using a mock case-study approach, we identify design flaws in field and lab experiments and suggest methodological solutions for increasing inference strength that can inform future casework. Suggestions for improving data reporting in future field studies are also proposed.

  1. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  2. A sero-survey of rinderpest in nomadic pastoral systems in central and southern Somalia from 2002 to 2003, using a spatially integrated random sampling approach.

    Science.gov (United States)

    Tempia, S; Salman, M D; Keefe, T; Morley, P; Freier, J E; DeMartini, J C; Wamwayi, H M; Njeumi, F; Soumaré, B; Abdi, A M

    2010-12-01

    A cross-sectional sero-survey, using a two-stage cluster sampling design, was conducted between 2002 and 2003 in ten administrative regions of central and southern Somalia, to estimate the seroprevalence and geographic distribution of rinderpest (RP) in the study area, as well as to identify potential risk factors for the observed seroprevalence distribution. The study was also used to test the feasibility of the spatially integrated investigation technique in nomadic and semi-nomadic pastoral systems. In the absence of a systematic list of livestock holdings, the primary sampling units were selected by generating random map coordinates. A total of 9,216 serum samples were collected from cattle aged 12 to 36 months at 562 sampling sites. Two apparent clusters of RP seroprevalence were detected. Four potential risk factors associated with the observed seroprevalence were identified: the mobility of cattle herds, the cattle population density, the proximity of cattle herds to cattle trade routes and cattle herd size. Risk maps were then generated to assist in designing more targeted surveillance strategies. The observed seroprevalence in these areas declined over time. In subsequent years, similar seroprevalence studies in neighbouring areas of Kenya and Ethiopia also showed a very low seroprevalence of RP or the absence of antibodies against RP. The progressive decline in RP antibody prevalence is consistent with virus extinction. Verification of freedom from RP infection in the Somali ecosystem is currently in progress.
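Selecting primary sampling units by generating random map coordinates, as done in this survey, can be sketched as follows. The bounding window and the equal-area latitude correction are illustrative assumptions; the survey's actual sampling frame is not given in the abstract.

```python
import math
import random

def random_sites(n, lat_min, lat_max, lon_min, lon_max, seed=7):
    """Uniform-in-area random map coordinates inside a lat/lon window,
    one way to pick primary sampling units when no holdings list exists.
    Drawing sin(latitude) uniformly corrects for meridian convergence."""
    rng = random.Random(seed)
    s_min = math.sin(math.radians(lat_min))
    s_max = math.sin(math.radians(lat_max))
    sites = []
    for _ in range(n):
        lat = math.degrees(math.asin(rng.uniform(s_min, s_max)))
        lon = rng.uniform(lon_min, lon_max)
        sites.append((lat, lon))
    return sites

# e.g. a window roughly covering central and southern Somalia (assumed)
points = random_sites(562, lat_min=-1.0, lat_max=5.0, lon_min=41.0, lon_max=48.0)
```

In the field, each generated point would anchor a search radius within which the nearest eligible herd becomes the sampled cluster, which is what makes the design workable for mobile pastoral populations.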

  3. Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries

    Science.gov (United States)

    McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.

    2013-01-01

    Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.

  4. Designs of Optoelectronic Trinary Signed-Digit Multiplication by use of Joint Spatial Encodings and Optical Correlation

    Science.gov (United States)

    Cherri, Abdallah K.

    1999-02-01

Trinary signed-digit (TSD) symbolic-substitution-based (SS-based) optical adders, which were recently proposed, are used as the basic modules for designing highly parallel optical multipliers by use of cascaded optical correlators. The proposed multipliers perform carry-free generation of the partial products of two words in constant time. Three different multiplier designs are presented, and new joint spatial encodings for the TSD numbers are introduced. The proposed joint spatial encodings allow one to reduce the number of SS computation rules involved in optical multiplication. In addition, they increase the space-bandwidth product of the spatial light modulators of the optical system. This increase is achieved by reducing the number of pixels in the joint spatial encodings of the input TSD operands as well as the number of pixels used in the proposed matched spatial filters for the optical multipliers.
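TSD numbers use the radix-3 digit set {-1, 0, 1} (balanced ternary), which is what makes partial-product generation carry-free: the product of any two digits is again a single digit in the set. A minimal sketch of the number representation and of partial-product generation follows; the optical encoding and the SS substitution rules themselves are not modeled.

```python
def to_balanced_ternary(n):
    """Encode an integer with the trinary signed-digit set {-1, 0, 1}."""
    digits = []  # least-significant digit first
    while n != 0:
        r = n % 3
        if r == 2:
            r = -1
        digits.append(r)
        n = (n - r) // 3
    return digits or [0]

def value(digits):
    """Decode a TSD digit list (least-significant first) to an integer."""
    return sum(d * 3 ** i for i, d in enumerate(digits))

def partial_products(a_digits, b_digits):
    """Digit-wise partial products: every product digit stays in
    {-1, 0, 1}, so each shifted partial product is itself a TSD word."""
    return [[0] * i + [bi * ai for ai in a_digits]
            for i, bi in enumerate(b_digits)]

a, b = to_balanced_ternary(47), to_balanced_ternary(-23)
pps = partial_products(a, b)
```

Summing the partial products is then the job of the TSD adders the abstract builds on; generation itself needs no carries, which is the constant-time claim.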

  5. Spatial patterns of antimicrobial resistance genes in a cross-sectional sample of pig farms with indoor non-organic production of finishers

    DEFF Research Database (Denmark)

    Birkegård, Anna Camilla; Ersbøll, Annette Kjær; Hisham Beshara Halasa, Tariq

    2017-01-01

    Antimicrobial resistance (AMR) in pig populations is a public health concern. There is a lack of information of spatial distributions of AMR genes in pig populations at large scales. The objective of the study was to describe the spatial pattern of AMR genes in faecal samples from pig farms...... spatial clusters were identified for ermB, ermF, sulII and tet(W). The broad spatial trends in AMR resistance evident in the risk maps were in agreement with the results of the cluster analysis. However, they also showed that there were only small scale spatial differences in the gene levels. We conclude...

  6. The spatial and temporal patterns of odors sampled by lobsters and crabs in a turbulent plume.

    Science.gov (United States)

    Reidenbach, Matthew A; Koehl, M A R

    2011-09-15

    Odors are dispersed across aquatic habitats by turbulent water flow as filamentous, intermittent plumes. Many crustaceans sniff (take discrete samples of ambient water and the odors it carries) by flicking their olfactory antennules. We used planar laser-induced fluorescence to investigate how flicking antennules of different morphologies (long antennules of spiny lobsters, Panulirus argus; short antennules of blue crabs, Callinectes sapidus) sample fluctuating odor signals at different positions in a turbulent odor plume in a flume, to determine whether the patterns of concentrations captured can provide information about an animal's position relative to the odor source. Lobster antennules intercept odors during a greater percentage of flicks and encounter higher peak concentrations than do crab antennules, but because crabs flick at higher frequency, the duration of odor-free gaps between encountered odor pulses is similar. For flicking antennules, the time gaps between odor encounters lengthened as the downstream distance to the odor source decreased, but were shorter along the plume centerline than near the edge. In contrast to the case for antennule flicking, almost all odor-free gaps were <500 ms at all positions in the plume if concentration was measured continuously at the same height as the antennules. Variance in concentration is lower and mean concentration is greater near the substratum, where leg chemosensors continuously sample the plume, than in the water where antennules sniff. Concentrations sampled by legs increase as an animal nears an odor source, but decrease for antennules. Both legs and antennules encounter higher concentrations near the centerline than at the edge of the plume.
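    How flick rate and plume intermittency shape the statistics of odor-free gaps can be sketched with a toy discrete-sampling simulation (all values are hypothetical, and a real plume signal is far more structured than this random on/off series):

```python
import random

def odor_free_gaps(signal, dt, flick_hz, threshold):
    """Sample an odor time series only at discrete 'flicks' and return
    the durations (s) between successive above-threshold flicks."""
    period = int(round(1.0 / (flick_hz * dt)))   # samples per flick
    hit_times = [i * dt for i in range(0, len(signal), period)
                 if signal[i] >= threshold]
    return [t2 - t1 for t1, t2 in zip(hit_times, hit_times[1:])]

# Hypothetical intermittent plume: odor filaments present 30% of the time
rng = random.Random(7)
conc = [1.0 if rng.random() < 0.3 else 0.0 for _ in range(20000)]  # 1 kHz, 20 s
gaps_slow = odor_free_gaps(conc, 0.001, 4.0, 0.5)    # slower flicking
gaps_fast = odor_free_gaps(conc, 0.001, 12.0, 0.5)   # faster flicking
```

In this toy version, faster flicking alone shortens the gaps between odor encounters; the abstract's point is that crabs' higher flick rate offsets their lower per-flick interception, so real gap durations come out similar across the two morphologies.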

  7. Optimal experiment design in a filtering context with application to sampled network data

    OpenAIRE

    Singhal, Harsh; Michailidis, George

    2010-01-01

    We examine the problem of optimal design in the context of filtering multiple random walks. Specifically, we define the steady state E-optimal design criterion and show that the underlying optimization problem leads to a second order cone program. The developed methodology is applied to tracking network flow volumes using sampled data, where the design variable corresponds to controlling the sampling rate. The optimal design is numerically compared to a myopic and a naive strategy. Finally, w...

  8. Spatial distribution of metals in soil samples from Zona da Mata, Pernambuco, Brazil using XRF technique

    International Nuclear Information System (INIS)

    Fernandez, Zahily Herrero; Santos Junior, Jose Araujo dos; Amaral, Romilton dos Santos; Menezes, Romulo Simoes Cezar; Santos, Josineide Marques do Nascimento; Bezerra, Jairo Dias; Damascena, Kennedy Francys Rodrigues; Silva, Edvane Borges da; Silva, Alberto Antonio da

    2015-01-01

    Soil contamination is today one of the most important environmental issues for society. In the past, soil pollution was not considered as important as air and water contamination because it was more difficult to control; it has since become an important topic in environmental protection studies worldwide. Based on this, this paper provides information on the determination of metals in soil samples collected in Zona da Mata, Pernambuco, Brazil, where pesticides, insecticides and other agricultural additives are normally applied in a disorderly and uncontrolled manner. A total of 24 sampling points were monitored. Analyses of Mn, Fe, Ni, Zn, Br, Rb, Sr, Pb, Ti, La, Al, Si and P were performed using Energy Dispersive X-Ray Fluorescence. To assess the performance of the analytical method, inorganic Certified Reference Materials (IAEA-SOIL-7 and SRM 2709) were analyzed. For each sampling site, the geoaccumulation index was calculated to estimate the level of metal contamination in the soil, taking into account Resolution 460 of the National Environmental Council (CONAMA in Portuguese). The elemental distribution patterns obtained for each metal were associated with different pollution sources. This assessment provides an initial description of the pollution levels presented by metals in soils from several areas of Zona da Mata, providing quantitative evidence and demonstrating the need to improve the regulation of agricultural and industrial activities. (author)
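    The geoaccumulation index referred to here is conventionally computed as Igeo = log2(Cn / (1.5 × Bn)), where Cn is the measured concentration and Bn the geochemical background, with the factor 1.5 buffering natural background variation. A minimal sketch with hypothetical values (not the paper's data):

```python
import math

def geoaccumulation_index(c_sample, c_background):
    """Igeo = log2(Cn / (1.5 * Bn)); Igeo <= 0 is conventionally read as
    unpolluted, with higher classes at each unit step above zero."""
    return math.log2(c_sample / (1.5 * c_background))

# Hypothetical Pb values in mg/kg (illustrative only)
igeo = geoaccumulation_index(45.0, 20.0)   # log2(45 / 30) = log2(1.5)
```

A sample at exactly 1.5 times background gives Igeo = 0, the boundary of the unpolluted class.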

  9. Spatial distribution of metals in soil samples from Zona da Mata, Pernambuco, Brazil using XRF technique

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Zahily Herrero; Santos Junior, Jose Araujo dos; Amaral, Romilton dos Santos; Menezes, Romulo Simoes Cezar; Santos, Josineide Marques do Nascimento; Bezerra, Jairo Dias; Damascena, Kennedy Francys Rodrigues, E-mail: zahily1985@gmail.com, E-mail: jaraujo@ufpe.br, E-mail: romilton@ufpe.br, E-mail: rmenezes@ufpe.br, E-mail: neideden@hotmail.com, E-mail: jairo.dias@ufpe.br, E-mail: kennedy.eng.ambiental@gmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociencias. Departamento de Energia Nuclear; Alvarez, Juan Reinaldo Estevez, E-mail: jestevez@ceaden.cu [Centro de Aplicaciones Tecnologicas y Desarrollo Nuclear (CEADEN), Havana (Cuba); Silva, Edvane Borges da, E-mail: edvane.borges@pq.cnpq.br [Universidade Federal de Pernambuco (UFPE), Vitoria de Santo Antao, PE (Brazil). Nucleo de Biologia; Franca, Elvis Joacir de; Farias, Emerson Emiliano Gualberto de, E-mail: ejfranca@cnen.gov.br, E-mail: emersonemiliano@yahoo.com.br [Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE/CNEN-PE), Recife, PE (Brazil); Silva, Alberto Antonio da, E-mail: alberto.silva@barreiros.ifpe.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco (IFPE), Barreiros, PE (Brazil)

    2015-07-01

    Soil contamination is today one of the most important environmental issues for society. In the past, soil pollution was not considered as important as air and water contamination because it was more difficult to control; it has since become an important topic in environmental protection studies worldwide. Based on this, this paper provides information on the determination of metals in soil samples collected in Zona da Mata, Pernambuco, Brazil, where pesticides, insecticides and other agricultural additives are normally applied in a disorderly and uncontrolled manner. A total of 24 sampling points were monitored. Analyses of Mn, Fe, Ni, Zn, Br, Rb, Sr, Pb, Ti, La, Al, Si and P were performed using Energy Dispersive X-Ray Fluorescence. To assess the performance of the analytical method, inorganic Certified Reference Materials (IAEA-SOIL-7 and SRM 2709) were analyzed. For each sampling site, the geoaccumulation index was calculated to estimate the level of metal contamination in the soil, taking into account Resolution 460 of the National Environmental Council (CONAMA in Portuguese). The elemental distribution patterns obtained for each metal were associated with different pollution sources. This assessment provides an initial description of the pollution levels presented by metals in soils from several areas of Zona da Mata, providing quantitative evidence and demonstrating the need to improve the regulation of agricultural and industrial activities. (author)

  10. Spatial resolution of 2D ionization chamber arrays for IMRT dose verification: single-detector size and sampling step width

    International Nuclear Information System (INIS)

    Poppe, Bjoern; Djouguela, Armand; Blechschmidt, Arne; Willborn, Kay; Ruehmann, Antje; Harder, Dietrich

    2007-01-01

    The spatial resolution of 2D detector arrays equipped with ionization chambers or diodes, used for the dose verification of IMRT treatment plans, is limited by the size of the single detector and the centre-to-centre distance between the detectors. Optimization criteria with regard to these parameters have been developed by combining concepts of dosimetry and pattern analysis. The 2D-ARRAY Type 10024 (PTW-Freiburg, Germany), single-chamber cross section 5 × 5 mm², centre-to-centre distance between chambers in each row and column 10 mm, served as an example. Additional frames of given dose distributions can be taken by shifting the whole array parallel or perpendicular to the MLC leaves by, e.g., 5 mm. The size of the single detector is characterized by its lateral response function, a trapezoid with 5 mm top width and 9 mm base width. Therefore, values measured with the 2D array are regarded as sample values from the convolution product of the accelerator-generated dose distribution and this lateral response function. Consequently, the dose verification, e.g., by means of the gamma index, is performed by comparing the measured values of the 2D array with the values of the convolution product of the treatment planning system (TPS) calculated dose distribution and the single-detector lateral response function. Sufficiently small misalignments of the measured dose distributions in comparison with the calculated ones can be detected since the lateral response function is symmetric with respect to the centre of the chamber, and the change of dose gradients due to the convolution is sufficiently small. The sampling step width of the 2D array should provide a set of sample values representative of the sampled distribution, which is achieved if the highest spatial frequency contained in this function does not exceed the 'Nyquist frequency', one half of the sampling frequency. Since the convolution products of IMRT-typical dose distributions and the single
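    The measurement model described here, sampled values of the true dose profile convolved with a trapezoidal single-chamber response (5 mm top, 9 mm base), can be sketched numerically; the step-like dose profile below is hypothetical:

```python
import numpy as np

def trapezoid_response(x_mm, top=5.0, base=9.0):
    """Lateral response of one chamber: trapezoid with 5 mm top and
    9 mm base width, normalized to unit area."""
    half_t, half_b = top / 2.0, base / 2.0
    r = np.clip((half_b - np.abs(x_mm)) / (half_b - half_t), 0.0, 1.0)
    dx = x_mm[1] - x_mm[0]
    return r / (r.sum() * dx)

x = np.arange(-10.0, 10.0 + 1e-9, 0.1)    # mm grid, 0.1 mm spacing
resp = trapezoid_response(x)

# Hypothetical field-edge dose profile; what the array reads is a
# sampled convolution of the true dose with the chamber response.
dose = np.where(np.abs(x) < 5.0, 1.0, 0.2)
measured = np.convolve(dose, resp, mode='same') * 0.1   # dx = 0.1 mm
```

The convolution smooths the sharp field edge, which is exactly why the convolved profile has a lower maximum spatial frequency than the raw dose distribution and can be represented by the array's 10 mm (or shifted 5 mm) sampling grid without violating the Nyquist criterion.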

  11. Implication of the first decision on visual information-sampling in the spatial frequency domain in pulmonary nodule recognition

    Science.gov (United States)

    Pietrzyk, Mariusz W.; Manning, David; Donovan, Tim; Dix, Alan

    2010-02-01

    Aim: To investigate the impact on visual sampling strategy and pulmonary nodule recognition of image-based properties of background locations in dwelled regions where the first overt decision was made. Background: Recent studies in mammography show that the first overt decision (TP or FP) influences further image reading, including the correctness of the following decisions. Furthermore, a correlation between the spatial frequency properties of the local background at decision sites and the correctness of the first decision has been reported. Methods: Subjects with different levels of radiological experience were eye-tracked during detection of pulmonary nodules in PA chest radiographs. The number of outcomes and the overall quality of performance are analysed in terms of the cases where correct or incorrect decisions were made, applying JAFROC methodology. The spatial frequency properties of local backgrounds related to certain decisions were studied, and ANOVA was used to compare the logarithmic values of energy carried by non-redundant stationary wavelet packet coefficients. Results: A strong correlation was found between the number of TPs as a first decision and the JAFROC score (r = 0.74). The number of FPs as a first decision was negatively correlated with JAFROC (r = -0.75). Moreover, the differential spatial frequency profiles of outcomes depend on the correctness of the first choice.

  12. Spatially generalizable representations of facial expressions: Decoding across partial face samples.

    Science.gov (United States)

    Greening, Steven G; Mitchell, Derek G V; Smith, Fraser W

    2018-04-01

    A network of cortical and sub-cortical regions is known to be important in the processing of facial expression. However, to date no study has investigated whether representations of facial expressions present in this network permit generalization across independent samples of face information (e.g., eye region vs mouth region). We presented participants with partial face samples of five expression categories in a rapid event-related fMRI experiment. We reveal a network of face-sensitive regions that contain information about facial expression categories regardless of which part of the face is presented. We further reveal that the neural information present in a subset of these regions: dorsal prefrontal cortex (dPFC), superior temporal sulcus (STS), lateral occipital and ventral temporal cortex, and even early visual cortex, enables reliable generalization across independent visual inputs (faces depicting the 'eyes only' vs 'eyes removed'). Furthermore, classification performance was correlated with behavioral performance in STS and dPFC. Our results demonstrate that both higher-level (e.g., STS, dPFC) and lower-level cortical regions contain information useful for facial expression decoding that goes beyond the visual information presented, and implicate a key role for contextual mechanisms such as cortical feedback in facial expression perception under challenging conditions of visual occlusion. Copyright © 2017 Elsevier Ltd. All rights reserved.
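    The cross-sample generalization analysis (train a decoder on responses to one face sampling, test it on the other) can be illustrated with a toy nearest-centroid sketch on synthetic "voxel" patterns; the data, dimensions, noise levels, and classifier are illustrative assumptions, not the study's MVPA pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n_vox, n_trials = 50, 40

# Hypothetical expression-specific patterns shared across two face
# samplings ('eyes only' vs 'eyes removed'), plus a sampling-specific
# offset and trial noise.
centers = rng.normal(0, 1, (5, n_vox))

def make_set(offset):
    X = np.vstack([centers[c] + offset + rng.normal(0, 1.5, (n_trials, n_vox))
                   for c in range(5)])
    y = np.repeat(np.arange(5), n_trials)
    return X, y

X_eyes, y_eyes = make_set(rng.normal(0, 0.5, n_vox))       # training sampling
X_noeyes, y_noeyes = make_set(rng.normal(0, 0.5, n_vox))   # test sampling

# Nearest-centroid cross-decoding: centroids from one sampling,
# predictions on trials from the other.
cents = np.vstack([X_eyes[y_eyes == c].mean(0) for c in range(5)])
pred = np.argmin(((X_noeyes[:, None, :] - cents[None]) ** 2).sum(-1), axis=1)
acc = (pred == y_noeyes).mean()
```

Above-chance accuracy (chance = 0.2 with five categories) on the held-out sampling is the signature of a representation that generalizes across independent visual inputs.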

  13. Using experimental design and spatial analyses to improve the precision of NDVI estimates in upland cotton field trials

    Science.gov (United States)

    Controlling for spatial variability is important in high-throughput phenotyping studies that enable large numbers of genotypes to be evaluated across time and space. In the current study, we compared the efficacy of different experimental designs and spatial models in the analysis of canopy spectral...

  14. Ant mosaics in Bornean primary rain forest high canopy depend on spatial scale, time of day, and sampling method

    Directory of Open Access Journals (Sweden)

    Kalsum M. Yusah

    2018-01-01

    Background: Competitive interactions in biological communities can be thought of as giving rise to “assembly rules” that dictate the species that are able to co-exist. Ant communities in tropical canopies often display a particular pattern, an “ant mosaic”, in which competition between dominant ant species results in a patchwork of mutually exclusive territories. Although ant mosaics have been well documented in plantation landscapes, their presence in pristine tropical forests remained contentious until recently. Here we assess the presence of ant mosaics in a hitherto under-investigated forest stratum, the emergent trees of the high canopy in primary tropical rain forest, and explore how the strength of any ant mosaics is affected by spatial scale, time of day, and sampling method. Methods: To test whether these factors might impact the detection of ant mosaics in pristine habitats, we sampled ant communities from emergent trees, which rise above the highest canopy layers in lowland dipterocarp rain forests in North Borneo (38.8–60.2 m), using both baiting and insecticide fogging. Critically, we restricted sampling to only the canopy of each focal tree. For baiting, we carried out sampling during both the day and the night. We used null models of species co-occurrence to assess patterns of segregation at within-tree and between-tree scales. Results: The numerically dominant ant species on the emergent trees sampled formed a diverse community, with differences in the identity of dominant species between times of day and sampling methods. Between trees, we found patterns of ant species segregation consistent with the existence of ant mosaics using both methods. Within trees, fogged ants were segregated, while baited ants were segregated only at night. Discussion: We conclude that ant mosaics are present within the emergent trees of the high canopy of tropical rain forest in Malaysian Borneo, and that sampling technique, spatial scale, and time of day affect their detection.

  15. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    International Nuclear Information System (INIS)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B.

    2015-01-01

    A wireless-based, custom-built aerosol sampling network was designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems were used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each comprising a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  16. Note: Design and development of wireless controlled aerosol sampling network for large scale aerosol dispersion experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, V.; Subramanian, V.; Baskaran, R.; Venkatraman, B. [Radiation Impact Assessment Section, Radiological Safety Division, Indira Gandhi Centre for Atomic Research, Kalpakkam 603 102 (India)

    2015-07-15

    A wireless-based, custom-built aerosol sampling network was designed, developed, and implemented for environmental aerosol sampling. These aerosol sampling systems were used in a field measurement campaign in which sodium aerosol dispersion experiments were conducted as part of environmental impact studies related to the sodium-cooled fast reactor. The sampling network contains 40 aerosol sampling units, each comprising a custom-built sampling head and wireless control networking designed with a Programmable System on Chip (PSoC™) and Xbee Pro RF modules. The base station control is designed using the graphical programming language LabVIEW. The sampling network is programmed to operate at a preset time, and the running status of the samplers in the network is visualized from the base station. The system is developed in such a way that it can be used for any other environmental sampling system deployed over a wide area and uneven terrain, where manual operation is difficult due to the requirement of simultaneous operation and status logging.

  17. Parameterizing Spatial Models of Infectious Disease Transmission that Incorporate Infection Time Uncertainty Using Sampling-Based Likelihood Approximations.

    Directory of Open Access Journals (Sweden)

    Rajat Malik

    A class of discrete-time models of infectious disease spread, referred to as individual-level models (ILMs), is typically fitted in a Bayesian Markov chain Monte Carlo (MCMC) framework. These models quantify probabilistic outcomes regarding the risk of infection of susceptible individuals due to various susceptibility and transmissibility factors, including their spatial distance from infectious individuals. The infectious pressure exerted on susceptible individuals by infected individuals is intrinsic to these ILMs. Unfortunately, quantifying this infectious pressure for data sets containing many individuals can be computationally burdensome, leading to a time-consuming likelihood calculation and, thus, computationally prohibitive MCMC-based analysis. This problem worsens when using data augmentation to allow for uncertainty in infection times. In this paper, we develop sampling methods that can be used to calculate a fast, approximate likelihood when fitting such disease models. A simple random sampling approach is initially considered, followed by various spatially stratified schemes. We test and compare the performance of our methods with both simulated data and data from the 2001 foot-and-mouth disease (FMD) epidemic in the U.K. Our results indicate that substantial computation savings can be obtained (albeit, of course, with some information loss), suggesting that such techniques may be of use in the analysis of very large epidemic data sets.
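    The core idea, replacing the full sum of infectious pressure over all infectives with a rescaled sum over a random subsample, can be sketched as follows (the spatial kernel, coordinates, and parameters are hypothetical, not the paper's fitted model):

```python
import math
import random

def infection_prob(sus, infected, kernel, m=None, rng=random):
    """P(susceptible becomes infected) = 1 - exp(-sum_i kernel(d_si)).
    If m is given, approximate the sum from a simple random sample of
    m infectious individuals, rescaled by len(infected) / m."""
    if m is None or m >= len(infected):
        pool, scale = infected, 1.0
    else:
        pool, scale = rng.sample(infected, m), len(infected) / m
    pressure = scale * sum(kernel(math.dist(sus, i)) for i in pool)
    return 1.0 - math.exp(-pressure)

def kernel(d):
    """Hypothetical distance kernel, decaying with separation."""
    return 0.02 / (d + 1.0) ** 2

rng = random.Random(0)
infected = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(500)]
exact = infection_prob((5.0, 5.0), infected, kernel)
approx = infection_prob((5.0, 5.0), infected, kernel, m=50, rng=rng)
```

Evaluating the kernel for 50 rather than 500 infectives cuts the per-susceptible cost tenfold, at the price of sampling noise in the approximate likelihood, the same computation-versus-information trade-off the paper quantifies.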

  18. Use of virtual steam generator cassette for tube spatial design and SGC assembling procedure

    International Nuclear Information System (INIS)

    Kim, Y. W.; Kim, J. I.; Ji, S. K.

    2003-01-01

    A method for determining the spatial arrangement of tube connections and the assembly procedure of a once-through helical steam generator cassette utilizing a three-dimensional virtual steam generator cassette has been developed on the basis of recent 3-D modelling technology. One end of each steam generator tube is connected to the module feed water header and the other to the module steam header. Due to the complex geometry of the tube arrangement, it is very difficult to connect the tubes to the module headers without the help of a physical engineering mock-up. A comparative study has been performed at each design step for the tube arrangement and heat transfer area. The heat transfer area computed from thermal sizing was 4% less than the measured value, whereas the heat transfer area calculated from the virtual steam generator cassette mock-up differed from the measured value by only 0.2%. The assembly procedure of the steam generator cassette can also be developed at the design stage.

  19. Design of an omnidirectional single-point photodetector for large-scale spatial coordinate measurement

    Science.gov (United States)

    Xie, Hongbo; Mao, Chensheng; Ren, Yongjie; Zhu, Jigui; Wang, Chao; Yang, Lei

    2017-10-01

    In high-precision, large-scale coordinate measurement, one commonly used approach to determining the coordinate of a target point is utilizing the spatial trigonometric relationships between multiple laser transmitter stations and the target point. A light receiving device at the target point is the key element in large-scale coordinate measurement systems. To ensure high-resolution and highly sensitive spatial coordinate measurement, a high-performance and miniaturized omnidirectional single-point photodetector (OSPD) is greatly desired. We report a design of an OSPD using an aspheric lens, which achieves an enhanced reception angle of -5 deg to 45 deg in the vertical and 360 deg in the horizontal. As the heart of our OSPD, the aspheric lens is designed in a geometric model and optimized with LightTools software, which enables the reflection of a wide-angle incident light beam into the single-point photodiode. The performance of the home-made OSPD is characterized at working distances from 1 to 13 m and further analyzed using the developed geometric model. The experimental and analytic results verify that our device is highly suitable for large-scale coordinate metrology. The developed device also holds great potential in various applications such as omnidirectional vision sensors, indoor global positioning systems, and optical wireless communication systems.

  20. Integrated conservation planning for coral reefs: Designing conservation zones for multiple conservation objectives in spatial prioritisation

    Directory of Open Access Journals (Sweden)

    Rafael A. Magris

    2017-07-01

    Decision-makers focus on representing biodiversity pattern, maintaining connectivity, and strengthening resilience to global warming when designing marine protected area (MPA) systems, especially in coral reef ecosystems. The achievement of these broad conservation objectives will likely require large areas, and stretch limited funds for MPA implementation. We undertook a spatial prioritisation of Brazilian coral reefs that considered two types of conservation zones (i.e., no-take and multiple-use areas) and integrated multiple conservation objectives into MPA planning, while assessing the potential impact of different sets of objectives on implementation costs. We devised objectives for biodiversity, connectivity, and resilience to global warming, determined the extent to which existing MPAs achieved them, and designed complementary zoning to achieve all objectives combined in expanded MPA systems. In doing so, we explored interactions between different sets of objectives, determined whether refinements to the existing spatial arrangement of MPAs were necessary, and tested the utility of existing MPAs by comparing their cost effectiveness with an MPA system designed from scratch. We found that MPAs in Brazil protect some aspects of coral reef biodiversity pattern (e.g., threatened fauna and ecosystem types) more effectively than connectivity or resilience to global warming. Expanding the existing MPA system was as cost-effective as designing one from scratch only when multiple objectives were considered and management costs were accounted for. Our approach provides a comprehensive assessment of the benefits of integrating multiple objectives in the initial stages of conservation planning, and yields insights for planners of MPAs tackling multiple objectives in other regions.

  1. Design of sampling tools for Monte Carlo particle transport code JMCT

    International Nuclear Information System (INIS)

    Shangguan Danhua; Li Gang; Zhang Baoyin; Deng Li

    2012-01-01

    A class of sampling tools for the general Monte Carlo particle transport code JMCT is designed. Two ways are provided to sample from distributions: one is the use of specialized sampling methods for particular distributions; the other is the use of general sampling methods for arbitrary discrete distributions and one-dimensional continuous distributions on a finite interval. Some open source codes are included in the general sampling method for the maximum convenience of users. The sampling results show that distributions common in particle transport can be sampled correctly with these tools and that the user's convenience is assured. (authors)
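    The two kinds of tools can be sketched as follows; this is an illustrative Python analogue, not JMCT's actual implementation: a general inverse-transform sampler built on a cumulative table for an arbitrary discrete distribution, and a specialized direct-inversion sampler for an exponential distribution:

```python
import bisect
import math
import random

def make_discrete_sampler(weights, rng):
    """General tool: inverse-transform sampling from the cumulative
    table of an arbitrary (unnormalized) discrete distribution."""
    cdf, total = [], 0.0
    for w in weights:
        total += w
        cdf.append(total)
    return lambda: bisect.bisect_left(cdf, rng.random() * total)

def sample_exponential(lam, rng):
    """Specialized tool: direct inversion for an exponential
    distribution, the classic free-flight-length model in transport."""
    return -math.log(1.0 - rng.random()) / lam

rng = random.Random(42)
draw = make_discrete_sampler([0.2, 0.5, 0.3], rng)
counts = [0, 0, 0]
for _ in range(10000):
    counts[draw()] += 1
mean_flight = sum(sample_exponential(2.0, rng) for _ in range(10000)) / 10000
```

The special method is exact and costs one logarithm per draw, while the general table-plus-bisection method works for any discrete distribution at O(log n) per draw.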

  2. Evaluation of optimized bronchoalveolar lavage sampling designs for characterization of pulmonary drug distribution.

    Science.gov (United States)

    Clewe, Oskar; Karlsson, Mats O; Simonsson, Ulrika S H

    2015-12-01

    Bronchoalveolar lavage (BAL) is a pulmonary sampling technique for characterization of drug concentrations in epithelial lining fluid and alveolar cells. Two hypothetical drugs with different pulmonary distribution rates (fast and slow) were considered. An optimized BAL sampling design was generated assuming no previous information regarding the pulmonary distribution (rate and extent) and with a maximum of two samples per subject. Simulations were performed to evaluate the impact of the number of samples per subject (1 or 2) and the sample size on the relative bias and relative root mean square error of the parameter estimates (rate and extent of pulmonary distribution). The optimized BAL sampling design depends on a characterized plasma concentration-time profile, a population plasma pharmacokinetic model, and the limit of quantification (LOQ) of the BAL method, and involves only two BAL sample time points, one early and one late. The early sample should be taken as early as possible, where concentrations in the BAL fluid are ≥ LOQ. The second sample should be taken at a time point in the declining part of the plasma curve where the plasma concentration is equivalent to the plasma concentration at the time of the early sample. Simulated data generated with the final BAL sampling design, using a previously described general pulmonary distribution model linked to a population plasma pharmacokinetic model, enabled characterization of both the rate and extent of pulmonary distribution. The optimized BAL sampling design thus enables characterization of both the rate and extent of the pulmonary distribution for both fast and slowly equilibrating drugs.
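    The two-point design rule, an early sample at the first time the concentration reaches the LOQ and a late sample on the declining limb at the same plasma concentration, can be sketched with a hypothetical one-compartment oral model (all parameter values invented for illustration):

```python
import math

def conc(t, dose=100.0, ka=1.5, ke=0.2, v=50.0):
    """Hypothetical 1-compartment oral plasma model (mg/L vs h)."""
    f = dose * ka / (v * (ka - ke))
    return f * (math.exp(-ke * t) - math.exp(-ka * t))

def bal_design(loq, dt=0.01):
    """Early sample: first time plasma conc >= LOQ.
    Late sample: time on the declining limb with the same concentration."""
    t = dt
    while conc(t) < loq:           # walk up to the LOQ
        t += dt
    t_early, c_early = t, conc(t)
    while conc(t + dt) >= conc(t):  # move past the peak
        t += dt
    while conc(t) > c_early:        # walk down to c_early again
        t += dt
    return t_early, t

t_early, t_late = bal_design(loq=0.1)
```

The symmetry of the two time points around the peak (equal plasma concentrations, one rising, one falling) is what lets the design separate the rate from the extent of pulmonary distribution with only two BAL samples.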

  3. [Characterizing spatial patterns of NO(x), SO2 and O3 in Pearl River Delta by passive sampling].

    Science.gov (United States)

    Zhao, Yang; Shao, Min; Wang, Chen; Wang, Bo-Guang; Lu, Si-Hua; Zhong, Liu-Ju

    2011-02-01

    Concentrations of NO(x), SO2 and O3 were measured by passive sampling within a 200 km × 200 km grid in the Pearl River Delta (PRD). The sampling period was two weeks in November 2009. Spatial distributions of NO(x), SO2 and O3 were obtained by the Kriging interpolation method, and the results were compared with emission inventories and modeling results. The transport of O3 was evaluated using backward trajectories of air parcels. During the sampling period, the mean concentrations of NO(x), SO2 and O3 were 75.9, 37.3 and 36.2 μg/m³, respectively, and the highest concentrations were 195.7, 95.9 and 81.8 μg/m³. Compared with routine measurements from the regional monitoring network in PRD, the passive-sampling results were 18.6%, 33.5% and 37.5% lower for NO(x), SO2 and O3, respectively. The spatial patterns showed that higher NO(x) concentrations often appeared in cities such as Guangzhou, Foshan and Shenzhen. SO2 concentrations were higher in the west and lower in the east; the high SO2 concentrations are mainly from emissions of power plants and industrial sources. O3 concentrations were highest in the south of PRD. Backward trajectory analysis for the higher-ozone areas indicated that 53% of the air masses came from the region with high NO(x) concentrations. Horizontal transport resulted in higher ozone in the south and lower ozone in the north of PRD.
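    As a simpler stand-in for the ordinary kriging used in the study, mapping sparse passive-sampler values onto a grid can be sketched with inverse-distance weighting (station coordinates and concentrations below are hypothetical):

```python
import math

def idw(x, y, stations, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from
    (xi, yi, value) tuples; a simpler substitute for kriging that
    ignores the spatial covariance structure kriging models."""
    num = den = 0.0
    for xi, yi, v in stations:
        d = math.hypot(x - xi, y - yi)
        if d < 1e-9:
            return v          # exact interpolation at a station
        w = d ** -power
        num += w * v
        den += w
    return num / den

# Hypothetical passive-sampler NOx values (ug/m3) on a toy km grid
stations = [(0, 0, 195.7), (100, 0, 60.0), (0, 100, 50.0), (100, 100, 40.0)]
mid = idw(50, 50, stations)
```

Unlike kriging, IDW provides no prediction variance and assumes an isotropic distance decay, but it shares the key interpolation property: estimates stay within the range of the observed values and honor the data exactly at station locations.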

  4. Spatial and temporal variations in cadmium concentrations and burdens in the Pacific oyster (Crassostrea gigas) sampled from the Pacific north-west

    International Nuclear Information System (INIS)

    Bendell, Leah I.; Feng, Cindy

    2009-01-01

    Oysters from the north-west coast of Canada contain high levels of cadmium, a toxic metal, in amounts that exceed food safety guidelines for international markets. A first required step in determining the sources of cadmium is to identify possible spatial and temporal trends in the accumulation of cadmium by the oyster. To meet this objective, rather than sampling wild and cultured oysters of unknown age and origin, we initiated an oyster 'grow-out' experiment. Cultured oyster seed was suspended in the water column to a depth of 7 m and allowed to mature for a period of 3 years to market size. Oysters were sampled bimonthly, and at the time of sampling, temperature, chlorophyll-a, turbidity and salinity were measured. Oyster total shell length, dry tissue weight, cadmium concentration (μg g⁻¹) and burden (μg of cadmium per oyster) were determined. Oyster cadmium concentrations and burdens were then interpreted with respect to the spatial and temporal sampling design as well as the measured physico-chemical and biotic variables. When expressed as a concentration, there was a marked seasonality, with concentrations being greater in winter than in summer; however, no spatial trend was evident. When expressed as a burden, which corrects for differences in tissue mass, there was no seasonality; however, cadmium burdens per oyster increased from south to north. Comparison of cadmium accumulation rates per oyster among sites indicated three locations, Webster Island, on the west side of Vancouver Island, and two within Desolation Sound, Teakerne Arm and Redonda Bay, where point sources of cadmium that are not present at the other sampling locations may be contributing to overall oyster cadmium burdens. Of the four physico-chemical factors measured, only temperature and turbidity were weakly correlated with tissue cadmium concentrations (r = -0.13; p < 0.05). By expressing oyster cadmium both as concentration and burden, regional and temporal patterns were
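    The distinction between concentration and burden that drives the contrasting seasonal patterns reduces to a simple identity, burden = concentration × dry tissue mass; the numbers below are hypothetical:

```python
def cd_burden(conc_ug_per_g, dry_tissue_g):
    """Burden (ug Cd per oyster) = concentration (ug/g dry weight)
    x dry tissue mass (g); burden corrects for changes in mass."""
    return conc_ug_per_g * dry_tissue_g

# Hypothetical winter vs summer oyster: the concentration halves in
# summer only because tissue mass doubles; the burden is unchanged.
winter = cd_burden(4.0, 1.0)
summer = cd_burden(2.0, 2.0)
```

A seasonal swing in concentration with a constant burden, as in this toy pair, indicates growth dilution rather than a seasonal change in cadmium uptake.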

  5. Holographic Fabrication of Designed Functional Defect Lines in Photonic Crystal Lattice Using a Spatial Light Modulator

    Directory of Open Access Journals (Sweden)

    Jeffrey Lutkenhaus

    2016-04-01

    Full Text Available We report the holographic fabrication of designed defect lines in photonic crystal lattices through phase engineering using a spatial light modulator (SLM). The diffracted beams from the SLM carry not only the defect's content but also the defect-related phase-shifting information. The phase-shift-induced lattice shifting around the defects is smaller in three-beam interference than in five-beam interference, owing to the alternating lattice shifting in three-beam interference. By designing the defect line at a 45-degree orientation and using three-beam interference, the defect orientation can be aligned with the background photonic lattice, and the shifting occurs on only one side of the defect line, in agreement with theory. Finally, a new design for the integration of functional defect lines in a background phase pattern reduces the relative phase shift of the defect and utilizes the difference in diffraction efficiency between the defect line and the background phase pattern. We demonstrate that the desired functional defect lattice can be registered into the background lattice through direct imaging of designed phase patterns.

  6. Designing efficient surveys: spatial arrangement of sample points for detection of invasive species

    Czech Academy of Sciences Publication Activity Database

    Berec, Luděk; Kean, J. M.; Epanchin-Niell, R.; Liebhold, A. M.; Haight, R. G.

    2015-01-01

    Roč. 17, č. 1 (2015), s. 445-459 ISSN 1387-3547 Grant - others:National Science Foundation(US) DEB-0553768 Institutional support: RVO:60077344 Keywords : biosecurity * early pest detection * eradication Subject RIV: EH - Ecology, Behaviour Impact factor: 2.855, year: 2015 http://link.springer.com/article/10.1007%2Fs10530-014-0742-x

  7. Outcome-Dependent Sampling Design and Inference for Cox's Proportional Hazards Model.

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P; Zhou, Haibo

    2016-11-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with real data from the Cancer Incidence and Mortality of Uranium Miners Study.
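    A weighted partial likelihood estimating equation can be illustrated for a single covariate. The data, weights, and bisection solver below are a hypothetical sketch of the general technique, not the authors' procedure.

```python
import math

# Toy failure-time data with outcome-dependent sampling weights
# w = 1 / (inclusion probability). All values are hypothetical.
data = [  # (time, event indicator, covariate x, weight w)
    (2.0, 1, 1.0, 2.0), (3.0, 1, 0.0, 1.0), (4.0, 0, 1.0, 2.0),
    (5.0, 1, 1.0, 2.0), (6.0, 1, 0.0, 1.0), (7.0, 0, 0.0, 1.0),
]

def score(beta):
    """Weighted partial-likelihood score U(beta) for a single covariate."""
    u = 0.0
    for t_i, d_i, x_i, w_i in data:
        if not d_i:
            continue  # only event times contribute terms
        risk = [(x, w) for (t, _, x, w) in data if t >= t_i]  # at-risk set
        num = sum(w * x * math.exp(beta * x) for x, w in risk)
        den = sum(w * math.exp(beta * x) for x, w in risk)
        u += w_i * (x_i - num / den)  # observed x minus weighted risk-set mean
    return u

# U(beta) is monotone decreasing, so U(beta) = 0 is solved by bisection.
lo, hi = -5.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if score(mid) > 0 else (lo, mid)
beta_hat = 0.5 * (lo + hi)
print(round(beta_hat, 4))
```

    Down-weighting over-sampled subjects through w corrects the bias that the ODS scheme would otherwise introduce into the partial likelihood.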

  8. Outcome-Dependent Sampling Design and Inference for Cox’s Proportional Hazards Model

    Science.gov (United States)

    Yu, Jichang; Liu, Yanyan; Cai, Jianwen; Sandler, Dale P.; Zhou, Haibo

    2016-01-01

    We propose a cost-effective outcome-dependent sampling (ODS) design for failure time data and develop an efficient inference procedure for data collected with this design. To account for the biased sampling scheme, we derive estimators from a weighted partial likelihood estimating equation. The proposed estimators for regression parameters are shown to be consistent and asymptotically normally distributed. A criterion that can be used to optimally implement the ODS design in practice is proposed and studied. The small-sample performance of the proposed method is evaluated by simulation studies. The proposed design and inference procedure are shown to be statistically more powerful than existing alternative designs with the same sample sizes. We illustrate the proposed method with real data from the Cancer Incidence and Mortality of Uranium Miners Study. PMID:28090134

  9. Quality-control design for surface-water sampling in the National Water-Quality Network

    Science.gov (United States)

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.
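    Replicate agreement of the kind collected under such a design is commonly summarized as a relative percent difference (RPD); the concentrations and threshold below are hypothetical, not from the report.

```python
# Hypothetical QC results (mg/L): one field blank and one replicate pair.
blank = 0.002              # field-blank concentration; ideally near zero
rep_a, rep_b = 1.10, 1.04  # replicate pair from the same site visit

# Relative percent difference: absolute difference over the pair mean.
rpd = 100.0 * abs(rep_a - rep_b) / ((rep_a + rep_b) / 2.0)
print(round(rpd, 1))  # ~5.6% variability between replicates
```

    A low blank value suggests little contamination from sampling and processing, while the RPD quantifies measurement variability for the environmental samples.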

  10. Monitoring forest areas from continental to territorial levels using a sample of medium spatial resolution satellite imagery

    Science.gov (United States)

    Eva, Hugh; Carboni, Silvia; Achard, Frédéric; Stach, Nicolas; Durieux, Laurent; Faure, Jean-François; Mollicone, Danilo

    protocol rules for its overseas department. The latter estimates come from a sample of nearly 17,000 plots analyzed from the same spatial imagery acquired between 1990 and 2006. This sampling scheme is derived from the traditional forest inventory methods carried out by the IFN (Inventaire Forestier National). Our intensified global sampling scheme leads to an estimate of 96,650 ha deforested between 1990 and 2006, which is within the 95% confidence interval of the IFN sampling scheme, which gives an estimate of 91,722 ha, representing a relative difference from the IFN of 5.4%. These results demonstrate that the intensification of the global sampling scheme can provide forest area change estimates close to those achieved by official forest inventories (<6%), with precisions of between 4% and 7%, although we only estimate errors from sampling, not from the use of surrogate data. Such methods could be used by developing countries to demonstrate that they are fulfilling requirements for reducing emissions from deforestation in the framework of a REDD (Reducing Emissions from Deforestation in Developing Countries) mechanism under discussion within the United Nations Framework Convention on Climate Change (UNFCCC). Monitoring systems at national levels in tropical countries can also benefit from pan-tropical and regional observations, to ensure consistency between different national monitoring systems.
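    The relative difference quoted in the abstract can be reproduced directly from the two area estimates:

```python
# Area estimates quoted in the abstract (hectares deforested, 1990-2006).
intensified_ha = 96_650  # intensified global sampling scheme
ifn_ha = 91_722          # IFN forest-inventory scheme

# Relative difference of the global scheme from the IFN estimate.
rel_diff_pct = 100.0 * (intensified_ha - ifn_ha) / ifn_ha
print(round(rel_diff_pct, 1))  # → 5.4
```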

  11. Sampling designs matching species biology produce accurate and affordable abundance indices.

    Science.gov (United States)

    Harris, Grant; Farley, Sean; Russell, Gareth J; Butler, Matthew J; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on global positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions
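    As a minimal illustration of mark-recapture abundance estimation (not the authors' simulation framework), Chapman's bias-corrected Lincoln-Petersen estimator with hypothetical capture counts:

```python
# Hypothetical two-session capture counts for a closed population.
n1 = 30  # animals captured and marked in session 1
n2 = 25  # animals captured in session 2
m2 = 10  # marked animals among the session-2 captures

# Chapman's bias-corrected Lincoln-Petersen abundance estimate; it
# assumes every individual has an equal probability of capture, the
# very assumption the abstract discusses.
n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
print(round(n_hat, 1))  # → 72.3
```

    If some individuals never visit the sampled area (unequal capture probability), m2 shrinks and the estimate inflates, which is the bias both grid and targeted designs must guard against.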

  12. Sampling designs matching species biology produce accurate and affordable abundance indices

    Directory of Open Access Journals (Sweden)

    Grant Harris

    2013-12-01

    Full Text Available Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on global positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture

  13. Sampling designs matching species biology produce accurate and affordable abundance indices

    Science.gov (United States)

    Farley, Sean; Russell, Gareth J.; Butler, Matthew J.; Selinger, Jeff

    2013-01-01

    Wildlife biologists often use grid-based designs to sample animals and generate abundance estimates. Although sampling in grids is theoretically sound, in application, the method can be logistically difficult and expensive when sampling elusive species inhabiting extensive areas. These factors make it challenging to sample animals and meet the statistical assumption of all individuals having an equal probability of capture. Violating this assumption biases results. Does an alternative exist? Perhaps by sampling only where resources attract animals (i.e., targeted sampling), it would provide accurate abundance estimates more efficiently and affordably. However, biases from this approach would also arise if individuals have an unequal probability of capture, especially if some failed to visit the sampling area. Since most biological programs are resource limited, and acquiring abundance data drives many conservation and management applications, it becomes imperative to identify economical and informative sampling designs. Therefore, we evaluated abundance estimates generated from grid and targeted sampling designs using simulations based on global positioning system (GPS) data from 42 Alaskan brown bears (Ursus arctos). Migratory salmon drew brown bears from the wider landscape, concentrating them at anadromous streams. This provided a scenario for testing the targeted approach. Grid and targeted sampling varied by trap amount, location (traps placed randomly, systematically or by expert opinion), and traps stationary or moved between capture sessions. We began by identifying when to sample, and if bears had equal probability of capture. We compared abundance estimates against seven criteria: bias, precision, accuracy, effort, plus encounter rates, and probabilities of capture and recapture. One grid (49 km² cells) and one targeted configuration provided the most accurate results. Both placed traps by expert opinion and moved traps between capture sessions, which

  14. Sampling

    CERN Document Server

    Thompson, Steven K

    2012-01-01

    Praise for the Second Edition "This book has never had a competitor. It is the only book that takes a broad approach to sampling . . . any good personal statistics library should include a copy of this book." —Technometrics "Well-written . . . an excellent book on an important subject. Highly recommended." —Choice "An ideal reference for scientific researchers and other professionals who use sampling." —Zentralblatt Math Features new developments in the field combined with all aspects of obtaining, interpreting, and using sample data Sampling provides an up-to-date treat

  15. Design and implementation of visualization methods for the CHANGES Spatial Decision Support System

    Science.gov (United States)

    Cristal, Irina; van Westen, Cees; Bakker, Wim; Greiving, Stefan

    2014-05-01

    The CHANGES Spatial Decision Support System (SDSS) is a web-based system aimed at risk assessment and the evaluation of optimal risk reduction alternatives at the local level, serving as a decision support tool in long-term natural risk management. The SDSS uses multidimensional information, integrating thematic, spatial, temporal and documentary data. The role of visualization in this context becomes of vital importance for efficiently representing each dimension. This multidimensional aspect of the risk information required by the system, combined with the diversity of the end-users, calls for sophisticated visualization methods and tools. The key goal of the present work is to exploit efficiently the large amount of data in relation to the needs of the end-user, utilizing proper visualization techniques. Three main tasks have been accomplished for this purpose: categorization of the end-users, definition of the system's modules and data definition. The graphical representation of the data and the visualization tools were designed to be relevant to the data type and the purpose of the analysis. Depending on the end-user category, each user should have access to different modules of the system and thus to the proper visualization environment. The technologies used for the development of the visualization component combine the latest and most innovative open source JavaScript frameworks, such as OpenLayers 2.13.1, ExtJS 4 and GeoExt 2. Moreover, the model-view-controller (MVC) pattern is used in order to ensure flexibility of the system at the implementation level. Using the above technologies, the visualization techniques implemented so far offer interactive map navigation, querying and comparison tools. The map comparison tools are of great importance within the SDSS and include the following: a swiping tool for comparison of different data at the same location; raster subtraction for comparison of the same phenomena varying in time; and linked views for comparison

  16. Design of the driving system for visible near-infrared spatial programmable push-broom remote CCD sensor

    Science.gov (United States)

    Xu, Zhipeng; Wei, Jun; Zhou, Qianting; Weng, Dongshan; Li, Jianwei

    2010-11-01

    VNIR multi-spectral image sensors have wide applications in remote sensing and imaging spectroscopy. The image spectrometer of a spatial programmable push-broom remote sensing satellite requires a visible near-infrared band ranging from 0.4 μm to 1.04 μm, which is one of the most important bands in remote sensing. This paper introduces a method of designing the driving system for a 1024x1024 VNIR CCD sensor for programmable push-broom remote sensing. The digital driving signal is generated by an FPGA device. There are seven modules in the FPGA program and all the modules are coded in VHDL. The driving system has five main functions: driving the sensor according to the required timing schedule, controlling the A/D converter, receiving parameters via RS232 from the control platform, processing the data input from the A/D device, and outputting the processed data to a PCI sample card for display on the host computer. All of the above modules work successfully on an APA600 FPGA device. This paper also introduces several important considerations in designing the driving system, including module synchronization and critical path optimization.

  17. Multi-saline sample distillation apparatus for hydrogen isotope analyses : design and accuracy

    Science.gov (United States)

    Hassan, Afifa Afifi

    1981-01-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 °C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated. (USGS)

  18. Design/Operations review of core sampling trucks and associated equipment

    International Nuclear Information System (INIS)

    Shrivastava, H.P.

    1996-01-01

    A systematic review of the design and operations of the core sampling trucks was commissioned by Characterization Equipment Engineering of the Westinghouse Hanford Company in October 1995. The review team reviewed the design documents, specifications, operating procedure, training manuals and safety analysis reports. The review process, findings and corrective actions are summarized in this supporting document.

  19. MUP, CEC-DES, STRADE. Codes for uncertainty propagation, experimental design and stratified random sampling techniques

    International Nuclear Information System (INIS)

    Amendola, A.; Astolfi, M.; Lisanti, B.

    1983-01-01

    The report describes how to use the codes: MUP (Monte Carlo Uncertainty Propagation) for uncertainty analysis by Monte Carlo simulation, including correlation analysis, extreme value identification and study of selected ranges of the variable space; CEC-DES (Central Composite Design) for building experimental matrices according to the requirements of Central Composite and Factorial Experimental Designs; and STRADE (Stratified Random Design) for experimental designs based on Latin Hypercube Sampling techniques. Application fields of the codes are probabilistic risk assessment, experimental design, sensitivity analysis and system identification problems.
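    The Latin Hypercube Sampling technique underlying STRADE can be sketched in a few lines; this is a generic stdlib illustration, not the STRADE code itself.

```python
import random

def latin_hypercube(n, d, rng=random.Random(42)):
    """n samples in d dimensions on [0, 1): each dimension is split into
    n equal strata and each stratum is sampled exactly once, then the
    stratum order is shuffled independently per dimension."""
    cols = []
    for _ in range(d):
        strata = [(i + rng.random()) / n for i in range(n)]  # one draw per stratum
        rng.shuffle(strata)
        cols.append(strata)
    return [tuple(col[i] for col in cols) for i in range(n)]

pts = latin_hypercube(5, 2)
# Every stratum [k/5, (k+1)/5) is hit exactly once in each dimension.
for dim in range(2):
    assert sorted(int(5 * p[dim]) for p in pts) == [0, 1, 2, 3, 4]
```

    Compared with plain Monte Carlo, this stratification guarantees coverage of each marginal range with the same number of runs, which is why it suits experimental-design matrices.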

  20. Designing a socio-spatial need indicator for urban social services analysis and decision making. A case study

    Directory of Open Access Journals (Sweden)

    Antonio Morenos Jiménez

    2015-01-01

    Full Text Available Decision making on social services requires, as a first step, appraising human needs and their spatial distribution, a key issue that is particularly sensitive in less developed zones or during periods of economic crisis, when socio-spatial cohesion is strongly challenged. Various methods have been used for measuring social needs, given that these are diverse in nature and sometimes elusive. Incorporating the spatial dimension in this task involves an additional challenge, but the results add meaningful value for socio-spatial planning. With this concern in mind, this work tackles the problem of estimating the needs typically met by local social service centers (SSC). To this end, a novel statistical indicator for intra-urban zones is designed, incorporating in the formula the main components of the actual observed demand as well as per capita income, to take into account the relevant spatial equity principle. Using a geographic information system (GIS), the indicator for estimating SSC need has been experimentally obtained for two types of spatial units in the city of Madrid: municipal districts and small statistical areas, looking for complementary applied uses. The results reveal the intra-urban inequalities for these types of needs and may support public decision making on the spatial provision and location of this kind of social resource. In addition, a preliminary, statistically based examination of the indicator's potential and limitations is carried out for both types of spatial units.
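    Since the abstract does not reproduce the indicator's formula, the sketch below shows only the general idea of combining observed demand with per capita income so that need is weighted toward lower-income zones; all zone names and numbers are hypothetical.

```python
# Hypothetical intra-urban zones: observed SSC demand and per capita income.
zones = [
    {"name": "district_a", "demand": 120, "income_pc": 14000},
    {"name": "district_b", "demand": 80,  "income_pc": 28000},
]

ref_income = 20000.0  # illustrative reference income for the equity scaling
for z in zones:
    # Demand scaled up where income is below the reference (spatial equity).
    z["need"] = z["demand"] * (ref_income / z["income_pc"])

for z in zones:
    print(z["name"], round(z["need"], 1))
```

    The lower-income zone ends up with the higher need score even before its larger raw demand is considered, mirroring the equity principle the indicator is built around.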

  1. Spatial variation of electrode position in bioelectrochemical treatment system: Design consideration for azo dye remediation.

    Science.gov (United States)

    Yeruva, Dileep Kumar; Shanthi Sravan, J; Butti, Sai Kishore; Annie Modestra, J; Venkata Mohan, S

    2018-05-01

    In the present study, three bio-electrochemical treatment systems (BET) were designed with variations in cathode electrode placement [air exposed (BET1), partially submerged (BET2) and fully submerged (BET3)] to evaluate azo-dye based wastewater treatment at three dye loading concentrations (50, 250 and 500 mg L⁻¹). The highest dye decolorization (94.5 ± 0.4%) and COD removal (62.2 ± 0.8%) efficiencies were observed in BET3 (fully submerged electrodes) followed by BET1 and BET2, while bioelectrogenic activity was highest in BET1 followed by BET2 and BET3. It was observed that competition among electron acceptors (electrode, dye molecules and intermediates) critically regulated the outcome, making bio-electrogenesis higher in BET1 and dye removal higher in BET3. Maximum half-cell potentials in BET3 indicate higher electron acceptance by the electrodes utilized for dye degradation. The study infers that the spatial positioning of electrodes in BET3 is more suitable for dye remediation, which can be considered when scaling up or designing a treatment plant for large-scale industrial applications. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Brain-grounded theory of temporal and spatial design in architecture and the environment

    CERN Document Server

    Ando, Yoichi

    2016-01-01

    In this book, brain-grounded theory of temporal and spatial design in architecture and the environment is discussed. The author believes that it is a key to solving such global problems as environmental disorders and severe climate change as well as conflicts that are caused by the ill-conceived notion of “time is money”. There are three phases or aspects of a person’s life: the physical life, the spiritual or mental life, and the third stage of life, when a person moves from middle age into old age and can choose what he or she wishes to do instead of simply what must be done. This book describes the temporal design of the environment based on the theory of subjective preference, which could make it possible for an individual to realize a healthy life in all three phases. In his previously published work, the present author wrote that the theory of subjective preference has been established for the sound and visual fields based on neural evidence, and that subjective preference is an overall response o...

  3. Design for mosquito abundance, diversity, and phenology sampling within the National Ecological Observatory Network

    Science.gov (United States)

    Hoekman, D.; Springer, Yuri P.; Barker, C.M.; Barrera, R.; Blackmore, M.S.; Bradshaw, W.E.; Foley, D. H.; Ginsberg, Howard; Hayden, M. H.; Holzapfel, C. M.; Juliano, S. A.; Kramer, L. D.; LaDeau, S. L.; Livdahl, T. P.; Moore, C. G.; Nasci, R.S.; Reisen, W.K.; Savage, H. M.

    2016-01-01

    The National Ecological Observatory Network (NEON) intends to monitor mosquito populations across its broad geographical range of sites because of their prevalence in food webs, sensitivity to abiotic factors and relevance for human health. We describe the design of mosquito population sampling in the context of NEON's long-term, continental-scale monitoring program, emphasizing the sampling design schedule, priorities and collection methods. Freely available NEON data and associated field and laboratory samples will increase our understanding of how mosquito abundance, demography, diversity and phenology are responding to land use and climate change.

  4. Design and implementation of a risk assessment module in a spatial decision support system

    Science.gov (United States)

    Zhang, Kaixi; van Westen, Cees; Bakker, Wim

    2014-05-01

    The spatial decision support system named 'Changes SDSS' is currently under development. The goal of this system is to analyze changing hydro-meteorological hazards and the effect of risk reduction alternatives in order to support decision makers in choosing the best alternatives. The risk assessment module within the system is designed to assess the current risk, analyze the risk after implementation of risk reduction alternatives, and analyze the risk in different future years under scenarios such as climate change, land use change and population growth. The objective of this work is to present the detailed design and implementation plan of the risk assessment module. The main challenges faced consist of how to shift the risk assessment from traditional desktop software to an open source web-based platform, the availability of input data and the inclusion of uncertainties in the risk analysis. The risk assessment module is developed using the Ext JS library for the implementation of the user interface on the client side, using Python for scripting, as well as PostGIS spatial functions for complex computations on the server side. Comprehensive consideration of the underlying uncertainties in the input data can lead to a better quantification of risk and a more reliable Changes SDSS, since the outputs of the risk assessment module are the basis for the decision making module within the system. The implementation of this module will contribute to the development of open source web-based modules for multi-hazard risk assessment in the future. This work is part of the "CHANGES SDSS" project, funded by the European Community's 7th Framework Program.

  5. Spatial capture-recapture design and modelling for the study of small mammals.

    Directory of Open Access Journals (Sweden)

    Juan Romairone

    Full Text Available Spatial capture-recapture modelling (SCR) is a powerful analytical tool to estimate density and derive information on space use and behaviour of elusive animals. Yet, SCR has seldom been applied to the study of ecologically keystone small mammals. Here we highlight its potential and requirements with a case study on common voles (Microtus arvalis). First, we address mortality associated with live-trapping, which can be high in small mammals and must be kept minimal. We designed and tested a nest box coupled with a classic Sherman trap and show that it allows a 5-fold reduction of mortality in traps. Second, we address the need to adjust the trapping grid to the individual home range to maximize spatial recaptures. In May-June 2016, we captured and tagged with transponders 227 voles in a 1.2-ha area during two monthly sessions. Using a Bayesian SCR with a multinomial approach, we estimated: (1) the baseline detection rate, and investigated variation according to sex, time or behaviour (aversion/attraction after a previous capture); (2) the parameter sigma that describes how detection probability declines as a function of the distance to an individual's activity centre, and investigated variation according to sex; and (3) density and population sex ratio. We show that reducing the maximum distance between traps from 12 to 9.6 m doubled spatial recaptures and improved model predictions. The baseline detection rate increased over time (after overcoming a likely aversion to entering new odourless traps) and was greater for females than males in June. The sigma parameter of males was twice that of females, indicating larger home ranges. Density estimates were 142.92±38.50 and 168.25±15.79 voles/ha in May and June, respectively, with 2-3 times more females than males. We highlight the potential and broad applicability that SCR offers and provide specific recommendations for using it to study small mammals like voles.
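    The role of the sigma parameter can be sketched with the standard half-normal SCR detection function; the p0 and sigma values below are illustrative, not the study's estimates.

```python
import math

def p_detect(d, p0=0.3, sigma=10.0):
    """Half-normal SCR detection: baseline rate p0 declining with the
    distance d (metres) from a trap to the animal's activity centre."""
    return p0 * math.exp(-d * d / (2.0 * sigma * sigma))

# Doubling sigma (as for males versus females in the study) widens the
# detection kernel, so the same trap-to-centre distance yields a higher
# detection probability.
female = p_detect(15.0, sigma=10.0)
male = p_detect(15.0, sigma=20.0)
print(round(female, 4), round(male, 4))
```

    This is also why trap spacing matters: traps spaced much wider than sigma yield few spatial recaptures, the quantity SCR needs to separate density from movement.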

  6. Baseline Design Compliance Matrix for the Rotary Mode Core Sampling System

    International Nuclear Information System (INIS)

    LECHELT, J.A.

    2000-01-01

    The purpose of the design compliance matrix (DCM) is to provide a single-source document of all design requirements associated with the fifteen subsystems that make up the rotary mode core sampling (RMCS) system. It is intended to be the baseline requirement document for the RMCS system and to be used in governing all future design and design verification activities associated with it. This document is the DCM for the RMCS system used on Hanford single-shell radioactive waste storage tanks. This includes the Exhauster System, Rotary Mode Core Sample Trucks, Universal Sampling System, Diesel Generator System, Distribution Trailer, X-Ray Cart System, Breathing Air Compressor, Nitrogen Supply Trailer, Casks and Cask Truck, Service Trailer, Core Sampling Riser Equipment, Core Sampling Support Trucks, Foot Clamp, Ramps and Platforms and Purged Camera System. Excluded items are tools such as light plants and light stands. Other items such as the breather inlet filter are covered by a different design baseline. In this case, the inlet breather filter is covered by the Tank Farms Design Compliance Matrix

  7. Practical iterative learning control with frequency domain design and sampled data implementation

    CERN Document Server

    Wang, Danwei; Zhang, Bin

    2014-01-01

    This book is on iterative learning control (ILC) with a focus on design and implementation. We approach ILC design based on frequency domain analysis and address ILC implementation based on sampled data methods. This is the first book on ILC from the frequency domain and sampled data methodologies. The frequency domain design methods offer ILC users insights into the convergence performance, which is of practical benefit. This book presents a comprehensive framework with various methodologies to ensure that the learnable bandwidth in the ILC system is set with a balance between learning performance and learning stability. The sampled data implementation ensures effective execution of ILC in practical dynamic systems. The presented sampled data ILC methods also ensure the balance of performance and stability of the learning process. Furthermore, the presented theories and methodologies are tested with an ILC-controlled robotic system. The experimental results show that the machines can work in much h
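    The trial-to-trial ILC update can be sketched on a toy first-order sampled plant; the plant, gain, and reference below are illustrative and not from the book.

```python
# P-type ILC u_{k+1}(t) = u_k(t) + gamma * e_k(t) on a sampled plant
# x(t+1) = a*x(t) + b*u(t), y(t) = x(t+1). With gamma = 1/b the
# trial-to-trial error map is nilpotent, so tracking error vanishes
# within T trials (a deadbeat special case of ILC convergence).
a, b, gamma = 0.5, 1.0, 1.0
T = 10
ref = [1.0] * T  # reference trajectory to be tracked perfectly over trials

def run(u):
    """Simulate one trial and return the output trajectory."""
    x, y = 0.0, []
    for t in range(T):
        x = a * x + b * u[t]
        y.append(x)
    return y

u = [0.0] * T
errs = []
for k in range(12):  # learning iterations (repeated trials)
    e = [r - yi for r, yi in zip(ref, run(u))]
    errs.append(max(abs(v) for v in e))
    u = [u[t] + gamma * e[t] for t in range(T)]  # ILC input update

print(errs[0], errs[-1])  # error contracts to (numerically) zero
```

    The contraction condition generalizes: in the frequency domain, learning converges on the band where the learning filter keeps the trial-to-trial error gain below one, which is the "learnable bandwidth" the abstract refers to.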

  8. Testing the Birth Unit Design Spatial Evaluation Tool (BUDSET) in Australia: a pilot study.

    Science.gov (United States)

    Foureur, Maralyn J; Leap, Nicky; Davis, Deborah L; Forbes, Ian F; Homer, Caroline E S

    2011-01-01

    To pilot test the Birth Unit Design Spatial Evaluation Tool (BUDSET) in an Australian maternity care setting to determine whether such an instrument can measure the optimality of different birth settings. Optimally designed spaces to give birth are likely to influence a woman's ability to experience physiologically normal labor and birth. This is important in the current industrialized environment, where increased caesarean section rates are causing concern. Measuring an optimal birth space is currently impossible, because few tools are available. A quantitative study was undertaken to pilot test the discriminant ability of the BUDSET in eight maternity units in New South Wales, Australia. Five auditors trained in the use of the BUDSET assessed the birth units using the instrument, which is based on 18 design principles and is divided into four domains (Fear Cascade, Facility, Aesthetics, and Support) with three to eight assessable items in each. Data were independently collected in eight birth units. Values for each of the domains were aggregated to provide an overall Optimality Score for each birth unit. Optimality Scores ranged from 51 to 77 out of a possible 100 points. The BUDSET identified units with low-scoring domains; essentially, these were older units and conventional labor ward settings. The BUDSET provides a way to assess the optimality of birth units and determine which domain areas may need improvement. There is potential for improvement to existing birth spaces, and considerable improvement can be made with simple low-cost modifications. Further research is needed to validate the tool.

  9. Bionic Design for Mars Sampling Scoop Inspired by Himalayan Marmot Claw

    Directory of Open Access Journals (Sweden)

    Long Xue

    2016-01-01

    Full Text Available Cave animals are often adapted to digging and life underground, with clawed toes similar in structure and function to a sampling scoop. In this paper, the clawed toes of the Himalayan marmot were selected as a biological prototype for bionic research. Based on geometric parameter optimization of the clawed toes, a bionic sampling scoop for use on Mars was designed. Using a 3D laser scanner, the point cloud data of the second front claw toe were acquired. Parametric equations and contour curves for the claw were then built with cubic polynomial fitting, yielding 18 characteristic curve equations for the internal and external contours of the claw. A bionic sampling scoop was designed according to the structural parameters of Curiosity's sampling shovel and the contours of the Himalayan marmot's claw. Verification tests showed that when the penetration angle was 45° and the sampling speed was 0.33 r/min, the bionic sampling scoop's resistance torque was 49.6% less than that of the prototype sampling scoop. When the penetration angle was 60° and the sampling speed was 0.22 r/min, the resistance torque of the bionic sampling scoop was 28.8% lower than that of the prototype sampling scoop.

  10. Sample size reassessment for a two-stage design controlling the false discovery rate.

    Science.gov (United States)

    Zehetmayer, Sonja; Graf, Alexandra C; Posch, Martin

    2015-11-01

    Sample size calculations for gene expression microarray and NGS-RNA-Seq experiments are challenging because the overall power depends on unknown quantities such as the proportion of true null hypotheses and the distribution of the effect sizes under the alternative. We propose a two-stage design with an adaptive interim analysis where these quantities are estimated from the interim data. The second-stage sample size is chosen based on these estimates to achieve a specified overall power. The proposed procedure controls the power in all considered scenarios except for very low first-stage sample sizes. The false discovery rate (FDR) is controlled despite the data-dependent choice of sample size. The two-stage design can be a useful tool for determining the sample size of high-dimensional studies when, in the planning phase, there is high uncertainty regarding the expected effect sizes and variability.
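For readers unfamiliar with FDR control, the standard Benjamini-Hochberg step-up procedure that such designs build on can be sketched as follows (the p-values are illustrative, not from the study):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean
    rejection mask controlling the FDR at level q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # find the largest k with p_(k) <= (k/m) * q
    below = ranked <= (np.arange(1, m + 1) / m) * q
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # index of largest passing rank
        reject[order[: k + 1]] = True      # reject all hypotheses up to rank k
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals, q=0.05))   # rejects the two smallest p-values
```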

  11. Influence of management of variables, sampling zones and land units on LR analysis for landslide spatial prevision

    Directory of Open Access Journals (Sweden)

    R. Greco

    2013-09-01

    Full Text Available Several authors, following different methodological approaches, have employed logistic regression (LR), a multivariate statistical analysis used to assess the spatial probability of landslides, even though its fundamental principles have remained unaltered. This study aims at assessing the influence of some of these methodological choices on the performance of LR, through a series of sensitivity analyses developed over a test area of about 300 km2 in Calabria (southern Italy). In particular, four types of sampling (1 – the whole study area; 2 – transects running parallel to the general slope direction of the study area, with a total surface of about 1/3 of the whole study area; 3 – buffers surrounding the phenomena with a 1/1 ratio between the stable and the unstable area; 4 – buffers surrounding the phenomena with a 1/2 ratio between the stable and the unstable area), two variable coding modes (1 – grouped variables; 2 – binary variables), and two types of elementary land units (1 – cells; 2 – slope units) have been tested. The obtained results must be considered statistically relevant in all cases (Aroc values > 70%), thus confirming the soundness of the LR analysis, which maintains high predictive capacity notwithstanding the features of the input data. As for the area under investigation, the best performing methodological choices are the following: (i) transects produced the best results (0 ≤ P(y) ≤ 93.4%; Aroc = 79.5%); (ii) as for coding modalities, binary variables (0 ≤ P(y) ≤ 98.3%; Aroc = 80.7%) provide better performance than grouped variables; (iii) as for the choice of elementary land units, slope units (0 ≤ P(y) ≤ 100%; Aroc = 84.2%) obtained better results than cells.
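The Aroc (area under the ROC curve) values used above to compare sampling strategies can be computed from model scores via the rank-sum identity; a minimal sketch with toy data (not the study's):

```python
import numpy as np

def auc_roc(y_true, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    AUC = P(score_pos > score_neg), counting ties as 1/2."""
    y = np.asarray(y_true)
    s = np.asarray(scores, dtype=float)
    pos, neg = s[y == 1], s[y == 0]
    # explicit pairwise comparison; fine for the small samples of a sketch
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# toy data: landslide presence (1) vs absence (0) and LR-predicted probabilities
y = [1, 1, 1, 0, 0, 0, 0]
p = [0.9, 0.7, 0.4, 0.5, 0.3, 0.2, 0.1]
print(auc_roc(y, p))  # 11/12, well above the 70% relevance threshold cited above
```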

  12. Design and study of a coplanar grid array CdZnTe detector for improved spatial resolution

    International Nuclear Information System (INIS)

    Ma, Yuedong; Xiao, Shali; Yang, Guoqiang; Zhang, Liuqiang

    2014-01-01

    Coplanar grid (CPG) CdZnTe detectors have been used as gamma-ray spectrometers for years. Compared with pixelated CdZnTe detectors, CPG CdZnTe detectors have either no or poor spatial resolution, which directly limits their use in imaging applications. To address the issue, a 2×2 CPG array CdZnTe detector with dimensions of 7×7×5 mm³ was fabricated. Each of the CPG pairs in the detector was moderately shrunk in size and precisely designed to improve the spatial resolution while maintaining good energy resolution, accounting for the charge loss at the surface between the strips of each CPG pair. Preliminary measurements demonstrated an energy resolution of 2.7–3.9% for the four CPG pairs using 662 keV gamma rays, and a spatial resolution of 3.3 mm, the best spatial resolution ever achieved for CPG CdZnTe detectors. The results reveal that the CPG CdZnTe detector can also be applied to imaging applications at a substantially higher spatial resolution. - Highlights: • A novel structure of coplanar grid CdZnTe detector was designed to evaluate the possibility of applying the detector to gamma-ray imaging applications. • The best spatial resolution of coplanar grid CdZnTe detectors ever reported has been achieved, along with good spectroscopic performance. • Depth correction of the energy spectra using a new algorithm is presented

  13. Planning Considerations for a Mars Sample Receiving Facility: Summary and Interpretation of Three Design Studies

    Science.gov (United States)

    Beaty, David W.; Allen, Carlton C.; Bass, Deborah S.; Buxbaum, Karen L.; Campbell, James K.; Lindstrom, David J.; Miller, Sylvia L.; Papanastassiou, Dimitri A.

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  14. Adaptive designs for the one-sample log-rank test.

    Science.gov (United States)

    Schmidt, Rene; Faldum, Andreas; Kwiecien, Robert

    2017-09-22

    Traditional designs in phase IIa cancer trials are single-arm designs with a binary outcome, for example, tumor response. In some settings, however, a time-to-event endpoint might appear more appropriate, particularly in the presence of loss to follow-up. Then the one-sample log-rank test might be the method of choice. It allows comparison of the survival curve of the patients under treatment to a prespecified reference survival curve. The reference curve usually represents the expected survival under standard of care. In this work, convergence of the one-sample log-rank statistic to Brownian motion is proven using Rebolledo's martingale central limit theorem while accounting for staggered entry times of the patients. On this basis, a confirmatory adaptive one-sample log-rank test is proposed in which provision is made for data-dependent sample size reassessment. The focus is on applying the inverse normal method, which is done in two different directions. The first strategy exploits the independent increments property of the one-sample log-rank statistic. The second strategy is based on the patient-wise separation principle. Simulation shows that the proposed adaptive test can help to rescue an underpowered trial while lowering the average sample number (ASN) under the null hypothesis compared with a single-stage fixed sample design. © 2017, The International Biometric Society.
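The inverse normal method mentioned above combines independent stage-wise p-values with prespecified weights; a minimal sketch (illustrative p-values and equal weights assumed):

```python
from math import sqrt
from statistics import NormalDist

def inverse_normal_combination(p1, p2, w1=0.5, w2=0.5):
    """Inverse normal method sketch: combine independent one-sided
    stage-wise p-values with prespecified weights (w1 + w2 = 1)."""
    nd = NormalDist()
    # transform each p-value to a z-score, then combine with sqrt-weights
    z = sqrt(w1) * nd.inv_cdf(1 - p1) + sqrt(w2) * nd.inv_cdf(1 - p2)
    return 1 - nd.cdf(z)  # combined one-sided p-value

# hypothetical stage-wise p-values: suggestive stage 1, stronger stage 2
print(inverse_normal_combination(0.10, 0.02))
```

The combined p-value is valid even when the second-stage sample size was chosen after looking at the first-stage data, which is what makes the adaptive reassessment confirmatory.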

  15. Fixed-location hydroacoustic monitoring designs for estimating fish passage using stratified random and systematic sampling

    International Nuclear Information System (INIS)

    Skalski, J.R.; Hoffman, A.; Ransom, B.H.; Steig, T.W.

    1993-01-01

    Five alternate sampling designs are compared using 15 d of 24-h continuous hydroacoustic data to identify the most favorable approach to fixed-location hydroacoustic monitoring of salmonid outmigrants. Four alternative approaches to systematic sampling are compared among themselves and with stratified random sampling (STRS). Stratifying systematic sampling (STSYS) on a daily basis is found to reduce sampling error in multiday monitoring studies. Although sampling precision was predictable with varying levels of effort in STRS, neither the magnitude nor the direction of change in precision was predictable when effort was varied in systematic sampling (SYS). Furthermore, modifying systematic sampling to include replicated (e.g., nested) sampling (RSYS) is shown to provide unbiased point and variance estimates, as does STRS. Numerous short sampling intervals (e.g., 12 samples of 1-min duration per hour) must be monitored hourly using RSYS to provide efficient, unbiased point and interval estimates. For equal levels of effort, STRS outperformed all variations of SYS examined. Parametric approaches to confidence interval estimation are found to be superior to nonparametric interval estimates (i.e., bootstrap and jackknife) in estimating total fish passage. 10 refs., 1 fig., 8 tabs
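A stratified random sampling (STRS) estimator of total passage, with hours as strata and minutes as sampling units, can be sketched as follows (simulated per-minute counts, not the study's data):

```python
import random
import statistics

def strs_estimate(counts_by_hour, n_per_hour, seed=0):
    """STRS sketch: each hour is a stratum of 60 one-minute counts; sample
    n_per_hour minutes per hour without replacement and expand to an
    unbiased estimate of total passage, with a design-based variance."""
    rng = random.Random(seed)
    total_hat, var_hat = 0.0, 0.0
    for minutes in counts_by_hour:          # minutes: true per-minute counts
        N = len(minutes)
        sample = rng.sample(minutes, n_per_hour)
        mean = statistics.fmean(sample)
        s2 = statistics.variance(sample)
        total_hat += N * mean               # expand stratum mean to stratum total
        var_hat += N * (N - n_per_hour) * s2 / n_per_hour  # with FPC
    return total_hat, var_hat

# toy data: 3 hours of simulated per-minute fish counts
hours = [[5] * 30 + [10] * 30, [2] * 60, [0] * 30 + [20] * 30]
est, var = strs_estimate(hours, n_per_hour=12)
print(est, var)
```

Hours with constant passage (the second stratum) contribute zero variance, which is exactly why stratification by hour pays off when passage rates vary over the day.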

  16. Physicochemical and nanotechnological approaches to the design of 'rigid' spatial structures of DNA

    International Nuclear Information System (INIS)

    Yevdokimov, Yu M; Salyanov, V I; Skuridin, S G; Shtykova, E V; Khlebtsov, N G; Kats, E I

    2015-01-01

    This review focuses on physicochemical and nanotechnological approaches to the design of 'rigid' particles based on double-stranded DNA molecules. The physicochemical methods imply cross-linking of adjacent DNA molecules ordered in quasinematic layers of liquid-crystalline dispersion particles by synthetic nanobridges consisting of alternating molecules of an antibiotic (daunomycin) and divalent copper ions, as well as cross-linking of these molecules as a result of their salting-out in quasinematic layers of liquid-crystalline dispersion particles under the action of lanthanide cations. The nanotechnological approach is based on the insertion of gold nanoparticles into the free space between double-stranded DNA molecules that form quasinematic layers of liquid-crystalline dispersion particles. This gives rise to extended clusters of gold nanoparticles and is accompanied by an enhancement of the interaction between the DNA molecules through gold nanoparticles and by a decrease in the solubility of dispersion particles. These approaches produce integrated 'rigid' DNA-containing spatial structures, which are incompatible with the initial aqueous polymeric solutions and have unique properties. The bibliography includes 116 references

  17. Design, analysis, and interpretation of field quality-control data for water-sampling projects

    Science.gov (United States)

    Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.

    2015-01-01

    The process of obtaining and analyzing water samples from the environment includes a number of steps that can affect the reported result. The equipment used to collect and filter samples, the bottles used for specific subsamples, any added preservatives, sample storage in the field, and shipment to the laboratory have the potential to affect how accurately samples represent the environment from which they were collected. During the early 1990s, the U.S. Geological Survey implemented policies to include the routine collection of quality-control samples in order to evaluate these effects and to ensure that water-quality data were adequately representing environmental conditions. Since that time, the U.S. Geological Survey Office of Water Quality has provided training in how to design effective field quality-control sampling programs and how to evaluate the resultant quality-control data. This report documents that training material and provides a reference for methods used to analyze quality-control data.

  18. Implications of sensor design for coral reef detection: Upscaling ground hyperspectral imagery in spatial and spectral scales

    Science.gov (United States)

    Caras, Tamir; Hedley, John; Karnieli, Arnon

    2017-12-01

    Remote sensing offers a potential tool for large-scale environmental surveying and monitoring. However, remote observations of coral reefs are difficult, especially because of the spatial and spectral complexity of the target relative to sensor specifications, as well as the environmental effects of the water medium above. The development of sensors is driven by technological advances and the desired products. Currently, spaceborne systems are technologically limited to a choice between high spectral resolution and high spatial resolution, but not both. The current study explores the dilemma of whether future sensor design for marine monitoring should prioritize improving spatial or spectral resolution. To address this question, a spatially and spectrally resampled ground-level hyperspectral image was used to test two classification elements: (1) how the tradeoff between spatial and spectral resolution affects classification; and (2) how noise reduction by a majority filter might improve classification accuracy. The studied reef, in the Gulf of Aqaba (Eilat), Israel, is heterogeneous and complex, so the local substrate patches are generally finer than currently available imagery. Therefore, the tested spatial resolution was broadly divided into four scale categories, from five millimeters to one meter. Spectral resolution resampling aimed to mimic currently available and forthcoming spaceborne sensors such as (1) the Environmental Mapping and Analysis Program (EnMAP), characterized by 25 bands of 6.5 nm width; (2) VENμS, with 12 narrow bands; and (3) the WorldView series, with broadband multispectral resolution. Results suggest that spatial resolution should generally be prioritized for coral reef classification, because the finest spatial scale tested performed best. While the focus of this study was on the technologically limited spaceborne design, aerial sensors may presently provide an opportunity to implement the suggested setup.
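The spatial and spectral resampling described above amounts to block-averaging a hyperspectral cube along its pixel and band axes; a minimal sketch on a random toy cube (dimensions are illustrative):

```python
import numpy as np

def downsample(img, factor):
    """Spatial resampling sketch: average non-overlapping factor x factor
    blocks of pixels, mimicking a coarser sensor viewing the same scene."""
    h, w, b = img.shape
    h2, w2 = h // factor, w // factor
    img = img[: h2 * factor, : w2 * factor]        # crop to a multiple of factor
    return img.reshape(h2, factor, w2, factor, b).mean(axis=(1, 3))

def band_average(img, group):
    """Spectral resampling sketch: average groups of adjacent narrow bands
    into fewer, wider bands (e.g. hyperspectral -> multispectral)."""
    h, w, b = img.shape
    b2 = b // group
    img = img[:, :, : b2 * group]
    return img.reshape(h, w, b2, group).mean(axis=3)

cube = np.random.rand(100, 100, 24)   # toy hyperspectral cube
print(downsample(cube, 4).shape)      # coarser pixels, same bands
print(band_average(cube, 6).shape)    # same pixels, broader bands
```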

  19. Designing a two-rank acceptance sampling plan for quality inspection of geospatial data products

    Science.gov (United States)

    Tong, Xiaohua; Wang, Zhenhua; Xie, Huan; Liang, Dan; Jiang, Zuoqin; Li, Jinchao; Li, Jun

    2011-10-01

    To address the disadvantages of classical sampling plans designed for traditional industrial products, we propose a novel two-rank acceptance sampling plan (TRASP) for the inspection of geospatial data outputs based on the acceptance quality level (AQL). The first-rank sampling plan inspects the lot consisting of map sheets, and the second inspects the lot consisting of features in an individual map sheet. The TRASP design is formulated as an optimization problem with respect to sample size and acceptance number, covering two lot-size cases. The first case is a small lot, with nonconformities modeled by a hypergeometric distribution function, and the second is a larger lot, with nonconformities modeled by a Poisson distribution function. The proposed TRASP is illustrated through two empirical case studies. Our analysis demonstrates that: (1) the proposed TRASP provides a general approach for quality inspection of geospatial data outputs consisting of non-uniform items and (2) the proposed acceptance sampling plan based on TRASP performs better than other classical sampling plans. It overcomes the drawbacks of percent sampling, i.e., "strictness for large lot size, toleration for small lot size," and those of a national standard used specifically for industrial outputs, i.e., "lots with different sizes corresponding to the same sampling plan."
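The two lot-size cases map to two acceptance-probability models; a sketch of both for a single sampling plan (n, c), with illustrative numbers (not the paper's):

```python
from math import comb, exp, factorial

def pa_hypergeometric(N, D, n, c):
    """P(accept) for plan (n, c) on a small lot: lot size N containing D
    nonconforming items; accept if at most c are found in the sample."""
    return sum(comb(D, k) * comb(N - D, n - k) for k in range(c + 1)) / comb(N, n)

def pa_poisson(n, p, c):
    """Poisson approximation used for large lots: mean nonconformities n*p."""
    lam = n * p
    return sum(exp(-lam) * lam ** k / factorial(k) for k in range(c + 1))

# illustrative: lot of 500 map sheets, 2% nonconforming, sample 50, accept on <= 2
print(pa_hypergeometric(500, 10, 50, 2))
print(pa_poisson(50, 0.02, 2))
```

The two models agree closely here; the hypergeometric form matters when the sample is a substantial fraction of a small lot.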

  20. Biased representation of disturbance rates in the roadside sampling frame in boreal forests: implications for monitoring design

    Directory of Open Access Journals (Sweden)

    Steven L. Van Wilgenburg

    2015-12-01

    Full Text Available The North American Breeding Bird Survey (BBS) is the principal source of data to inform researchers about the status of and trends for boreal forest birds. Unfortunately, little BBS coverage is available in the boreal forest, where increasing concern over the status of species breeding there has increased interest in northward expansion of the BBS. However, high disturbance rates in the boreal forest may complicate roadside monitoring. If the roadside sampling frame does not capture variation in disturbance rates, because of either road placement or the use of roads for resource extraction, biased trend estimates might result. In this study, we examined roadside bias in the proportional representation of habitat disturbance via spatial data on forest "loss," forest fires, and anthropogenic disturbance. For each of 455 BBS routes, the area disturbed within multiple buffers away from the road was calculated and compared against the area disturbed in degree blocks and BBS strata. We found a nonlinear relationship between bias and distance from the road, suggesting forest loss and forest fires were underrepresented below 75 and 100 m, respectively. In contrast, anthropogenic disturbance was overrepresented at distances below 500 m and underrepresented thereafter. After accounting for distance from road, BBS routes were reasonably representative of the degree blocks they were within, with only a few strata showing biased representation. In general, anthropogenic disturbance is overrepresented in southern strata, and forest fires are underrepresented in almost all strata. Similar biases exist when comparing the entire road network and the subset sampled by BBS routes against the amount of disturbance within BBS strata; however, the magnitude of the biases differed.
Based on our results, we recommend that spatial stratification and rotating panel designs be used to spread limited BBS and off-road sampling effort in an unbiased fashion and that new BBS routes

  1. A design-based approximation to the Bayes Information Criterion in finite population sampling

    Directory of Open Access Journals (Sweden)

    Enrico Fabrizi

    2014-05-01

    Full Text Available In this article, various issues related to the implementation of the usual Bayesian Information Criterion (BIC) are critically examined in the context of modelling a finite population. A suitable design-based approximation to the BIC is proposed in order to avoid deriving the exact likelihood of the sample, which is often very complex in finite population sampling. The approximation is justified using a theoretical argument and a Monte Carlo simulation study.
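For reference, the standard BIC that the design-based approximation targets is straightforward to compute once a (possibly approximate) log-likelihood is available; a minimal sketch with hypothetical fits:

```python
from math import log

def bic(log_likelihood, k, n):
    """Standard BIC: k * ln(n) - 2 * ln(L-hat). The article's design-based
    variant replaces the exact sample likelihood (hard to derive under
    complex survey designs) with an approximation, but the comparison
    rule is the same: the smaller BIC wins."""
    return k * log(n) - 2.0 * log_likelihood

# two hypothetical candidate models fit to n = 200 sampled units:
# the simpler model's small loss in fit is outweighed by its parameter penalty
print(bic(-520.4, k=3, n=200) < bic(-518.9, k=6, n=200))  # True
```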

  2. Software documentation and user's manual for fish-impingement sampling design and estimation method computer programs

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-11-01

    This report contains a description of three computer programs that implement the theory of sampling designs and the methods for estimating fish-impingement at the cooling-water intakes of nuclear power plants as described in companion report ANL/ES-60. Complete FORTRAN listings of these programs, named SAMPLE, ESTIMA, and SIZECO, are given and augmented with examples of how they are used

  3. Gas and liquid sampling for closed canisters in K-West basins - functional design criteria

    International Nuclear Information System (INIS)

    Pitkoff, C.C.

    1994-01-01

    The purpose of this document is to provide functions and requirements for the design and fabrication of equipment for sampling closed canisters in the K-West basin. The samples will be used to help determine the state of the fuel elements in closed canisters. The characterization information obtained will support evaluation and development of processes required for safe storage and disposition of Spent Nuclear Fuel (SNF) materials

  4. Design of an automatic sample changer for the measurement of neutron flux by gamma spectrometry

    International Nuclear Information System (INIS)

    Gago, Javier; Bruna, Ruben; Baltuano, Oscar; Montoya, Eduardo; Descreaux, Killian

    2014-01-01

    This paper presents the calculations, selection, and component design for the construction of an automatic system to measure neutron flux in a working nuclear reactor by gamma spectrometry, using samples irradiated in the RP-10 core. This system will perform measurements by interchanging 100 samples in a programmed, automatic way, reducing the user's operation time and obtaining more accurate measurements. (authors)

  5. Toolbox for super-structured and super-structure free multi-disciplinary building spatial design optimisation

    NARCIS (Netherlands)

    Boonstra, S.; van der Blom, K.; Hofmeyer, H.; Emmerich, M.T.M.; van Schijndel, A.W.M.; de Wilde, P.

    2018-01-01

    Multi-disciplinary optimisation of building spatial designs is characterised by large solution spaces. Here two approaches are introduced, one being super-structured and the other super-structure free. Both are different in nature and perform differently for large solution spaces and each requires

  6. Sampling design for long-term regional trends in marine rocky intertidal communities

    Science.gov (United States)

    Irvine, Gail V.; Shelley, Alice

    2013-01-01

    Probability-based designs reduce bias and allow inference of results to the pool of sites from which they were chosen. We developed and tested probability-based designs for monitoring marine rocky intertidal assemblages at Glacier Bay National Park and Preserve (GLBA), Alaska. A multilevel design was used that varied in scale and inference. The levels included aerial surveys, extensive sampling of 25 sites, and more intensive sampling of 6 sites. Aerial surveys of a subset of intertidal habitat indicated that the original target habitat of bedrock-dominated sites with slope ≤30° was rare. This unexpected finding illustrated one value of probability-based surveys and led to a shift in the target habitat type to include steeper, more mixed rocky habitat. Subsequently, we evaluated the statistical power of different sampling methods and sampling strategies to detect changes in the abundances of the predominant sessile intertidal taxa: barnacles Balanomorpha, the mussel Mytilus trossulus, and the rockweed Fucus distichus subsp. evanescens. There was greatest power to detect trends in Mytilus and lesser power for barnacles and Fucus. Because of its greater power, the extensive, coarse-grained sampling scheme was adopted in subsequent years over the intensive, fine-grained scheme. The sampling attributes that had the largest effects on power included sampling of “vertical” line transects (vs. horizontal line transects or quadrats) and increasing the number of sites. We also evaluated the power of several management-set parameters. Given equal sampling effort, sampling more sites fewer times had greater power. The information gained through intertidal monitoring is likely to be useful in assessing changes due to climate, including ocean acidification; invasive species; trampling effects; and oil spills.

  7. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Science.gov (United States)

    Deitchler, Megan; Deconinck, Hedwig; Bergeron, Gilles

    2008-01-01

    The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 design for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data collection in emergency
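The precision differences among the three designs follow from the design effect, DEFF = 1 + (m − 1)·ICC, where m is the number of observations per cluster; a sketch of 95% CI half-widths under assumed prevalence and ICC values (illustrative, not the study's estimates):

```python
from math import sqrt

def ci_halfwidth(p, clusters, per_cluster, icc):
    """95% CI half-width for a prevalence p under cluster sampling:
    inflate the binomial variance by the design effect
    DEFF = 1 + (m - 1) * ICC, where m is observations per cluster."""
    n = clusters * per_cluster
    deff = 1.0 + (per_cluster - 1) * icc
    return 1.96 * sqrt(deff * p * (1.0 - p) / n)

# hypothetical 10% prevalence and ICC = 0.1 (high-ICC indicators such as
# immunization coverage would penalize large clusters even more)
for label, c, m in [("30 x 30", 30, 30), ("33 x 6", 33, 6), ("67 x 3", 67, 3)]:
    print(label, round(ci_halfwidth(0.10, c, m, icc=0.1), 4))
```

With these assumptions, the 30 × 30 design still yields the narrowest interval because its much larger total sample outweighs its larger design effect, mirroring the study's finding for most child-level indicators.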

  8. Precision, time, and cost: a comparison of three sampling designs in an emergency setting

    Directory of Open Access Journals (Sweden)

    Deconinck Hedwig

    2008-05-01

    Full Text Available Abstract The conventional method to collect data on the health, nutrition, and food security status of a population affected by an emergency is a 30 × 30 cluster survey. This sampling method can be time and resource intensive and, accordingly, may not be the most appropriate one when data are needed rapidly for decision making. In this study, we compare the precision, time and cost of the 30 × 30 cluster survey with two alternative sampling designs: a 33 × 6 cluster design (33 clusters, 6 observations per cluster) and a 67 × 3 cluster design (67 clusters, 3 observations per cluster). Data for each sampling design were collected concurrently in West Darfur, Sudan in September-October 2005 in an emergency setting. Results of the study show the 30 × 30 design to provide more precise results (i.e. narrower 95% confidence intervals) than the 33 × 6 and 67 × 3 design for most child-level indicators. Exceptions are indicators of immunization and vitamin A capsule supplementation coverage which show a high intra-cluster correlation. Although the 33 × 6 and 67 × 3 designs provide wider confidence intervals than the 30 × 30 design for child anthropometric indicators, the 33 × 6 and 67 × 3 designs provide the opportunity to conduct a LQAS hypothesis test to detect whether or not a critical threshold of global acute malnutrition prevalence has been exceeded, whereas the 30 × 30 design does not. For the household-level indicators tested in this study, the 67 × 3 design provides the most precise results. However, our results show that neither the 33 × 6 nor the 67 × 3 design are appropriate for assessing indicators of mortality. In this field application, data collection for the 33 × 6 and 67 × 3 designs required substantially less time and cost than that required for the 30 × 30 design. The findings of this study suggest the 33 × 6 and 67 × 3 designs can provide useful time- and resource-saving alternatives to the 30 × 30 method of data

  9. Economic Design of Acceptance Sampling Plans in a Two-Stage Supply Chain

    Directory of Open Access Journals (Sweden)

    Lie-Fern Hsu

    2012-01-01

    Full Text Available Supply chain management, which is concerned with material and information flows between facilities and the final customers, is nowadays considered the most popular operations strategy for improving organizational competitiveness. With the advanced development of computer technology, it is getting easier to derive an acceptance sampling plan satisfying both the producer's and consumer's quality and risk requirements. However, all the available QC tables and computer software determine the sampling plan on a noneconomic basis. In this paper, we design an economic model to determine the optimal sampling plan in a two-stage supply chain that minimizes the producer's and the consumer's total quality cost while satisfying both parties' quality and risk requirements. Numerical examples show that the optimal sampling plan is quite sensitive to the producer's product quality. The product's inspection, internal failure, and postsale failure costs also have an effect on the optimal sampling plan.
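A minimal sketch of an economic plan search in this spirit: grid-search (n, c) for the cheapest plan meeting producer's and consumer's risk constraints (the cost structure and all numbers here are hypothetical, not the paper's model):

```python
from math import comb

def accept_prob(n, c, p):
    """Binomial P(accept): at most c nonconforming items in a sample of n."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(c + 1))

def best_plan(aql, ltpd, alpha, beta, unit_inspect_cost, fail_cost, lot_p):
    """Grid-search the cheapest (n, c) meeting both risk constraints:
    P(accept | AQL) >= 1 - alpha  and  P(accept | LTPD) <= beta.
    Expected cost = inspection + cost of accepting a typical lot (lot_p)."""
    best = None
    for n in range(1, 200):
        for c in range(0, n):
            if accept_prob(n, c, ltpd) > beta:
                break          # larger c only accepts more; stop this n
            if accept_prob(n, c, aql) < 1 - alpha:
                continue       # producer's risk constraint not met
            cost = n * unit_inspect_cost + accept_prob(n, c, lot_p) * fail_cost
            if best is None or cost < best[0]:
                best = (cost, n, c)
    return best

print(best_plan(aql=0.01, ltpd=0.08, alpha=0.05, beta=0.10,
                unit_inspect_cost=1.0, fail_cost=500.0, lot_p=0.02))
```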

  10. Robotic Irradiated Sample Handling Concept Design in Reactor TRIGA PUSPATI using Simulation Software

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abdul Manan; Mohd Sabri Minhat; Ridzuan Abdul Mutalib; Zareen Khan Abdul Jalil Khan; Nurfarhana Ayuni Joha

    2015-01-01

    This paper introduces the concept design of a Robotic Irradiated Sample Handling Machine using a graphical software application, designed as a general, flexible and open platform for work on robotics. Webots has proven to be a useful tool in many fields of robotics, such as manipulator programming, mobile robot control (wheeled, sub-aquatic and walking robots), distance computation, sensor simulation, collision detection, motion planning and so on. Webots is used as the common interface for all the applications. Some practical cases and applications for this concept design are illustrated in the paper to present the possibilities of this simulation software. (author)

  11. OSIRIS-REx Touch-and-Go (TAG) Mission Design for Asteroid Sample Collection

    Science.gov (United States)

    May, Alexander; Sutter, Brian; Linn, Timothy; Bierhaus, Beau; Berry, Kevin; Mink, Ron

    2014-01-01

    The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in September 2016 to rendezvous with the near-Earth asteroid Bennu in October 2018. After several months of proximity operations to characterize the asteroid, OSIRIS-REx flies a Touch-And-Go (TAG) trajectory to the asteroid's surface to collect at least 60 g of pristine regolith sample for Earth return. This paper provides mission and flight system overviews, with more details on the TAG mission design and the key events that occur to safely and successfully collect the sample. An overview of the navigation performed relative to a chosen sample site, along with the maneuvers to reach the desired site, is described. Safety monitoring during descent is performed with onboard sensors providing an option to abort, troubleshoot, and try again if necessary. Sample collection occurs using a collection device at the end of an articulating robotic arm during a brief five-second contact period, while a constant-force spring mechanism in the arm assists in rebounding the spacecraft away from the surface. Finally, the sample is measured quantitatively utilizing the law of conservation of angular momentum, along with qualitative data from imagery of the sampling device. Upon sample mass verification, the arm places the sample into the Stardust-heritage Sample Return Capsule (SRC) for return to Earth in September 2023.

  12. Architectural Design Space Exploration of an FPGA-based Compressed Sampling Engine

    DEFF Research Database (Denmark)

    El-Sayed, Mohammad; Koch, Peter; Le Moullec, Yannick

    2015-01-01

    We present the architectural design space exploration of a compressed sampling engine for use in a wireless heart-rate monitoring system. We show how parallelism affects execution time at the register transfer level. Furthermore, two example solutions (modified semi-parallel and full...

  13. An Alternative View of Some FIA Sample Design and Analysis Issues

    Science.gov (United States)

    Paul C. Van Deusen

    2005-01-01

    Sample design and analysis decisions are the result of compromises and inputs from many sources. The end result would likely change if different individuals or groups were involved in the planning process. Discussed here are some alternatives to the procedures that are currently being used for the annual inventory. The purpose is to indicate that alternatives exist and...

  14. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Science.gov (United States)

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...
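
    The idea of testing chamber-sampling schedules against simulated daily fluxes can be sketched as follows; the synthetic flux series and the sampling intervals are assumptions for illustration only:

```python
import math

# Synthetic daily N2O flux (arbitrary units): low baseline plus a
# fertilization-driven emission pulse centred on day 120 (assumed shape)
flux = [5 + 100 * math.exp(-((d - 120) ** 2) / (2 * 15 ** 2)) for d in range(365)]
true_cumulative = sum(flux)

def trapezoid_estimate(sample_days):
    """Cumulative flux estimated by linear interpolation between the
    days on which chambers are actually measured."""
    return sum(0.5 * (flux[d0] + flux[d1]) * (d1 - d0)
               for d0, d1 in zip(sample_days, sample_days[1:]))

weekly = trapezoid_estimate(list(range(0, 365, 7)))
monthly = trapezoid_estimate(list(range(0, 365, 30)))
weekly_err = abs(weekly - true_cumulative) / true_cumulative
monthly_err = abs(monthly - true_cumulative) / true_cumulative
```

    Repeating such comparisons over many simulated years is what lets a schedule be chosen with a known uncertainty level.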

  15. Low-sensitivity H ∞ filter design for linear delta operator systems with sampling time jitter

    Science.gov (United States)

    Guo, Xiang-Gui; Yang, Guang-Hong

    2012-04-01

    This article is concerned with the problem of designing H ∞ filters for a class of linear discrete-time systems with low-sensitivity to sampling time jitter via delta operator approach. Delta-domain model is used to avoid the inherent numerical ill-condition resulting from the use of the standard shift-domain model at high sampling rates. Based on projection lemma in combination with the descriptor system approach often used to solve problems related to delay, a novel bounded real lemma with three slack variables for delta operator systems is presented. A sensitivity approach based on this novel lemma is proposed to mitigate the effects of sampling time jitter on system performance. Then, the problem of designing a low-sensitivity filter can be reduced to a convex optimisation problem. An important consideration in the design of correlation filters is the optimal trade-off between the standard H ∞ criterion and the sensitivity of the transfer function with respect to sampling time jitter. Finally, a numerical example demonstrating the validity of the proposed design method is given.
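
    The numerical advantage of the delta-domain model at high sampling rates can be seen in a small first-order sketch (the system and parameter values are assumed for illustration):

```python
import math

def shift_pole(a, T):
    """Pole of the sampled first-order system dx/dt = -a*x in the
    conventional shift (q) form: z = exp(-a*T)."""
    return math.exp(-a * T)

def delta_pole(a, T):
    """The same pole in the delta-operator form, delta = (q - 1)/T:
    it stays close to the continuous-time pole -a as T shrinks."""
    return (shift_pole(a, T) - 1) / T

a = 2.0
z_fast = shift_pole(a, 1e-5)    # crowds toward 1 at high sampling rates
d_fast = delta_pole(a, 1e-5)    # remains near the continuous pole -2
```

    Because shift-form poles cluster near 1 as T shrinks, coefficient round-off becomes severe; delta-form poles converge to the continuous-time values, which is the ill-conditioning the article avoids.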

  16. Effects of sample survey design on the accuracy of classification tree models in species distribution models

    Science.gov (United States)

    Thomas C. Edwards; D. Richard Cutler; Niklaus E. Zimmermann; Linda Geiser; Gretchen G. Moisen

    2006-01-01

    We evaluated the effects of probabilistic (hereafter DESIGN) and non-probabilistic (PURPOSIVE) sample surveys on resultant classification tree models for predicting the presence of four lichen species in the Pacific Northwest, USA. Models derived from both survey forms were assessed using an independent data set (EVALUATION). Measures of accuracy as gauged by...

  17. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.
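
    A minimal a priori calculation combining several of these components, using the normal approximation for a two-group comparison of means, can be sketched as follows (the alpha, power, and effect-size values are conventional illustrative choices, not prescriptions from the article):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """A priori sample size per group for a two-sample comparison of
    means (normal approximation), given a standardized effect size
    (Cohen's d), a two-sided alpha, and a target power."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) / effect_size) ** 2)

n_medium = n_per_group(0.5)  # medium effect
n_small = n_per_group(0.2)   # small effect demands far more subjects
```

    The interdependence the article calls isomorphism is visible directly: shrinking the effect size from 0.5 to 0.2 multiplies the required sample size by more than six.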

  18. Design of modified annulus air sampling system for the detection of leakage in waste transfer line

    International Nuclear Information System (INIS)

    Deokar, U.V; Khot, A.R.; Mathew, P.; Ganesh, G.; Tripathi, R.M.; Srivastava, Srishti

    2018-01-01

    Various liquid waste streams are generated during the operation of a reprocessing plant. The High Level (HL), Intermediate Level (IL) and Low Level (LL) liquid wastes generated are transferred from the reprocessing plant to the Waste Management Facility. These waste streams are transferred through pipe-in-pipe lines along a shielded concrete trench. For detection of radioactive leakage from the primary waste transfer line into the secondary line, sampling of the annulus air between the two pipes is carried out. The currently installed pressurized annulus air sampling system did not have an online leakage detection provision; hence, there were chances of personnel exposure and airborne activity in the working area. To overcome these design flaws, a free-air-flow modified online annulus air sampling system with more safety features is designed

  19. Evaluating Site-Specific and Generic Spatial Models of Aboveground Forest Biomass Based on Landsat Time-Series and LiDAR Strip Samples in the Eastern USA

    Science.gov (United States)

    Ram Deo; Matthew Russell; Grant Domke; Hans-Erik Andersen; Warren Cohen; Christopher Woodall

    2017-01-01

    Large-area assessment of aboveground tree biomass (AGB) to inform regional or national forest monitoring programs can be efficiently carried out by combining remotely sensed data and field sample measurements through a generic statistical model, in contrast to site-specific models. We integrated forest inventory plot data with spatial predictors from Landsat time-...

  20. HPLC/DAD determination of rosmarinic acid in Salvia officinalis: sample preparation optimization by factorial design

    International Nuclear Information System (INIS)

    Oliveira, Karina B. de; Oliveira, Bras H. de

    2013-01-01

    Sage (Salvia officinalis) contains high amounts of the biologically active rosmarinic acid (RA) and other polyphenolic compounds. RA is easily oxidized, and may undergo degradation during sample preparation for analysis. The objective of this work was to develop and validate an analytical procedure for determination of RA in sage, using factorial design of experiments for optimizing sample preparation. The statistically significant variables for improving RA extraction yield were determined initially and then used in the optimization step, using central composite design (CCD). The analytical method was then fully validated, and used for the analysis of commercial samples of sage. The optimized procedure involved extraction with aqueous methanol (40%) containing an antioxidant mixture (ascorbic acid and ethylenediaminetetraacetic acid (EDTA)), with sonication at 45 °C for 20 min. The samples were then injected into a system containing a C18 column, using methanol (A) and 0.1% phosphoric acid in water (B) in step gradient mode (45A:55B, 0-5 min; 80A:20B, 5-10 min) with a flow rate of 1.0 mL min−1 and detection at 330 nm. Under these conditions, RA concentrations were 50% higher when compared to extractions without antioxidants (98.94 ± 1.07% recovery). Auto-oxidation of RA during sample extraction was prevented by the use of antioxidants, resulting in more reliable analytical results. The method was then used for the analysis of commercial samples of sage. (author)

  1. HPLC/DAD determination of rosmarinic acid in Salvia officinalis: sample preparation optimization by factorial design

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina B. de [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Farmacia; Oliveira, Bras H. de, E-mail: bho@ufpr.br [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Quimica

    2013-01-15

    Sage (Salvia officinalis) contains high amounts of the biologically active rosmarinic acid (RA) and other polyphenolic compounds. RA is easily oxidized, and may undergo degradation during sample preparation for analysis. The objective of this work was to develop and validate an analytical procedure for determination of RA in sage, using factorial design of experiments for optimizing sample preparation. The statistically significant variables for improving RA extraction yield were determined initially and then used in the optimization step, using central composite design (CCD). The analytical method was then fully validated, and used for the analysis of commercial samples of sage. The optimized procedure involved extraction with aqueous methanol (40%) containing an antioxidant mixture (ascorbic acid and ethylenediaminetetraacetic acid (EDTA)), with sonication at 45 °C for 20 min. The samples were then injected into a system containing a C18 column, using methanol (A) and 0.1% phosphoric acid in water (B) in step gradient mode (45A:55B, 0-5 min; 80A:20B, 5-10 min) with a flow rate of 1.0 mL min−1 and detection at 330 nm. Under these conditions, RA concentrations were 50% higher when compared to extractions without antioxidants (98.94 ± 1.07% recovery). Auto-oxidation of RA during sample extraction was prevented by the use of antioxidants, resulting in more reliable analytical results. The method was then used for the analysis of commercial samples of sage. (author)

  2. Shielding design of highly activated sample storage at reactor TRIGA PUSPATI

    International Nuclear Information System (INIS)

    Naim Syauqi Hamzah; Julia Abdul Karim; Mohamad Hairie Rabir; Muhd Husamuddin Abdul Khalil; Mohd Amin Sharifuldin Salleh

    2010-01-01

    Radiation protection has always been one of the most important considerations in Reaktor TRIGA PUSPATI (RTP) management. Currently, demand for sample activation has increased from a variety of applicants in different research fields. A radiological hazard may occur if sample evaluations are misjudged or miscalculated. At present, there is no appropriate storage for highly activated samples. For that purpose, a special irradiated-sample storage box should be provided in order to segregate highly activated samples that produce a high dose level from typical activated samples that produce a lower dose level (1 - 2 mR/hr). In this study, the thicknesses required by common shielding materials such as lead and concrete to reduce a highly activated radiotracer sample (potassium bromide) with an initial exposure dose of 5 R/hr to background level (0.05 mR/hr) were determined. Analyses were done using several methods, including the conventional shielding equation, half value layer calculation and the MicroShield computer code. A design for a new irradiated-sample storage box for RTP capable of containing high-level gamma radioactivity was then proposed. (author)
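
    The half value layer calculation mentioned above reduces to counting halvings between the initial and target dose rates. A minimal sketch, using the 5 R/hr to 0.05 mR/hr figures from the abstract; the lead HVL used here is an assumed, energy-dependent illustrative value, not a figure from the study:

```python
import math

def hvls_needed(initial_mr_hr, target_mr_hr):
    """Number of half-value layers needed to attenuate the dose rate
    from the initial level down to the target level."""
    return math.log2(initial_mr_hr / target_mr_hr)

n = hvls_needed(5000.0, 0.05)  # 5 R/hr (= 5000 mR/hr) down to background
lead_cm = n * 1.1              # HVL of lead assumed ~1.1 cm (energy dependent)
```

    About 16.6 halvings are required, so the shield thickness is that count times the material's HVL at the relevant gamma energy.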

  3. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling
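
    The stratified estimator behind such area and volume estimates, together with the variance that supports the statistical assessments described above, can be sketched as follows; the beach-segment data are hypothetical:

```python
def stratified_estimate(strata):
    """Stratified random sampling estimate of a population total and its
    variance. Each stratum is (N_h, sampled values y_h); simple random
    sampling within each stratum is assumed."""
    total, var = 0.0, 0.0
    for N, y in strata:
        n = len(y)
        mean = sum(y) / n
        s2 = sum((v - mean) ** 2 for v in y) / (n - 1)
        total += N * mean
        var += N * (N - n) * s2 / n  # finite population correction included
    return total, var

# Hypothetical oiled area (m^2 per segment) in heavy- and light-oiling strata
strata = [(120, [4.0, 6.0, 5.0, 7.0]), (300, [0.5, 0.0, 1.0, 0.5])]
total, var = stratified_estimate(strata)
```

    The variance term is what SCAT-style visual surveys lack and what makes temporal trend tests possible; over-stratifying shrinks the within-stratum sample sizes n and inflates this variance, which is the power-analysis point made above.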

  4. Integrated design as an opportunity to develop green infrastructures within complex spatial questions

    NARCIS (Netherlands)

    Bartelse, G.; Kost, S.

    2012-01-01

    Landscape is a complex system of competing spatial functions. This competition is especially visible in high-density urban areas between housing, industry, leisure facilities, transport and infrastructure, energy supply, flood protection, and natural resources. Nevertheless, those conflicts are seldom

  5. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    Science.gov (United States)

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. 
These data
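
    The tagging-schedule tradeoff examined in the simulation study above can be sketched with a toy model; the survival pattern, pulse timing, and tag counts below are invented for illustration, and perfect detection is assumed, as in the study:

```python
import random

random.seed(42)

def daily_survival(day):
    """Assumed pulsed pattern: survival drops during two predation pulses."""
    return 0.4 if 30 <= day < 40 or 70 <= day < 80 else 0.8

true_mean = sum(daily_survival(d) for d in range(100)) / 100

def estimate_survival(release_days, tags_per_day):
    """Release tagged smolts on the given days and estimate survival as
    the pooled fraction surviving (perfect detection assumed)."""
    survived = released = 0
    for day in release_days:
        for _ in range(tags_per_day):
            released += 1
            survived += random.random() < daily_survival(day)
    return survived / released

spread = estimate_survival(list(range(0, 100, 5)), 100)  # effort spread out
burst = estimate_survival([50] * 20, 100)                # all effort mid-season
```

    Concentrating all releases mid-season misses both mortality pulses and biases the seasonal estimate upward, whereas spreading the same number of tags across the run tracks the seasonal mean, which is the pattern the simulations report.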

  6. AN EVALUATION OF PRIMARY DATA-COLLECTION MODES IN AN ADDRESS-BASED SAMPLING DESIGN.

    Science.gov (United States)

    Amaya, Ashley; Leclere, Felicia; Carris, Kari; Liao, Youlian

    2015-01-01

    As address-based sampling becomes increasingly popular for multimode surveys, researchers continue to refine data-collection best practices. While much work has been conducted to improve efficiency within a given mode, additional research is needed on how multimode designs can be optimized across modes. Previous research has not evaluated the consequences of mode sequencing on multimode mail and phone surveys, nor has significant research been conducted to evaluate mode sequencing on a variety of indicators beyond response rates. We conducted an experiment within the Racial and Ethnic Approaches to Community Health across the U.S. Risk Factor Survey (REACH U.S.) to evaluate two multimode case-flow designs: (1) phone followed by mail (phone-first) and (2) mail followed by phone (mail-first). We compared response rates, cost, timeliness, and data quality to identify differences across case-flow design. Because surveys often differ on the rarity of the target population, we also examined whether changes in the eligibility rate altered the choice of optimal case flow. Our results suggested that, on most metrics, the mail-first design was superior to the phone-first design. Compared with phone-first, mail-first achieved a higher yield rate at a lower cost with equivalent data quality. While the phone-first design initially achieved more interviews compared to the mail-first design, over time the mail-first design surpassed it and obtained the greatest number of interviews.

  7. Effects of Spatial Distribution of Trees on Density Estimation by Nearest Individual Sampling Method: Case Studies in Zagros Wild Pistachio Woodlands and Simulated Stands

    Directory of Open Access Journals (Sweden)

    Y. Erfanifard

    2014-06-01

    Full Text Available Distance methods and their density estimators may yield biased measurements unless the studied stand of trees has a random spatial pattern. This study aimed at assessing the effect of the spatial arrangement of wild pistachio trees on the results of density estimation using the nearest individual sampling method in the Zagros woodlands, Iran, and at applying a correction factor based on the spatial pattern of the trees. A 45 ha clumped stand of wild pistachio trees was selected in the Zagros woodlands, and two random and dispersed stands with similar density and area were simulated. Distances from the nearest individual and neighbour at 40 sample points in a 100 × 100 m grid were measured in the three stands. The results showed that the nearest individual method with the Batcheler estimator could not estimate density correctly in all stands. However, when the correction factor based on the spatial pattern of the trees was applied, density was measured with no significant difference from the real density of the stands. This study showed that considering the spatial arrangement of trees can improve the results of the nearest individual method with the Batcheler estimator in density measurement.
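
    The randomness assumption behind such distance methods can be illustrated with the classical nearest-individual estimator for a Poisson (random) pattern, for which the expected point-to-tree distance is E[r] = 1/(2·sqrt(density)). The Batcheler estimator and its correction factor are not reproduced here, and the simulated stand below is purely illustrative:

```python
import math
import random

random.seed(1)

def nearest_individual_density(trees, sample_points):
    """Density from the mean point-to-nearest-tree distance. For a random
    (Poisson) pattern E[r] = 1/(2*sqrt(density)), so density ~ 1/(4*r_bar^2);
    clumped patterns bias this simple estimator."""
    r = [min(math.hypot(sx - x, sy - y) for x, y in trees)
         for sx, sy in sample_points]
    r_bar = sum(r) / len(r)
    return 1 / (4 * r_bar ** 2)

true_density = 500  # trees per unit area in a simulated random stand
trees = [(random.random(), random.random()) for _ in range(true_density)]
grid = [(i / 10 + 0.05, j / 10 + 0.05) for i in range(10) for j in range(10)]
estimate = nearest_individual_density(trees, grid)
```

    In a clumped stand the sample-point-to-tree distances are systematically longer, so this uncorrected estimator underestimates density, which is why a pattern-based correction factor is needed.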

  8. Classifier-guided sampling for discrete variable, discontinuous design space exploration: Convergence and computational performance

    Energy Technology Data Exchange (ETDEWEB)

    Backlund, Peter B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shahan, David W. [HRL Labs., LLC, Malibu, CA (United States); Seepersad, Carolyn Conner [Univ. of Texas, Austin, TX (United States)

    2014-04-22

    A classifier-guided sampling (CGS) method is introduced for solving engineering design optimization problems with discrete and/or continuous variables and continuous and/or discontinuous responses. The method merges concepts from metamodel-guided sampling and population-based optimization algorithms. The CGS method uses a Bayesian network classifier for predicting the performance of new designs based on a set of known observations or training points. Unlike most metamodeling techniques, however, the classifier assigns a categorical class label to a new design, rather than predicting the resulting response in continuous space, and thereby accommodates nondifferentiable and discontinuous functions of discrete or categorical variables. The CGS method uses these classifiers to guide a population-based sampling process towards combinations of discrete and/or continuous variable values with a high probability of yielding preferred performance. Accordingly, the CGS method is appropriate for discrete/discontinuous design problems that are ill-suited for conventional metamodeling techniques and too computationally expensive to be solved by population-based algorithms alone. In addition, the rates of convergence and computational properties of the CGS method are investigated when applied to a set of discrete variable optimization problems. Results show that the CGS method significantly improves the rate of convergence towards known global optima, on average, when compared to genetic algorithms.

  9. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. 
Students perceived 3D printed

  10. Objective assessment and design improvement of a staring, sparse transducer array by the spatial crosstalk matrix for 3D photoacoustic tomography.

    Directory of Open Access Journals (Sweden)

    Philip Wong

    Full Text Available Accurate reconstruction of 3D photoacoustic (PA) images requires detection of photoacoustic signals from many angles. Several groups have adopted staring ultrasound arrays, but assessment of array performance has been limited. We previously reported on a method to calibrate a 3D PA tomography (PAT) staring array system and analyze system performance using singular value decomposition (SVD). The developed SVD metric, however, was impractical for large system matrices, which are typical of 3D PAT problems. The present study consisted of two main objectives. The first objective aimed to introduce the crosstalk matrix concept to the field of PAT for system design. Figures-of-merit utilized in this study were root mean square error, peak signal-to-noise ratio, mean absolute error, and a three-dimensional structural similarity index, which were derived between the normalized spatial crosstalk matrix and the identity matrix. The applicability of this approach for 3D PAT was validated by observing the response of the figures-of-merit in relation to well-understood PAT sampling characteristics (i.e. spatial and temporal sampling rate). The second objective aimed to utilize the figures-of-merit to characterize and improve the performance of a near-spherical staring array design. Transducer arrangement, array radius, and array angular coverage were the design parameters examined. We observed that the performance of a 129-element staring transducer array for 3D PAT could be improved by selection of optimal values of the design parameters. The results suggested that this formulation could be used to objectively characterize 3D PAT system performance and would enable the development of efficient strategies for system design optimization.
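
    The identity-matrix comparison behind those figures-of-merit can be sketched directly; a toy 2 × 2 crosstalk matrix stands in for the full imaging operator, and the 3D structural similarity term is omitted for brevity:

```python
import math

def figures_of_merit(matrix):
    """RMSE, MAE and PSNR between a normalized crosstalk matrix and the
    identity; a perfectly decoupled imaging system gives RMSE = MAE = 0."""
    n = len(matrix)
    errs = [matrix[i][j] - (1.0 if i == j else 0.0)
            for i in range(n) for j in range(n)]
    mse = sum(e * e for e in errs) / (n * n)
    mae = sum(abs(e) for e in errs) / (n * n)
    psnr = float("inf") if mse == 0 else 10 * math.log10(1.0 / mse)  # peak = 1
    return math.sqrt(mse), mae, psnr

rmse_ideal, mae_ideal, psnr_ideal = figures_of_merit([[1.0, 0.0], [0.0, 1.0]])
rmse_xtalk, mae_xtalk, psnr_xtalk = figures_of_merit([[1.0, 0.2], [0.2, 1.0]])
```

    Design optimization then amounts to choosing transducer arrangement, radius, and angular coverage that push the crosstalk matrix toward the identity under these metrics.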

  11. Design of sample analysis device for iodine adsorption efficiency test in NPPs

    International Nuclear Information System (INIS)

    Ji Jinnan

    2015-01-01

    In nuclear power plants, the iodine adsorption efficiency test is used to check the iodine adsorption efficiency of the iodine adsorber. The iodine adsorption efficiency can be calculated through analysis of the test sample, and thus used to determine whether the performance of the adsorber meets the requirements on equipment operation and emission. Considering the test process and actual demand, a special device for the analysis of this kind of test sample is designed in this paper. Application shows that the device is convenient to operate, highly reliable, and accurate in calculation, improving experimental efficiency and reducing experimental risk. (author)

  12. Identification and verification of ultrafine particle affinity zones in urban neighbourhoods: sample design and data pre-processing.

    LENUS (Irish Health Repository)

    Harris, Paul

    2009-01-01

    A methodology is presented and validated through which long-term fixed site air quality measurements are used to characterise and remove temporal signals in sample-based measurements which have good spatial coverage but poor temporal resolution. The work has been carried out specifically to provide a spatial dataset of atmospheric ultrafine particle (UFP < 100 nm) data for ongoing epidemiologic cohort analysis but the method is readily transferable to wider epidemiologic investigations and research into the health effects of other pollutant species.

  13. Modern survey sampling

    CERN Document Server

    Chaudhuri, Arijit

    2014-01-01

    Exposure to Sampling: Introduction; Concepts of Population, Sample, and Sampling. Initial Ramifications: Introduction; Sampling Design, Sampling Scheme; Random Numbers and Their Uses in Simple Random Sampling (SRS); Drawing Simple Random Samples with and without Replacement; Estimation of Mean, Total, Ratio of Totals/Means: Variance and Variance Estimation; Determination of Sample Sizes; Appendix to Chapter 2: More on Equal Probability Sampling; Horvitz-Thompson Estimator; Sufficiency; Likelihood; Non-Existence Theorem. More Intricacies: Introduction; Unequal Probability Sampling Strategies; PPS Sampling. Exploring Improved Ways: Introduction; Stratified Sampling; Cluster Sampling; Multi-Stage Sampling; Multi-Phase Sampling: Ratio and Regression Estimation; Controlled Sampling. Modeling: Introduction; Super-Population Modeling; Prediction Approach; Model-Assisted Approach; Bayesian Methods; Spatial Smoothing; Sampling on Successive Occasions: Panel Rotation; Non-Response and Not-at-Homes; Weighting Adj...
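
    Several of the chapter topics, for instance the Horvitz-Thompson estimator for unequal-probability designs, are compact enough to sketch; the sample values and inclusion probabilities below are made up for illustration:

```python
def horvitz_thompson_total(sample):
    """Horvitz-Thompson estimator of a population total: each sampled
    unit contributes its value divided by its inclusion probability,
    which makes the estimator design-unbiased for any design with
    known, positive inclusion probabilities."""
    return sum(y / pi for y, pi in sample)

# (value, inclusion probability) pairs for an unequal-probability sample
sample = [(10.0, 0.5), (4.0, 0.2), (9.0, 0.3)]
tau_hat = horvitz_thompson_total(sample)
```

    Units that were unlikely to be drawn are weighted up, which is the core idea reused throughout PPS and multi-stage designs.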

  14. Spatial layout optimization design of multi-type LEDs lighting source based on photoelectrothermal coupling theory

    Science.gov (United States)

    Xue, Lingyun; Li, Guang; Chen, Qingguang; Rao, Huanle; Xu, Ping

    2018-03-01

    Multiple-LED-based spectral synthesis technology has been widely used in the fields of solar simulators, color mixing, and artificial lighting for plant factories. Generally, many LEDs are spatially arranged in a compact layout to obtain a high power density output. Mutual thermal spreading among the LEDs produces a coupled thermal effect which additionally increases the junction temperature of each LED. Affected by the photoelectrothermal coupling effect, the spectrum of an LED will shift and its luminous efficiency will decrease; correspondingly, the spectral synthesis result will be mismatched. Therefore, thermal management of the LED spatial layout plays an important role in a multi-LED light source system. In this paper, a thermal dissipation network topology model considering the mutual thermal spreading effect among the LEDs is proposed for multi-LED systems with various power ratings. The junction temperature increment caused by thermal coupling depends strongly on the spatial arrangement. To minimize the thermal coupling effect, an optimization method for the LED spatial layout of a specific light source structure is presented and analyzed. The results showed that high-power LEDs should be arranged at the corners and low-power LEDs in the center. Finally, according to this method, it is convenient to determine the spatial layout of LEDs in a system having any kind of light source structure, and the method has the advantage of being universally applicable and easy to adjust.
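
    A minimal sketch of the kind of thermal-resistance network such a model builds on, with off-diagonal terms carrying the mutual-heating coupling; the resistance values are assumed, not taken from the paper:

```python
def junction_temperatures(R, powers, t_ambient):
    """Junction temperature of each LED from a thermal-resistance matrix:
    T_j[i] = T_ambient + sum_j R[i][j] * P[j]. Off-diagonal R terms model
    mutual (coupled) heating between LEDs; diagonal terms are self-heating."""
    return [t_ambient + sum(R[i][j] * powers[j] for j in range(len(powers)))
            for i in range(len(powers))]

# Illustrative 3-LED line: neighbours couple more strongly than distant LEDs
R = [[10.0, 2.0, 0.5],
     [2.0, 10.0, 2.0],
     [0.5, 2.0, 10.0]]  # K/W, values assumed
tj = junction_temperatures(R, [1.0, 1.0, 1.0], 25.0)
```

    With equal drive powers the central LED runs hottest because it receives coupling from both sides, which is consistent with the paper's conclusion that high-power LEDs belong at the corners.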

  15. Designing and Developing an Augmented Reality Application: A Sample Of Chemistry Education

    Directory of Open Access Journals (Sweden)

    Zeynep Taçgın

    2016-09-01

    Full Text Available Augmented Reality has been accepted as an effective educational method, and this review draws on the philosophical background of cognitive science. That is, several channels (aural, visual, interactivity, etc.) have been used to present information in ways that support individual learning styles. In this study, a Natural User Interface (NUI)- and Human Computer Interaction-based Augmented Reality application has been developed for chemistry education. The purpose of this study is to design and develop a student-centered Augmented Reality environment to teach the periodic table and the atomic structure of elements and molecules. A Head Mounted Display has been used to build the Augmented Reality system, and user control is performed with hand motions (grab, drag, drop, select and rotate). Hand-motion control is used to improve the spatial abilities of students in order to maximize the knowledge transferred. Using the most common natural controlling tools (fingers and hands) to interact with virtual objects, instead of AR markers or other tools, provides a more interactive, holistic, social and effective learning environment that authentically reflects the world around the learners. In this way, learners take an active role rather than being passive receptors. Correspondingly, the developed NUI-based system was constructed as design-based research and developed using instructional design methods and principles to achieve a more effective and productive learning material. The developed material comprises several fundamental components intended to create more intuitive and conducive tools that support real-world collaboration.

  16. A high time and spatial resolution MRPC designed for muon tomography

    Science.gov (United States)

    Shi, L.; Wang, Y.; Huang, X.; Wang, X.; Zhu, W.; Li, Y.; Cheng, J.

    2014-12-01

    A prototype cosmic muon scattering tomography system has been set up at Tsinghua University in Beijing. Multi-gap Resistive Plate Chambers (MRPCs) are used in the system to obtain the muon tracks. Compared with other detectors, an MRPC can provide not only the track but also the Time of Flight (ToF) between two detectors, from which the energy of the particles can be estimated. To obtain more accurate tracks and higher efficiency for the tomography system, a new type of MRPC with high time and two-dimensional spatial resolution has been developed. A series of experiments has been carried out to measure the efficiency, time resolution and spatial resolution. The results show that the efficiency can reach 95% and the time resolution is around 65 ps. The cluster size is around 4 and the spatial resolution can reach 200 μm.

  17. Sample requirements and design of an inter-laboratory trial for radiocarbon laboratories

    International Nuclear Information System (INIS)

    Bryant, Charlotte; Carmi, Israel; Cook, Gordon; Gulliksen, Steinar; Harkness, Doug; Heinemeier, Jan; McGee, Edward; Naysmith, Philip; Possnert, Goran; Scott, Marian; Plicht, Hans van der; Strydonck, Mark van

    2000-01-01

    An on-going inter-comparison programme focused on assessing and establishing consensus protocols to be applied in the identification, selection and sub-sampling of materials for subsequent ¹⁴C analysis is described. The outcome of the programme will provide a detailed quantification of the uncertainties associated with ¹⁴C measurements, including the issues of accuracy and precision. Such projects have become recognised as a fundamental aspect of continuing laboratory quality assurance schemes, providing a mechanism for the harmonisation of measurements and for demonstrating the traceability of results. The design of this study and its rationale are described. In summary, a suite of core samples has been defined which will be made available to both AMS and radiometric laboratories. These core materials are representative of routinely dated material and their ages span the full range of the applied ¹⁴C time-scale. Two of the samples are of wood from the German and Irish dendrochronologies, thus providing a direct connection to the master dendrochronological calibration curve. Further samples link this new inter-comparison to past studies. Sample size and precision have been identified as being of paramount importance in defining dating confidence, and so several core samples have been identified for more in-depth study of these practical issues. In addition to the core samples, optional samples have been identified and prepared specifically for either AMS and/or radiometric laboratories. For AMS laboratories, these include bone, textile, leather and parchment samples. Participation in the study requires a commitment to a minimum of 10 core analyses, with results to be returned within a year.

  18. Field Investigation Plan for 1301-N and 1325-N Facilities Sampling to Support Remedial Design

    International Nuclear Information System (INIS)

    Weiss, S. G.

    1998-01-01

    This field investigation plan (FIP) provides for the sampling and analysis activities supporting the remedial design planning for the planned removal action for the 1301-N and 1325-N Liquid Waste Disposal Facilities (LWDFs), which are treatment, storage, and disposal (TSD) units (cribs/trenches). The planned removal action involves excavation, transportation, and disposal of contaminated material at the Environmental Restoration Disposal Facility (ERDF). An engineering study (BHI 1997) was performed to develop and evaluate various options that are predominantly influenced by the volume of high- and low-activity contaminated soil requiring removal. The study recommended that additional sampling be performed to supplement historical data for use in the remedial design.

  19. Optimization of Apparatus Design and Behavioral Measures for the Assessment of Visuo-Spatial Learning and Memory of Mice on the Barnes Maze

    Science.gov (United States)

    O'Leary, Timothy P.; Brown, Richard E.

    2013-01-01

    We have previously shown that apparatus design can affect visual-spatial cue use and memory performance of mice on the Barnes maze. The present experiment extends these findings by determining the optimal behavioral measures and test procedure for analyzing visuo-spatial learning and memory in three different Barnes maze designs. Male and female…

  20. Musical Applications and Design Techniques for the Gametrak Tethered Spatial Position Controller

    DEFF Research Database (Denmark)

    Freed, Adrian; Overholt, Daniel; Hansen, Anne-Marie

    2009-01-01

    The Gametrak spatial position controller has been saved from the fate of so many discontinued gaming controllers to become an attractive and increasingly popular platform for experimental musical controllers, math and science manipulatives, large-scale interactive installations, and a playful... tangible gaming interface that promotes inter-generational creative play and discovery. After introducing the peculiarities of the Gametrak and comparing it to related spatial position sensing systems, we survey musical applications of the device. The short paper format cannot do justice to the depth...

  1. Single-subject withdrawal designs in delayed matching-to-sample procedures

    OpenAIRE

    Eilifsen, Christoffer; Arntzen, Erik

    2011-01-01

    In most studies of delayed matching-to-sample (DMTS) and stimulus equivalence, the delay has remained fixed throughout a single experimental condition. We wanted to expand on the DMTS and stimulus equivalence literature by examining the effects of using titrating delays with different starting points during the establishment of conditional discriminations prerequisite for stimulus equivalence. In Experiment 1, a variation of a single-subject withdrawal design was used. Ten adults were exposed...

  2. Design, placement, and sampling of groundwater monitoring wells for the management of hazardous waste disposal facilities

    International Nuclear Information System (INIS)

    Tsai, S.Y.

    1988-01-01

    Groundwater monitoring is an important technical requirement in managing hazardous waste disposal facilities. The purpose of monitoring is to assess whether and how a disposal facility is affecting the underlying groundwater system. This paper focuses on the regulatory and technical aspects of the design, placement, and sampling of groundwater monitoring wells for hazardous waste disposal facilities. Such facilities include surface impoundments, landfills, waste piles, and land treatment facilities. 8 refs., 4 figs

  3. Impacts of Sample Design for Validation Data on the Accuracy of Feedforward Neural Network Classification

    Directory of Open Access Journals (Sweden)

    Giles M. Foody

    2017-08-01

    Full Text Available Validation data are often used to evaluate the performance of a trained neural network and to select the network deemed optimal for the task at hand. Optimality is commonly assessed with a measure such as overall classification accuracy. The latter is often calculated directly from a confusion matrix showing the counts of cases in the validation set with particular labelling properties. The sample design used to form the validation set can, however, influence the estimated magnitude of the accuracy. Commonly, the validation set is formed with a stratified sample to give balanced classes, but also via random sampling, which reflects class abundance. It is suggested that if the ultimate aim is to accurately classify a dataset in which the classes vary in abundance, a validation set formed via random, rather than stratified, sampling is preferred. This is illustrated with the classification of simulated and remotely-sensed datasets. With both datasets, statistically significant differences in the accuracy with which the data could be classified arose from the use of validation sets formed via random and stratified sampling (z = 2.7 and 1.9 for the simulated and real datasets respectively, both p < 0.05). The accuracy of the classifications that used a stratified validation sample was lower, a result of cases of an abundant class being commissioned into a rarer class. Simple means to address the issue are suggested.
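    The effect described in this record can be reproduced with a small simulation. The class proportions and per-class classifier accuracies below are invented for illustration, not taken from the paper:

```python
import random

random.seed(42)

# Hypothetical imbalanced population: 90% class "A", 10% class "B".
population = ["A"] * 9000 + ["B"] * 1000

def classify(label):
    # Assumed per-class accuracies: the abundant class is classified
    # correctly more often (95%) than the rare class (60%).
    p_correct = 0.95 if label == "A" else 0.60
    if random.random() < p_correct:
        return label
    return "B" if label == "A" else "A"

def accuracy(sample):
    return sum(classify(y) == y for y in sample) / len(sample)

# Random sampling reflects class abundance.
random_sample = random.sample(population, 500)

# Stratified sampling gives balanced classes (250 per class),
# over-representing the rare, harder class.
strat_sample = (random.sample([y for y in population if y == "A"], 250)
                + random.sample([y for y in population if y == "B"], 250))

acc_random = accuracy(random_sample)
acc_strat = accuracy(strat_sample)
print(f"random validation sample:     {acc_random:.3f}")
print(f"stratified validation sample: {acc_strat:.3f}")
```

    As in the record's argument, the balanced validation set yields a lower overall accuracy estimate than the proportional one, because the rare, more error-prone class carries more weight in it.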

  4. Determination of the spatial resolution of an aperture-type near-field scanning optical microscope using a standard sample of a quantum-dot-embedded polymer film

    International Nuclear Information System (INIS)

    Kim, J. Y.; Kim, D. C.; Nakajima, K.; Mitsui, T.; Aoki, H.

    2010-01-01

    The near-field scanning optical microscope (NSOM) is a form of scanning probe microscope that achieves, through the use of the near-field, a spatial resolution significantly superior to that defined by the Abbe diffraction limit. Although the term spatial resolution has a clear meaning, it is often used in different ways in characterizing the NSOM instrument. In this paper, we describe the concept, the cautions, and the general guidelines of a method to measure the spatial resolution of an aperture-type NSOM instrument. As an example, a quantum dot embedded polymer film was prepared and imaged as a test sample, and the determination of the lateral resolution was demonstrated using the described method.

  5. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    KAUST Repository

    Beck, Joakim

    2018-02-19

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized for a specified error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a single-loop Monte Carlo method that uses the Laplace approximation of the return value of the inner loop. The first demonstration example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
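    A minimal sketch of the classical double-loop Monte Carlo estimator that this method improves upon, for an assumed linear-Gaussian toy model in which the expected information gain is known in closed form (all model values are illustrative, not from the paper):

```python
import math
import random

random.seed(0)

# Toy linear model: y = theta * xi + eps, with prior theta ~ N(0, 1)
# and noise eps ~ N(0, sigma^2). "xi" is the experimental design.
sigma = 0.5

def loglik(y, theta, xi):
    r = y - theta * xi
    return -0.5 * (r / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def eig_dlmc(xi, n_outer=2000, n_inner=200):
    """Double-loop Monte Carlo estimate of expected information gain."""
    total = 0.0
    for _ in range(n_outer):
        theta = random.gauss(0.0, 1.0)
        y = theta * xi + random.gauss(0.0, sigma)
        # Inner loop: Monte Carlo estimate of the evidence p(y | xi);
        # a small n_inner is where the underflow/bias problems noted
        # in the abstract arise.
        evidence = sum(math.exp(loglik(y, random.gauss(0.0, 1.0), xi))
                       for _ in range(n_inner)) / n_inner
        total += loglik(y, theta, xi) - math.log(evidence)
    return total / n_outer

xi = 1.0
est = eig_dlmc(xi)
exact = 0.5 * math.log(1 + (xi / sigma) ** 2)  # closed form for this model
print(est, exact)
```

    Importance sampling based on a Laplace approximation of the posterior replaces the prior draws in the inner loop, so far fewer inner samples are needed for the same accuracy.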

  6. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    Science.gov (United States)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.

  7. [Design of standard voice sample text for subjective auditory perceptual evaluation of voice disorders].

    Science.gov (United States)

    Li, Jin-rang; Sun, Yan-yan; Xu, Wen

    2010-09-01

    To design a speech voice sample text containing all phonemes in Mandarin for subjective auditory perceptual evaluation of voice disorders. The principles for the design of such a text are: the short text should include the 21 initials and 39 finals, so that it may cover all the phonemes in Mandarin; and the short text should be meaningful. A short text was produced. It had 155 Chinese words, and included 21 initials and 38 finals (the final ê was not included because it is rarely used in Mandarin). The text also covered 17 light tones and one "Erhua". The constituent ratios of the initials and finals presented in this short text were statistically similar to those in Mandarin according to the method of similarity of the sample and population (r = 0.742, P text were statistically not similar to those in Mandarin (r = 0.731, P > 0.05). A speech voice sample text with all phonemes in Mandarin was produced. The constituent ratios of the initials and finals presented in this short text are similar to those in Mandarin. Its value for subjective auditory perceptual evaluation of voice disorders needs further study.

  8. Objective sampling design in a highly heterogeneous landscape - characterizing environmental determinants of malaria vector distribution in French Guiana, in the Amazonian region.

    Science.gov (United States)

    Roux, Emmanuel; Gaborit, Pascal; Romaña, Christine A; Girod, Romain; Dessay, Nadine; Dusfour, Isabelle

    2013-12-01

    Sampling design is a key issue when establishing species inventories and characterizing habitats within highly heterogeneous landscapes. Sampling efforts in such environments may be constrained and many field studies only rely on subjective and/or qualitative approaches to design collection strategy. The region of Cacao, in French Guiana, provides an excellent study site to understand the presence and abundance of Anopheles mosquitoes, their species dynamics and the transmission risk of malaria across various environments. We propose an objective methodology to define a stratified sampling design. Following thorough environmental characterization, a factorial analysis of mixed groups allows the data to be reduced and non-collinear principal components to be identified while balancing the influences of the different environmental factors. Such components defined new variables which could then be used in a robust k-means clustering procedure. Then, we identified five clusters that corresponded to our sampling strata and selected sampling sites in each stratum. We validated our method by comparing the species overlap of entomological collections from selected sites and the environmental similarities of the same sites. The Morisita index was significantly correlated (Pearson linear correlation) with environmental similarity based on i) the balanced environmental variable groups considered jointly (p = 0.001) and ii) land cover/use (p-value sampling approach. Land cover/use maps (based on high spatial resolution satellite images) were shown to be particularly useful when studying the presence, density and diversity of Anopheles mosquitoes at local scales and in very heterogeneous landscapes.
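    The stratification step this record describes (dimension reduction, then k-means clustering, then site selection per stratum) can be sketched roughly as follows. The two-component site table and the cluster count are invented stand-ins for the study's environmental data:

```python
import random

random.seed(1)

# Synthetic "environment table": each site is described by two
# standardized, non-collinear components (stand-ins for the output of
# the factorial analysis of mixed groups in the abstract).
sites = [(random.gauss(cx, 0.3), random.gauss(cy, 0.3))
         for cx, cy in [(0, 0), (2, 0), (0, 2), (2, 2), (1, 4)]
         for _ in range(20)]

def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm; each final cluster is one sampling stratum."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda j: (p[0] - centers[j][0]) ** 2
                                        + (p[1] - centers[j][1]) ** 2)
            clusters[nearest].append(p)
        centers = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
                   if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, strata = kmeans(sites, k=5)

# Stratified design: draw one sampling site per stratum.
selected = [random.choice(stratum) for stratum in strata if stratum]
print([len(s) for s in strata])
```

    In the study a more robust k-means procedure was used on the balanced principal components; the sketch only shows the overall shape of the workflow.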

  9. Optimizing sampling design to deal with mist-net avoidance in Amazonian birds and bats.

    Directory of Open Access Journals (Sweden)

    João Tiago Marques

    Full Text Available Mist netting is a widely used technique to sample bird and bat assemblages. However, captures often decline with time because animals learn and avoid the locations of nets. This avoidance, or net shyness, can substantially decrease sampling efficiency. We quantified the day-to-day decline in captures of Amazonian birds and bats with mist nets set at the same location for four consecutive days. We also evaluated how net avoidance influences the efficiency of surveys under different logistic scenarios using re-sampling techniques. Net avoidance caused substantial declines in bird and bat captures, although the effect was more accentuated in the latter. Most of the decline occurred between the first and second days of netting: 28% in birds and 47% in bats. Captures of commoner species were more affected. The numbers of species detected also declined. Moving nets daily to minimize the avoidance effect increased captures by 30% in birds and 70% in bats. However, moving the location of nets may cause a reduction in netting time and captures. When moving the nets caused the loss of one netting day, it was no longer advantageous to move the nets frequently. In bird surveys, doing so could even decrease the number of individuals captured and species detected. Net avoidance can greatly affect sampling efficiency, but adjustments in survey design can minimize this. Whenever nets can be moved without losing netting time and the objective is to capture many individuals, they should be moved daily. If the main objective is to survey the species present, then nets should still be moved for bats, but not for birds. However, if relocating nets causes a significant loss of netting time, moving them to reduce effects of shyness will not improve sampling efficiency in either group. Overall, our findings can improve the design of mist netting sampling strategies in other tropical areas.
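    A back-of-envelope sketch of the move-versus-stay trade-off quantified in this record. The day-one capture rate and daily retention factor are assumptions loosely inspired by the reported declines (47% day-1 to day-2 for bats), not the study's fitted values:

```python
# Day-one capture rate (captures per day at a fresh location) and the
# fraction of captures retained on each subsequent day at a fixed
# location; the geometric extrapolation is a simplifying assumption.
DAY1_RATE = 100
RETENTION = 0.53

def total_captures(days, move_daily, days_lost=0):
    effective_days = days - (days_lost if move_daily else 0)
    if move_daily:
        # Every netting day behaves like a first day at a new location.
        return DAY1_RATE * effective_days
    # Fixed location: captures decay geometrically with net shyness.
    return sum(DAY1_RATE * RETENTION ** d for d in range(effective_days))

fixed = total_captures(4, move_daily=False)
moved = total_captures(4, move_daily=True)
moved_costly = total_captures(4, move_daily=True, days_lost=1)
print(fixed, moved, moved_costly)
```

    Under these assumed numbers, relocating daily pays off for bats even when it costs a full netting day; the margin narrows as the decline weakens, which is the trade-off the authors quantify with re-sampling.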

  10. Investigations on spatial sound design based on measured room impulse responses

    NARCIS (Netherlands)

    Melchior, F.

    2011-01-01

    Developments in the area of spatial sound reproduction have led to a large variety of established audio systems. Systems based on stereophonic principles are extended and growing from two channels via the ITU-R BS.775 surround setup to larger systems with more channels including elevated

  11. Ecological concepts and strategies with relevance to energy-conscious spatial planning and design

    NARCIS (Netherlands)

    Stremke, S.; Koh, J.

    2010-01-01

    Sustainable systems utilise renewable energy sources and recycle materials effectively. In theory, solar radiation provides abundant energy to sustain humanity. Our capacity to utilise available sources, however, is limited and competition for resources is expected to increase in the future. Spatial

  12. Design of a sample acquisition system for the Mars exobiological penetrator

    Science.gov (United States)

    Thomson, Ron; Gwynne, Owen

    1988-01-01

    The Mars Exobiological Penetrator will be embedded at several locations on the Martian surface. It contains various scientific instruments, such as an Alpha-Particle Instrument (API), Differential Scanning Calorimeter (DSC), Evolved Gas Analyzer (EGA) and accelerometers. A sample is required for analysis in the API and DSC. To avoid impact-contaminated material, this sample must be taken from soil more than 2 cm away from the penetrator shell. This study examines the design of a dedicated sampling system, including deployment, suspension, fore/after body coupling, sample gathering and placement. To prevent subsurface material from entering the penetrator sampling compartment during impact, a plug is placed in the exit hole of the wall. A U-lever device holds this plug in the penetrator wall; the U-lever rotates upon initial motion of the core-grinder mechanism (CGM), releasing the plug. Research points to a combination of coring and grinding as a plausible solution to the problem of dry drilling. The CGM, driven by two compressed springs, will be deployed along a tracking system. A slowly varying load (i.e., springs) is favored over a fixed-displacement motion because of its adaptability to different material hardnesses. However, to accommodate sampling in a low-density soil, two dashpots set a maximum transverse velocity. In addition, minimal power use is achieved by unidirectional motion of the CGM. The sample will be transported to the scientific instruments by means of a sample placement tray driven by a compressed spring to avoid unnecessary power usage. This paper also explores possible modifications for size, weight, and time, as well as possible future studies.

  13. Optimization of sampling pattern and the design of Fourier ptychographic illuminator.

    Science.gov (United States)

    Guo, Kaikai; Dong, Siyuan; Nanda, Pariksheet; Zheng, Guoan

    2015-03-09

    Fourier ptychography (FP) is a recently developed imaging approach that facilitates high-resolution imaging beyond the cutoff frequency of the employed optics. In the original FP approach, a periodic LED array is used for sample illumination, and therefore the scanning pattern is a uniform grid in the Fourier space. Such a uniform sampling scheme leads to 3 major problems for FP, namely: 1) it requires a large number of raw images, 2) it introduces raster grid artifacts in the reconstruction process, and 3) it requires a high-dynamic-range detector. Here, we investigate scanning sequences and sampling patterns to optimize the FP approach. For most biological samples, signal energy is concentrated in the low-frequency region, and as such, we can perform non-uniform Fourier sampling in FP by considering the signal structure. In contrast, conventional ptychography performs uniform sampling over the entire real space. To implement the non-uniform Fourier sampling scheme in FP, we have designed and built an illuminator using LEDs mounted on a 3D-printed plastic case. The advantages of this illuminator are threefold: 1) it reduces the number of image acquisitions by at least 50% (68 raw images versus 137 in the original FP setup), 2) it departs from the translational symmetry of sampling to solve the raster grid artifact problem, and 3) it reduces the dynamic range of the captured images 6-fold. The approach reported in this paper significantly shortened acquisition time and improved the quality of FP reconstructions. It may provide new insights for developing Fourier ptychographic imaging platforms and find important applications in digital pathology.
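    The non-uniform sampling idea, concentrating LED positions (and hence sampled spatial frequencies) toward the low-frequency region, can be sketched as follows; the radial bias and the geometry are assumptions for illustration, with only the LED count of 68 taken from the abstract:

```python
import math
import random

random.seed(0)

# Place LEDs with radial positions biased toward the centre of the
# Fourier plane (normalized radius 1.0 = outermost scan position),
# since most biological samples concentrate signal energy at low
# spatial frequencies. Squaring a uniform variate biases radii inward.
N_LEDS = 68  # the reduced acquisition count quoted in the abstract

positions = []
for _ in range(N_LEDS):
    r = random.random() ** 2
    t = random.uniform(0.0, 2.0 * math.pi)
    positions.append((r * math.cos(t), r * math.sin(t)))

inner = sum(1 for x, y in positions if math.hypot(x, y) < 0.5)
print(f"{inner} of {N_LEDS} LEDs sample the low-frequency half-radius")
```

    Uniform-area sampling would place only about 25% of positions inside the half-radius; this bias puts roughly 70% there, mimicking the denser low-frequency coverage the illuminator provides.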

  14. Spatial filtering and thermocouple spatial filter

    International Nuclear Information System (INIS)

    Han Bing; Tong Yunxian

    1989-12-01

    The design and study of a thermocouple spatial filter have been conducted for the flow measurement of integrated reactor coolant. The fundamental principle of spatial filtering, together with mathematical descriptions and analyses of the thermocouple spatial filter, is given.

  15. Spatial correlation analysis of seismic noise for STAR X-ray infrastructure design

    Science.gov (United States)

    D'Alessandro, Antonino; Agostino, Raffaele; Festa, Lorenzo; Gervasi, Anna; Guerra, Ignazio; Palmer, Dennis T.; Serafini, Luca

    2014-05-01

    The Italian PON MaTeRiA project is focused on the creation of a research infrastructure open to users based on an innovative and evolutionary X-ray source. This source, named STAR (Southern Europe TBS for Applied Research), exploits the Thomson backscattering process of laser radiation by fast-electron beams (Thomson Back Scattering - TBS). Its main performance figures are: X-ray photon flux of 10⁹-10¹⁰ ph/s; angular divergence variable between 2 and 10 mrad; X-ray energy continuously variable between 8 keV and 150 keV; bandwidth ΔE/E variable between 1 and 10%; picosecond time-resolved structure. In order to achieve this performance, bunches of electrons produced by a photo-injector are accelerated to relativistic velocities by a linear accelerator section. The electron beam, a few hundred micrometers wide, is driven by magnetic fields to the interaction point along a 15 m transport line, where it is focused into a 10 micrometer-wide area. In the same area, the laser beam is focused after being transported along a 12 m structure. Ground vibrations could greatly affect the collision probability, and thus the emittance, by deviating the paths of the beams during their travel in the STAR source. Therefore, the study program to measure ground vibrations at the STAR site can be used for site characterization in relation to accelerator design. The environmental and facility noise may affect the X-ray operation, especially if the predominant wavelengths in the microtremor wavefield are much smaller than the size of the linear accelerator. For wavelengths much greater, all the accelerator parts move in phase, and therefore even large displacements cannot generate any significant effect. On the other hand, for wavelengths equal to or less than half the accelerator size, several parts could move in phase opposition, and therefore even small displacements could affect its proper functioning. It is therefore important to characterize the microtremor wavefield in both the frequency and wavelength domains.

  16. A liquid scintillation counter specifically designed for samples deposited on a flat matrix

    International Nuclear Information System (INIS)

    Potter, C.G.; Warner, G.T.

    1986-01-01

    A prototype liquid scintillation counter has been designed to count samples deposited as a 6×16 array on a flat matrix. Applications include the counting of labelled cells processed by a cell harvester from 96-well microtitration plates onto glass fibre filters, and of DNA samples deposited directly onto nitrocellulose or nylon transfer membranes (e.g. 'Genescreen', NEN) for genetic studies by dot-blot hybridisation. The whole filter is placed in a bag with 4-12 ml of scintillant, sufficient to count all 96 samples. Nearest-neighbour intersample cross-talk ranged from 0.004% for ³H to 0.015% for ³²P. Background was 1.4 counts/min for glass fibre and 0.7 counts/min for 'Genescreen' in the ³H channel; for ¹⁴C the respective figures were 5.3 and 4.3 counts/min. Counting efficiency for ³H-labelled cells on glass fibre was 54% (E²/B = 2053) and 26% for tritiated thymidine spotted on 'Genescreen' (E²/B = 980). Similar ¹⁴C samples gave figures of 97% (E²/B = 1775) and 81% (E²/B = 1526), respectively. Electron emission counting from samples containing ¹²⁵I and ⁵¹Cr was also possible. (U.K.)

  17. Design of a Clean Room for Quality Control of an Environmental Sampling in KINAC

    International Nuclear Information System (INIS)

    Yoon, Jongho; Ahn, Gil Hoon; Seo, Hana; Han, Kitek; Park, Il Jin

    2014-01-01

    The objective of environmental sampling and analysis for safeguards is to characterize the nuclear materials handled and the activities conducted at specific locations. The KINAC is responsible for the conclusions drawn from the analytical results provided by the analytical laboratories. To assure the continuity of the quality of the analytical results provided by the laboratories, the KINAC will implement a quality control (QC) programme. One element of the QC programme is the preparation of QC samples. The establishment of a clean room is needed to handle QC samples, owing to the stringent contamination control required. The KINAC designed a clean facility with a cleanliness of ISO Class 6, the Clean Room for Estimation and Assay of trace Nuclear materials (CREAN), to meet the conflicting requirements of a clean room and of nuclear material handling under Korean law. The facility is expected to acquire a radiation safety license under these conditions this year, with continuing improvements to follow. Construction of the CREAN facility will be completed by the middle of 2015. In terms of the QC programme, the establishment of a clean room is essential; it will not only support the quality control system for the national environmental sampling programme but will also be applicable to environmental sample analysis techniques for nuclear forensics.

  18. Decreasing spatial disorientation in care-home settings: How psychology can guide the development of dementia friendly design guidelines.

    Science.gov (United States)

    O'Malley, Mary; Innes, Anthea; Wiener, Jan M

    2017-04-01

    Alzheimer's disease results in marked declines in navigation skills that are particularly pronounced in unfamiliar environments. However, many people with Alzheimer's disease eventually face the challenge of having to learn their way around unfamiliar environments when moving into assisted living or care-homes. People with Alzheimer's disease would have an easier transition when moving to new residences if these larger, and often more institutional, environments were designed to compensate for declining orientation skills. However, few existing dementia friendly design guidelines specifically address orientation and wayfinding. Those that do are often based on custom, practice or intuition, are not well integrated with psychological and neuroscientific knowledge or navigation research, and therefore often remain unspecific. This paper discusses current dementia friendly design guidelines, reports findings from psychological and neuropsychological experiments on navigation, and evaluates their potential for informing design guidelines that decrease spatial disorientation for people with dementia.

  19. Hypercell : A bio-inspired information design framework for real-time adaptive spatial components

    NARCIS (Netherlands)

    Biloria, N.M.; Chang, J.R.

    2012-01-01

    Contemporary explorations within the evolutionary computational domain have been heavily instrumental in exploring biological processes of adaptation, growth and mutation. On the other hand a plethora of designers owing to the increasing sophistication in computer aided design software are equally

  20. Dealing with trade-offs in destructive sampling designs for occupancy surveys.

    Directory of Open Access Journals (Sweden)

    Stefano Canessa

    Full Text Available Occupancy surveys should be designed to minimise false absences. This is commonly achieved by increasing replication or increasing the efficiency of surveys. In the case of destructive sampling designs, in which searches of individual microhabitats represent the repeat surveys, minimising false absences leads to an inherent trade-off. Surveyors can sample more low-quality microhabitats, bearing the resultant financial costs and producing wider-spread impacts, or they can target high-quality microhabitats where the focal species is more likely to be found and risk more severe impacts on local habitat quality. We show how this trade-off can be resolved with a decision-theoretic approach, using the Millewa Skink Hemiergis millewae from southern Australia as a case study. Hemiergis millewae is an endangered reptile that is best detected using destructive sampling of grass hummocks. Within sites that were known to be occupied by H. millewae, logistic regression modelling revealed that lizards were more frequently detected in large hummocks. If this model is an accurate representation of the detection process, searching large hummocks is more efficient and requires less replication, but this strategy also entails destruction of the best microhabitats for the species. We developed an optimisation tool to calculate the minimum combination of the number and size of hummocks to search to achieve a given cumulative probability of detecting the species at a site, incorporating weights to reflect the sensitivity of the results to a surveyor's priorities. The optimisation showed that placing high weight on minimising volume necessitates impractical replication, whereas placing high weight on minimising replication requires searching very large hummocks which are less common and may be vital for H. millewae. While destructive sampling methods are sometimes necessary, surveyors must be conscious of the ecological impacts of these methods. This study provides a
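The replication arithmetic behind this trade-off can be sketched directly; the per-search detection probabilities below are illustrative, not the study's estimates:

```python
import math

def min_replicates(p_single, target):
    """Smallest number of independent microhabitat searches giving a
    cumulative detection probability of at least `target`, when each
    search detects the species with probability `p_single`."""
    # 1 - (1 - p)^n >= target  =>  n >= ln(1 - target) / ln(1 - p)
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_single))

# Larger hummocks have a higher per-search detection probability, so they
# need fewer (but more damaging) searches -- the trade-off in the study.
for p in (0.15, 0.30, 0.60):
    print(p, min_replicates(p, 0.95))
```

Raising the per-search probability from 0.15 to 0.60 cuts the required searches roughly fivefold, which is why targeting large hummocks reduces replication at the cost of destroying the best microhabitat.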

  1. The Design and Use of Planetary Science Video Games to Teach Content while Enhancing Spatial Reasoning Skills

    Science.gov (United States)

    Ziffer, Julie; Nadirli, Orkhan; Rudnick, Benjamin; Pinkham, Sunny; Montgomery, Benjamin

    2016-10-01

    Traditional teaching of Planetary Science requires students to possess well-developed spatial reasoning skills (SRS). Recent research has demonstrated that SRS, long known to be crucial to math and science success, can be improved among students who lack these skills (Sorby et al., 2009). Teaching spatial reasoning is particularly valuable to women and minorities who, through societal pressure, often doubt their abilities (Hill et al., 2010). To address SRS deficiencies, our team is developing video games that embed SRS training into Planetary Science content. Our first game, on Moon Phases, addresses the two primary challenges faced by students trying to understand the Sun-Earth-Moon system: 1) visualizing the system (specifically the difference between the Sun-Earth orbital plane and the Earth-Moon orbital plane) and 2) comprehending the relationship between time and the position-phase of the Moon. In our second video game, the student varies an asteroid's rotational speed, shape, and orientation to the light source while observing how these changes affect the resulting light curve. To correctly pair objects to their light curves, students use spatial reasoning skills to imagine how light scattering off a three-dimensional rotating object is imaged on a sensor plane and is then reduced to a series of points on a light curve plot. These two games represent the first of our developing suite of high-interest video games designed to teach content while increasing the student's competence in spatial reasoning.
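The second game's core idea, that a rotating body's projected cross-section traces its light curve, can be sketched with a toy model: an idealized triaxial ellipsoid viewed equator-on, with flux taken proportional to projected area (all shapes and numbers here are illustrative, not the game's model):

```python
import math

def lightcurve(a, b, c, n=360):
    """Relative brightness over one rotation of a triaxial ellipsoid
    (semi-axes a >= b >= c) spinning about its c-axis, viewed equator-on:
    flux is approximated as proportional to the projected cross-section."""
    return [math.pi * c * math.sqrt((a * math.sin(t))**2 + (b * math.cos(t))**2)
            for t in (2 * math.pi * i / n for i in range(n))]

lc = lightcurve(2.0, 1.0, 1.0)
# An elongated body gives the classic double-peaked light curve:
# two maxima (broadside) and two minima (end-on) per rotation.
```

The max/min flux ratio equals the axis ratio a/b in this approximation, which is exactly the kind of inference the game asks students to make when pairing shapes with light curves.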

  2. Sampling design considerations for demographic studies: a case of colonial seabirds

    Science.gov (United States)

    Kendall, William L.; Converse, Sarah J.; Doherty, Paul F.; Naughton, Maura B.; Anders, Angela; Hines, James E.; Flint, Elizabeth

    2009-01-01

    For the purposes of making many informed conservation decisions, the main goal for data collection is to assess population status and allow prediction of the consequences of candidate management actions. Reducing the bias and variance of estimates of population parameters reduces uncertainty in population status and projections, thereby reducing the overall uncertainty under which a population manager must make a decision. In capture-recapture studies, imperfect detection of individuals, unobservable life-history states, local movement outside study areas, and tag loss can cause bias or precision problems with estimates of population parameters. Furthermore, excessive disturbance to individuals during capture-recapture sampling may be of concern because disturbance may have demographic consequences. We address these problems using as an example a monitoring program for Black-footed Albatross (Phoebastria nigripes) and Laysan Albatross (Phoebastria immutabilis) nesting populations in the northwestern Hawaiian Islands. To mitigate these estimation problems, we describe a synergistic combination of sampling design and modeling approaches. Solutions include multiple capture periods per season and multistate, robust design statistical models, dead recoveries and incidental observations, telemetry and data loggers, buffer areas around study plots to neutralize the effect of local movements outside study plots, and double banding and statistical models that account for band loss. We also present a variation on the robust capture-recapture design and a corresponding statistical model that minimizes disturbance to individuals. For the albatross case study, this less invasive robust design was more time efficient and, when used in combination with a traditional robust design, reduced the standard error of detection probability by 14% with only two hours of additional effort in the field.
These field techniques and associated modeling approaches are applicable to studies of

  3. Sample Processor for Life on Icy Worlds (SPLIce): Design and Test Results

    Science.gov (United States)

    Chinn, Tori N.; Lee, Anthony K.; Boone, Travis D.; Tan, Ming X.; Chin, Matthew M.; McCutcheon, Griffin C.; Horne, Mera F.; Padgen, Michael R.; Blaich, Justin T.; Forgione, Joshua B.; hide

    2017-01-01

    We report the design, development, and testing of the Sample Processor for Life on Icy Worlds (SPLIce) system, a microfluidic sample processor to enable autonomous detection of signatures of life and measurements of habitability parameters in Ocean Worlds. This monolithic fluid processing-and-handling system (Figure 1; mass 0.5 kg) retrieves a 50-µL-volume sample and prepares it to supply a suite of detection instruments, each with unique preparation needs. SPLIce has potential applications in orbiter missions that sample ocean plumes, such as those found at Saturn's icy moon Enceladus, or landed missions on the surface of icy satellites, such as Jupiter's moon Europa. Answering the question "Are we alone in the universe?" is captivating and exceptionally challenging. Even general criteria that define life very broadly include a significant role for water [1,2]. Searches for extinct or extant life therefore prioritize locations of abundant water, whether in ancient (Mars) or present (Europa and Enceladus) times. Only two previous planetary missions had onboard fluid processing: the Viking Biology Experiments [3] and Phoenix's Wet Chemistry Laboratory (WCL) [4]. SPLIce differs crucially from those systems, including in its capability to process and distribute µL-volume samples and its integration of autonomous control of a wide range of fluidic functions, including: 1) retrieval of fluid samples from an evacuated sample chamber; 2) onboard multi-year storage of dehydrated reagents; 3) integrated pressure, pH, and conductivity measurement; 4) filtration and retention of insoluble particles for microscopy; 5) dilution or vacuum-driven concentration of samples to accommodate instrument working ranges; 6) removal of gas bubbles from sample aliquots; 7) unidirectional flow (check valves); 8) active flow-path selection (solenoid-actuated valves); 9) metered pumping in 100 nL volume increments. The SPLIce manifold, made of three thermally fused layers of precision-machined cyclo

  4. Design and construction of a prototype vaporization calorimeter for the assay of radioisotopic samples

    International Nuclear Information System (INIS)

    Tormey, T.V.

    1979-10-01

    A prototype vaporization calorimeter has been designed and constructed for use in the assay of low power output radioisotopic samples. The prototype calorimeter design was based on that of a previous experimental instrument used by H.P. Stephens to establish the feasibility of the vaporization calorimetry technique for this type of power measurement. The calorimeter is composed of a mechanical calorimeter assembly together with a data acquisition and control system. Detailed drawings of the calorimeter assembly are included and additional drawings are referenced. The data acquisition system is based on an HP 9825A programmable calculator. A description of the hardware is provided together with a listing of all system software programs. The operating procedure is outlined, including initial setup and operation of all related equipment. Preliminary system performance was evaluated by making a series of four measurements on two nominal 1.5 W samples and on a nominal 0.75 W sample. Data for these measurements indicate that the absolute accuracy (one standard deviation) is approximately 0.0035 W in this power range, resulting in an estimated relative one-standard-deviation accuracy of 0.24% at 1.5 W and 0.48% at 0.75 W.

  5. Selection and Penalty Strategies for Genetic Algorithms Designed to Solve Spatial Forest Planning Problems

    International Nuclear Information System (INIS)

    Thompson, M.P.; Sessions, J.; Hamann, J.D.

    2009-01-01

    Genetic algorithms (GAs) have demonstrated success in solving spatial forest planning problems. We present an adaptive GA that incorporates population-level statistics to dynamically update penalty functions, a process analogous to strategic oscillation from the tabu search literature. We also explore the performance of various selection strategies. The GA identified feasible solutions within 96%, 98%, and 93% of a non-spatial relaxed upper bound calculated for landscapes of 100, 500, and 1000 units, respectively. The problem solved includes forest structure constraints limiting harvest opening sizes and requiring minimally sized patches of mature forest. Results suggest that the dynamic penalty strategy is superior to the more standard static penalty implementation. Results also suggest that tournament selection can be superior to the more standard implementation of proportional selection for smaller problems, but becomes susceptible to premature convergence as problem size increases. It is therefore important to balance selection pressure with appropriate disruption. We conclude that integrating intelligent search strategies into the context of genetic algorithms can yield improvements and should be investigated for future use in spatial planning with ecological goals.
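The dynamic-penalty and tournament-selection ideas can be illustrated on a toy constrained selection problem. This stand-in, with invented unit values, weights, constraint cap, and penalty update factors, is not the paper's forest formulation:

```python
import random
random.seed(7)

VALUES  = [6, 5, 8, 9, 6, 7, 3]   # per-unit harvest value (toy numbers)
WEIGHTS = [2, 3, 6, 7, 5, 9, 4]   # per-unit contribution to opening size
CAP     = 15                      # constraint: maximum total opening size

def total(coeffs, bits):
    return sum(c for c, b in zip(coeffs, bits) if b)

def fitness(bits, penalty_w):
    violation = max(0, total(WEIGHTS, bits) - CAP)
    return total(VALUES, bits) - penalty_w * violation

def tournament(pop, penalty_w, k=3):
    # Tournament selection: best of k randomly drawn individuals.
    return max(random.sample(pop, k), key=lambda s: fitness(s, penalty_w))

def evolve(generations=60, pop_size=40):
    pop = [[random.randint(0, 1) for _ in VALUES] for _ in range(pop_size)]
    penalty_w, best = 1.0, None
    for _ in range(generations):
        # Dynamic penalty: raise the weight while much of the population
        # is infeasible, relax it otherwise (the oscillation idea above).
        infeasible = sum(1 for s in pop if total(WEIGHTS, s) > CAP)
        penalty_w *= 1.5 if infeasible > pop_size // 2 else 0.75
        penalty_w = min(max(penalty_w, 0.5), 50.0)
        for s in pop:   # track the best feasible solution seen so far
            if total(WEIGHTS, s) <= CAP and (
                    best is None or total(VALUES, s) > total(VALUES, best)):
                best = s[:]
        nxt = []
        for _ in range(pop_size):
            p1, p2 = tournament(pop, penalty_w), tournament(pop, penalty_w)
            cut = random.randrange(1, len(VALUES))
            child = p1[:cut] + p2[cut:]           # one-point crossover
            if random.random() < 0.1:             # bit-flip mutation
                i = random.randrange(len(child))
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return best

best = evolve()
```

The penalty weight oscillates around the feasibility boundary instead of being fixed in advance, which is the practical advantage the paper reports for its dynamic strategy.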

  6. DESIGN AND CALIBRATION OF A VIBRANT SAMPLE MAGNETOMETER: CHARACTERIZATION OF MAGNETIC MATERIALS

    Directory of Open Access Journals (Sweden)

    Freddy P. Guachun

    2018-01-01

    Full Text Available This paper presents the process followed in the implementation of a vibrating sample magnetometer (VSM), constructed with materials commonly found in an electromagnetism laboratory. It describes the design, construction, calibration and use in the characterization of some magnetic materials. A VSM measures the magnetic moment of a sample when it is vibrated perpendicular to a uniform magnetic field; magnetization and magnetic susceptibility can be determined from these readings. This instrument stands out for its simplicity, versatility and low cost, but it is very sensitive and capable of eliminating or minimizing many sources of error that are found in other methods of measurement, allowing very accurate and reliable results to be obtained. Its operation is based on the Lenz-Faraday law of magnetic induction, and consists in measuring the voltage induced in detection coils by the variation of the magnetic flux that crosses them. The calibration of the VSM was performed by means of a standard sample (magnetite) and verified by means of a test sample (nickel).
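The Lenz-Faraday principle the instrument relies on, and why a standard sample calibrates it, can be sketched numerically: the coil geometry, vibration parameters, and dipole model below are assumed for illustration, not the authors' apparatus.

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability (SI)

def flux(m, z, a=0.01):
    """On-axis flux through a single-turn loop of radius a from a
    coaxial point dipole of moment m at distance z (metres)."""
    return MU0 * m * a**2 / (2.0 * (a**2 + z**2) ** 1.5)

def peak_voltage(m, n_turns=500, z0=0.02, amp=0.001, freq=80.0):
    """Peak of V = -N dPhi/dt for z(t) = z0 + amp*sin(wt), evaluated
    over one vibration period by central differences."""
    w, dt = 2.0 * math.pi * freq, 1e-6
    vmax = 0.0
    for i in range(int(1.0 / freq / dt)):
        t = i * dt
        z1 = z0 + amp * math.sin(w * (t - dt))
        z2 = z0 + amp * math.sin(w * (t + dt))
        v = -n_turns * (flux(m, z2) - flux(m, z1)) / (2.0 * dt)
        vmax = max(vmax, abs(v))
    return vmax

# The signal is linear in the magnetic moment: a standard sample of known
# moment therefore fixes the voltage-to-moment conversion factor.
ratio = peak_voltage(2e-3) / peak_voltage(1e-3)
```

Doubling the dipole moment doubles the peak induced voltage, which is the linearity the magnetite calibration exploits.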

  7. Design and theoretical investigation of a digital x-ray detector with large area and high spatial resolution

    Science.gov (United States)

    Gui, Jianbao; Guo, Jinchuan; Yang, Qinlao; Liu, Xin; Niu, Hanben

    2007-05-01

    X-ray phase contrast imaging is a promising new technology, but the requirement for a digital detector with large area, high spatial resolution and high sensitivity poses a major challenge to researchers. This paper presents the design and theoretical investigation of an x-ray direct-conversion digital detector based on a mercuric iodide photoconductive layer, with the latent charge image read out by photoinduced discharge (PID). Mercuric iodide has been verified to have good imaging performance (high sensitivity, low dark current, low voltage operation and good lag characteristics) compared with the other competitive materials (α-Se, PbI2, CdTe, CdZnTe) and can easily be deposited on large substrates in polycrystalline form. The use of a line-scanning laser beam and parallel multi-electrode readout gives the system high spatial resolution and a fast readout speed, suitable for instant general radiography and even rapid-sequence radiography.

  8. Neutron multicounter detector for investigation of content and spatial distribution of fission materials in large volume samples

    International Nuclear Information System (INIS)

    Swiderska-Kowalczyk, M.; Starosta, W.; Zoltowski, T.

    1998-01-01

    The experimental device is a neutron coincidence well counter. It can be applied for passive assay of fissile, especially plutonium-bearing, materials. It consists of a set of ³He tubes placed inside a polyethylene moderator; outputs from the tubes, first processed by preamplifier/amplifier/discriminator circuits, are then analysed using a neutron correlator connected to a PC, and correlation techniques implemented in software. Such a neutron counter allows for determination of plutonium mass (²⁴⁰Pu effective mass) in non-multiplying samples of fairly large volume (up to 0.14 m³). For determination of the neutron source distribution inside the sample, heuristic methods based on hierarchical cluster analysis are applied. As input parameters, the amplitudes and phases of the two-dimensional Fourier transformation of the count-profile matrices for known point-source distributions and for the examined samples are taken. Such matrices are collected by scanning the sample with the detection head. During the clustering process, count profiles for unknown samples are fitted into dendrograms using the 'proximity' criterion of the examined sample profile to the standard sample profiles. The distribution of neutron sources in an examined sample is then evaluated by comparison with standard source distributions. (author)
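The Fourier-amplitude 'proximity' matching described above can be sketched on toy count-profile matrices; the 3x3 profiles and the two standard source distributions are invented for illustration:

```python
import cmath
import math

def dft2(mat):
    """Feature vector of 2D DFT amplitudes of a count-profile matrix."""
    r, c = len(mat), len(mat[0])
    out = []
    for u in range(r):
        for v in range(c):
            s = sum(mat[i][j] * cmath.exp(-2j * math.pi * (u * i / r + v * j / c))
                    for i in range(r) for j in range(c))
            out.append(abs(s))
    return out

def nearest_standard(profile, standards):
    """'Proximity' criterion: pick the known source distribution whose
    DFT feature vector is closest (Euclidean) to the sample's."""
    f = dft2(profile)
    def dist(name):
        g = dft2(standards[name])
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(f, g)))
    return min(standards, key=dist)

# Hypothetical 3x3 scan count profiles for centred vs corner point sources.
standards = {
    "centre": [[1, 2, 1], [2, 9, 2], [1, 2, 1]],
    "corner": [[9, 3, 1], [3, 2, 1], [1, 1, 1]],
}
sample = [[1, 2, 1], [2, 8, 3], [1, 2, 1]]   # noisy centre-like profile
```

A full implementation would cluster many standards into a dendrogram; nearest-neighbour matching in the transform-amplitude space shown here is the single comparison that criterion repeats.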

  9. The Design of Sample Driver System for Gamma Irradiator Facility at Thermal Column of Kartini Reactor

    International Nuclear Information System (INIS)

    Suyamto; Tasih Mulyono; Setyo Atmojo

    2007-01-01

    The design and construction of a sample driver system for the gamma irradiator facility at the thermal column of the Kartini reactor, post operation, has been carried out. The design and construction are based on the space available in the thermal column and on the requirement that the sample rotation speed be as low as possible so that the irradiation process is more homogeneous. The electrical and mechanical calculations were done after selection of the electric motor and the transmission system to be applied. Assuming a maximum sample weight of 50 kg, the electric motor specification was decided by its rating, i.e. a single-phase induction motor, run-capacitor type, 0.5 HP, 220 V, 3.61 A, CCW and CW, rotation speed 1430 rpm. To achieve the low load rotation speed, the motor speed was reduced twice, using a conical reduction gear with a reduction ratio of 3.9 and a thread reduction gear with a reduction ratio of 60. From the calculation it is found that the motor power is 118.06 W and the rotation speed of the sample load is 6.11 rpm at the no-load motor speed of 1430 rpm. Tests with load weights of up to 75 kg showed that the device operates in good condition in both directions, with average motor and load speeds of 1486 rpm and 6.3 rpm respectively, so that the slip is 0.268% and 0.314% for the no-load and full-load conditions. The difference in input current to the motor between no-load and full-load conditions is relatively small, i.e. 0.14 A. The safety factor of the motor is 316%, which corresponds to a load weight of 158 kg. (author)
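The two-stage speed reduction quoted above is simple arithmetic, reproduced here as a sketch:

```python
def output_speed(motor_rpm, ratios):
    """Output speed after a chain of reduction stages."""
    speed = motor_rpm
    for r in ratios:
        speed /= r
    return speed

# Conical gear 3.9:1 followed by thread (worm) gear 60:1 -- 234:1 overall.
no_load = output_speed(1430, [3.9, 60])     # ≈ 6.11 rpm, as designed
full_load = output_speed(1486, [3.9, 60])   # ≈ 6.35 rpm at measured motor speed
```

The measured load speed of 6.3 rpm at a motor speed of 1486 rpm is consistent with the 234:1 total reduction to within the reported slip.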

  10. Design of cross-sensitive temperature and strain sensor based on sampled fiber grating

    Directory of Open Access Journals (Sweden)

    Zhang Xiaohang

    2017-02-01

    Full Text Available In this paper, a cross-sensitive temperature and strain sensor based on sampled fiber grating is designed. Its temperature measurement range is −50 to 200 °C, and the strain measurement range is 0 to 2000 με. The characteristics of the sensor are obtained using a simulation method. Utilizing SPSS software, we found the dual-parameter matrix equations for the measurement of temperature and strain, and calibrated the four sensing coefficients of the matrix equations.
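A dual-parameter measurement of this kind reduces to inverting a 2x2 coefficient matrix relating the two wavelength shifts to temperature and strain. A minimal sketch with hypothetical calibration coefficients (not the paper's SPSS-derived values):

```python
def solve_2x2(k, d_lambda):
    """Invert [[kT1, ke1], [kT2, ke2]] @ [dT, de] = [dl1, dl2] by
    Cramer's rule to recover the temperature and strain changes."""
    (a, b), (c, d) = k
    det = a * d - b * c          # must be nonzero: distinct sensitivities
    x1, x2 = d_lambda
    return ((d * x1 - b * x2) / det, (a * x2 - c * x1) / det)

# Hypothetical sensitivities (pm/°C and pm/με) for the grating's two peaks.
K = ((10.0, 1.2), (13.0, 1.0))

# Synthetic wavelength shifts for dT = 25 °C and de = 300 με, then invert.
dT, de = solve_2x2(K, (10.0 * 25 + 1.2 * 300, 13.0 * 25 + 1.0 * 300))
```

The cross-sensitivity problem is solvable exactly because the two peaks respond to temperature and strain with different coefficient ratios, making the matrix nonsingular.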

  11. A sample design for globally consistent biomass estimation using lidar data from the Geoscience Laser Altimeter System (GLAS)

    Science.gov (United States)

    Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman

    2012-01-01

    Lidar height data collected by the Geoscience Laser Altimeter System (GLAS) from 2002 to 2008 have the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full-waveform "shots," which have been shown to be strongly correlated with aboveground forest...

  12. A computational study of a fast sampling valve designed to sample soot precursors inside a forming diesel spray plume

    International Nuclear Information System (INIS)

    Dumitrescu, Cosmin; Puzinauskas, Paulius V.; Agrawal, Ajay K.; Liu, Hao; Daly, Daniel T.

    2009-01-01

    Accurate chemical reaction mechanisms are critically needed to fully optimize combustion strategies for modern internal-combustion engines. These mechanisms are needed to predict emission formation and the chemical heat release characteristics for traditional direct-injection diesel as well as recently developed and proposed variant combustion strategies. Experimental data acquired under conditions representative of such combustion strategies are required to validate these reaction mechanisms. This paper explores the feasibility of developing a fast sampling valve which extracts reactants at known locations in the spray reaction structure to provide these data. CHEMKIN software is used to establish the reaction timescales which dictate the required fast sampling capabilities. The sampling process is analyzed using separate FLUENT and CHEMKIN calculations. The non-reacting FLUENT CFD calculations give a quantitative estimate of the sample quantity as well as the fluid mixing and thermal history. A CHEMKIN reactor network has been created that reflects these mixing and thermal time scales and allows a theoretical evaluation of the quenching process.

  13. Simulated optimization of crop yield through irrigation system design and operation based on the spatial variability of soil hydrodynamic properties

    International Nuclear Information System (INIS)

    Gurovich, L.; Stern, J.; Ramos, R.

    1983-01-01

    Spatial autocorrelation and kriging techniques were applied to soil infiltrability data from a 20-hectare field to separate homogeneous irrigation units. Border irrigation systems were designed for each unit and combinations of units by using DESIGN, a computer model based on soil infiltrability and the hydraulics of surface water flow, which enables optimal irrigation systems to be designed. Water depths effectively infiltrated at different points along the irrigation run were determined, and the agronomic irrigation efficiency of the unit evaluated. A modification of Hanks' evapotranspiration model, PLANTGRO, was used to evaluate plant growth, relative crop yield and soil-water economy throughout the growing season at several points along each irrigation unit. The effect of different irrigation designs on total field yield and total water used for irrigation was evaluated by integrating yield values corresponding to each point, volume and inflow time during each irrigation. For relevant data from winter wheat grown in the central area of Chile during 1981, simulation by an interactive and sequentially recurrent use of the DESIGN and PLANTGRO models was carried out. The results obtained indicate that, when a field is separated into homogeneous irrigation units on the basis of the spatial variability of soil infiltrability and the border irrigation systems are designed according to soil characteristics, both a significant yield increase and lower water use can be obtained by comparison with other criteria of field zonification for irrigation management. The use of neutrometric determinations to assess soil-water content during the growing season, as a validation of the results obtained in this work, is discussed. (author)
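A minimal sketch of ordinary kriging, the interpolation step used alongside spatial autocorrelation to delineate homogeneous irrigation units; the exponential variogram parameters and sample coordinates are assumed for illustration:

```python
import math

def gamma(h, sill=1.0, rng=30.0):
    """Exponential semivariogram model (assumed sill and range)."""
    return sill * (1.0 - math.exp(-3.0 * h / rng))

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting (small systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def kriging_weights(samples, target):
    """Ordinary kriging weights for `target` from sampled locations."""
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    n = len(samples)
    A = [[gamma(dist(p, q)) for q in samples] + [1.0] for p in samples]
    A.append([1.0] * n + [0.0])          # unbiasedness constraint row
    b = [gamma(dist(p, target)) for p in samples] + [1.0]
    return solve(A, b)[:n]               # drop the Lagrange multiplier

# Three nearby infiltrability measurements and one distant one.
w = kriging_weights([(0, 0), (10, 0), (0, 10), (20, 20)], (5, 5))
```

The weights sum to one and decay with distance under the variogram model, which is what lets infiltrability be mapped continuously before partitioning the field into units.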

  14. Design and validation of dynamic hierarchies and adaptive layouts using spatial graph grammars

    NARCIS (Netherlands)

    Liao, K.; Kong, J.; Zhang, K.; de Vries, B.; Griffth, D.A.; Chun, Y.; Dean, D.J.

    2017-01-01

    With the thinking paradigm shifting on the evolution of complex adaptive systems, a pattern-based design approach is reviewed and reinterpreted. Although a variety of long-term and lasting explorations on patterns in geographical analysis, environmental planning, and design exist, in-depth

  15. Design of Field Experiments for Adaptive Sampling of the Ocean with Autonomous Vehicles

    Science.gov (United States)

    Zheng, H.; Ooi, B. H.; Cho, W.; Dao, M. H.; Tkalich, P.; Patrikalakis, N. M.

    2010-05-01

    Due to the highly non-linear and dynamical nature of oceanic phenomena, the predictive capability of various ocean models depends on the availability of operational data. A practical method to improve the accuracy of the ocean forecast is to use a data assimilation methodology to combine in-situ measured and remotely acquired data with numerical forecast models of the physical environment. Autonomous surface and underwater vehicles with various sensors are economic and efficient tools for exploring and sampling the ocean for data assimilation; however, such vehicles have limited energy, and thus effective resource allocation for adaptive sampling is required to optimize the efficiency of exploration. In this paper, we use physical oceanography forecasts of the coastal zone of Singapore for the design of a set of field experiments to acquire useful data for model calibration and data assimilation. The design process of our experiments relied on the oceanography forecast, including the current speed, its gradient, and vorticity, in a given region of interest for which permits for field experiments could be obtained, and for time intervals that correspond to strong tidal currents. Based on these maps, resources available to our experimental team, including Autonomous Surface Craft (ASC), were allocated so as to capture the oceanic features that result from jets and vortices behind bluff bodies (e.g., islands) in the tidal current. Results are summarized from this resource allocation process and field experiments conducted in January 2009.

  16. A UAV-Based Fog Collector Design for Fine-Scale Aerobiological Sampling

    Science.gov (United States)

    Gentry, Diana; Guarro, Marcello; Demachkie, Isabella Siham; Stumfall, Isabel; Dahlgren, Robert P.

    2017-01-01

    Airborne microbes are found throughout the troposphere and into the stratosphere. Knowing how the activity of airborne microorganisms can alter water, carbon, and other geochemical cycles is vital to a full understanding of local and global ecosystems. Just as on the land or in the ocean, atmospheric regions vary in habitability; the underlying geochemical, climatic, and ecological dynamics must be characterized at different scales to be effectively modeled. Most aerobiological studies have focused on a high level: 'How high are airborne microbes found?' and 'How far can they travel?' Most fog and cloud water studies collect from stationary ground stations (point) or along flight transects (1D). To complement and provide context for this data, we have designed a UAV-based modified fog and cloud water collector to retrieve 4D-resolved samples for biological and chemical analysis. Our design uses a passive impacting collector hanging from a rigid rod suspended between two multi-rotor UAVs. The suspension design reduces the effect of turbulence and the potential for contamination from the UAV downwash. The UAVs are currently modeled in a leader-follower configuration, taking advantage of recent advances in modular UAVs, UAV swarming, and flight planning. The collector itself is a hydrophobic mesh. Materials including Tyvek, PTFE, nylon, and polypropylene monofilament, fabricated via laser cutting, CNC knife, or 3D printing, were characterized for droplet collection efficiency using a benchtop atomizer and particle counter. Because the meshes can be easily and inexpensively fabricated, a set can be pre-sterilized and brought to the field for 'hot swapping' to decrease cross-contamination between flight sessions or use as negative controls. An onboard sensor and logging system records the time and location of each sample; when combined with flight tracking data, the samples can be resolved into a 4D volumetric map of the fog bank.
Collected samples can be returned to the lab for

  17. Physical effects of mechanical design parameters on photon sensitivity and spatial resolution performance of a breast-dedicated PET system.

    Science.gov (United States)

    Spanoudaki, V C; Lau, F W Y; Vandenbroucke, A; Levin, C S

    2010-11-01

    This study aims to address design considerations of a high resolution, high sensitivity positron emission tomography scanner dedicated to breast imaging. The methodology uses a detailed Monte Carlo model of the system structures to obtain a quantitative evaluation of several performance parameters. Special focus was given to the effect of dense mechanical structures designed to provide mechanical robustness and thermal regulation to the minuscule and temperature sensitive detectors. For the energies of interest around the photopeak (450-700 keV energy window), the simulation results predict a 6.5% reduction in the single photon detection efficiency and a 12.5% reduction in the coincidence photon detection efficiency in the case that the mechanical structures are interspersed between the detectors. However for lower energies, a substantial increase in the number of detected events (approximately 14% and 7% for singles at a 100-200 keV energy window and coincidences at a lower energy threshold of 100 keV, respectively) was observed with the presence of these structures due to backscatter. The number of photon events that involve multiple interactions in various crystal elements is also affected by the presence of the structures. For photon events involving multiple interactions among various crystal elements, the coincidence photon sensitivity is reduced by as much as 20% for a point source at the center of the field of view. There is no observable effect on the intrinsic and the reconstructed spatial resolution and spatial resolution uniformity. Mechanical structures can have a considerable effect on system sensitivity, especially for systems processing multi-interaction photon events. This effect, however, does not impact the spatial resolution. Various mechanical structure designs are currently under evaluation in order to achieve optimum trade-off between temperature stability, accurate detector positioning, and minimum influence on system performance.

  18. Predictive Sampling of Rare Conformational Events in Aqueous Solution: Designing a Generalized Orthogonal Space Tempering Method.

    Science.gov (United States)

    Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei

    2016-01-12

    analysis suggests that because essential conformational events are mainly driven by the compensating fluctuations of essential solute-solvent and solute-solute interactions, commonly employed "predictive" sampling methods are unlikely to be effective on this seemingly "simple" system. The gOST development presented in this paper illustrates how to employ the OSS scheme for physics-based sampling method designs.

  19. Sampling design for the Study of Cardiovascular Risks in Adolescents (ERICA

    Directory of Open Access Journals (Sweden)

    Mauricio Teixeira Leite de Vasconcellos

    2015-05-01

    Full Text Available The Study of Cardiovascular Risk in Adolescents (ERICA) aims to estimate the prevalence of cardiovascular risk factors and metabolic syndrome in adolescents (12-17 years) enrolled in public and private schools of the 273 municipalities with over 100,000 inhabitants in Brazil. The study population was stratified into 32 geographical strata (27 capitals and five sets with other municipalities in each macro-region of the country) and a sample of 1,251 schools was selected with probability proportional to size. In each school, three combinations of shift (morning and afternoon) and grade were selected, and within each of these combinations, one class was selected. All eligible students in the selected classes were included in the study. The design sampling weights were calculated as the product of the reciprocals of the inclusion probabilities at each sampling stage, and were later calibrated considering the projections of the numbers of adolescents enrolled in schools located in the geographical strata, by sex and age.
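The basic design-weight computation, the product of reciprocals of the stage-wise inclusion probabilities before calibration, can be sketched as follows; the stage probabilities are hypothetical, not ERICA's:

```python
def design_weight(stage_probs):
    """Basic sampling weight: the product of the reciprocals of the
    inclusion probabilities at each stage of the design
    (e.g. school, shift-grade combination, class)."""
    w = 1.0
    for p in stage_probs:
        w /= p
    return w

# Hypothetical pupil: school drawn PPS with probability 0.05, 3 of 6
# eligible shift-grade combinations, 1 of 4 classes; every pupil in the
# selected class is then included (probability 1 at the final stage).
w = design_weight([0.05, 3 / 6, 1 / 4])   # ≈ 160
```

Each sampled adolescent then stands for roughly that many adolescents in the frame; calibration to enrolment projections by sex and age adjusts these base weights afterwards.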

  20. Coastal California's Fog as a Unique Habitable Niche: Design for Autonomous Sampling and Preliminary Aerobiological Characterization

    Science.gov (United States)

    Gentry, Diana; Cynthia Ouandji; Arismendi, Dillon; Guarro, Marcello; Demachkie, Isabella; Crosbie, Ewan; Dadashazar, Hossein; MacDonald, Alex B.; Wang, Zhen; Sorooshian, Armin; hide

    2017-01-01

    Just as on the land or in the ocean, atmospheric regions may be more or less hospitable to life. The aerobiosphere, or collection of living things in Earth's atmosphere, is poorly understood due to the small number and ad hoc nature of samples studied. However, we know viable airborne microbes play important roles, such as providing cloud condensation nuclei. Knowing the distribution of such microorganisms and how their activity can alter water, carbon, and other geochemical cycles is key to developing criteria for planetary habitability, particularly for potential habitats with wet atmospheres but little stable surface water. Coastal California has regular, dense fog known to play a major transport role in the local ecosystem. In addition to the significant local (1 km) geographical variation in typical fog, previous studies have found that changes in height above the surface of as little as a few meters can yield significant differences in typical concentrations, populations and residence times. No single current sampling platform (ground-based impactors, towers, balloons, aircraft) is capable of accessing all of these regions of interest. A novel passive fog and cloud water sampler, consisting of a lightweight passive impactor suspended from autonomous aerial vehicles (UAVs), is being developed to allow 4D point sampling within a single fog bank, allowing closer study of small-scale (100 m) system dynamics. Fog and cloud droplet water samples from low-altitude aircraft flights in nearby coastal waters were collected and assayed to estimate the required sample volumes, flight times, and sensitivity thresholds of the system under design. 125 cloud water samples were collected from 16 flights of the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS) instrumented Twin Otter, equipped with a sampling tube collector, occurring between 18 July and 12 August 2016 below 1 km altitude off the central coast. The collector was flushed first with 70% ethanol

  1. Design considerations for a high-spatial-resolution positron camera with dense-drift-space MWPC's

    International Nuclear Information System (INIS)

    Del Guerra, A.; Perez-Mendez, V.; Schwartz, G.; Nelson, W.R.

    1982-10-01

    A multiplane Positron Camera is proposed, made of six MWPC modules arranged to form the lateral surface of a hexagonal prism. Each module (50 × 50 cm²) has a 2 cm thick lead-glass tube converter on both sides of a MWPC pressurized to 2 atm. Experimental measurements are presented to show how to reduce the parallax error by determining in which of the two converter layers the photon has interacted. The results of a detailed Monte Carlo calculation for the efficiency of this type of converter are shown to be in excellent agreement with the experimental measurements. The expected performance of the Positron Camera is presented: a true coincidence rate of 56,000 counts/s (with an equal accidental coincidence rate and a 30% Compton scatter contamination) and a spatial resolution better than 5.0 mm (FWHM) for a 400 μCi point-like source embedded in a 10 cm radius water phantom.

  2. “Markhi” spatial design structure: numerical study of its work under static load

    Directory of Open Access Journals (Sweden)

    Alpatov Vadim

    2016-01-01

    Full Text Available For some types of spatial structures and their joint connections, a problem of internal stress within the joint volume exists. The problem occurs when a massive body is used as a joint connector. Tension on the surface of such a connector is quite simple to determine using the electric resistive tensometry method; internal tension in the massive body of the connector, however, is not simple to determine empirically. To determine internal tension, modern calculation systems can be used, such as Ansys, Abaqus, CosmosWorks, Nastran, Autodesk Inventor, Robot Structural Analysis, Bentley STAAD, CSI SAP2000, etc. Internal tension analysis in a massive joint connector makes it possible to identify both over-designed and under-designed parts. In this paper the authors base their analysis on both the surface and internal tension of the MARKHI connector and come up with solutions for its improvement.

  3. Importance of sampling design and analysis in animal population studies: a comment on Sergio et al

    Science.gov (United States)

    Kery, M.; Royle, J. Andrew; Schmid, Hans

    2008-01-01

    1. The use of predators as indicators and umbrellas in conservation has been criticized. In the Trentino region, Sergio et al. (2006; hereafter SEA) counted almost twice as many bird species in quadrats located in raptor territories than in controls. However, SEA detected astonishingly few species. We used contemporary Swiss Breeding Bird Survey data from an adjacent region and a novel statistical model that corrects for overlooked species to estimate the expected number of bird species per quadrat in that region. 2. There are two anomalies in SEA which render their results ambiguous. First, SEA detected on average only 6.8 species, whereas a value of 32 might be expected. Hence, they probably overlooked almost 80% of all species. Secondly, the precision of their mean species counts was greater in two-thirds of cases than in the unlikely case that all quadrats harboured exactly the same number of equally detectable species. This suggests that they detected consistently only a biased, unrepresentative subset of species. 3. Conceptually, expected species counts are the product of true species number and species detectability p. Plenty of factors may affect p, including date, hour, observer, previous knowledge of a site and mobbing behaviour of passerines in the presence of predators. Such differences in p between raptor and control quadrats could have easily created the observed effects. Without a method that corrects for such biases, or without quantitative evidence that species detectability was indeed similar between raptor and control quadrats, the meaning of SEA's counts is hard to evaluate. Therefore, the evidence presented by SEA in favour of raptors as indicator species for enhanced levels of biodiversity remains inconclusive. 4. Synthesis and application. Ecologists should pay greater attention to sampling design and analysis in animal population estimation. Species richness estimation means sampling a community. Samples should be representative for the
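
The conceptual identity in this comment - an expected species count is the product of true species number and detectability p - is easy to verify numerically. A minimal sketch follows; the 32 species and detectability of 0.21 are illustrative values chosen to reproduce the shortfall described, not figures from SEA or the Swiss survey:

```python
import random

def expected_observed_richness(detect_probs):
    """Expected number of species detected in one survey: species i
    is recorded with probability p_i, so the expectation is sum(p_i)."""
    return sum(detect_probs)

def simulate_observed_richness(detect_probs, n_surveys=10_000, seed=1):
    """Monte Carlo check of the same quantity."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_surveys):
        total += sum(1 for p in detect_probs if rng.random() < p)
    return total / n_surveys

# 32 equally detectable species, each found with probability 0.21:
# a survey records ~6.7 species on average, not 32.
probs = [0.21] * 32
print(round(expected_observed_richness(probs), 2))  # 6.72
```

Under equal detectability the identity reduces to N · p; with species-specific p_i, only the sum changes - which is why differences in p between raptor and control quadrats can mimic a richness effect.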

  4. Bayesian assessment of the expected data impact on prediction confidence in optimal sampling design

    Science.gov (United States)

    Leube, P. C.; Geiges, A.; Nowak, W.

    2012-02-01

    Incorporating hydro(geo)logical data, such as head and tracer data, into stochastic models of (subsurface) flow and transport helps to reduce prediction uncertainty. Because of financial limitations for investigation campaigns, information needs toward modeling or prediction goals should be satisfied efficiently and rationally. Optimal design techniques find the best one among a set of investigation strategies. They optimize the expected impact of data on prediction confidence or related objectives prior to data collection. We introduce a new optimal design method, called PreDIA(gnosis) (Preposterior Data Impact Assessor). PreDIA derives the relevant probability distributions and measures of data utility within a fully Bayesian, generalized, flexible, and accurate framework. It extends the bootstrap filter (BF) and related frameworks to optimal design by marginalizing utility measures over the yet unknown data values. PreDIA is a strictly formal information-processing scheme free of linearizations. It works with arbitrary simulation tools, provides full flexibility concerning measurement types (linear, nonlinear, direct, indirect), allows for any desired task-driven formulations, and can account for various sources of uncertainty (e.g., heterogeneity, geostatistical assumptions, boundary conditions, measurement values, model structure uncertainty, a large class of model errors) via Bayesian geostatistics and model averaging. Existing methods fail to simultaneously provide these crucial advantages, which our method buys at relatively higher computational costs. We demonstrate the applicability and advantages of PreDIA over conventional linearized methods in a synthetic example of subsurface transport. In the example, we show that informative data is often invisible to linearized methods that confuse zero correlation with statistical independence. Hence, PreDIA will often lead to substantially better sampling designs.
Finally, we extend our example to specifically
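
The marginalization step described above - scoring a design by the expected posterior uncertainty, averaged over data values that have not been measured yet - can be sketched with a bootstrap-filter reweighting. Everything below (the Gaussian prior ensemble, the noise level, the two toy "designs") is an illustrative assumption, not the PreDIA implementation:

```python
import math
import random

def expected_posterior_variance(theta, measure, noise_sd, rng, n_rep=200):
    """Preposterior data utility (simplified sketch): expected posterior
    variance of theta, marginalized over the yet-unknown data. Each
    replicate treats one ensemble member as 'truth', simulates its noisy
    datum, reweights the ensemble by the Gaussian likelihood (bootstrap
    filter), and records the weighted posterior variance."""
    d = [measure(t) for t in theta]            # forward-simulated data
    total = 0.0
    for _ in range(n_rep):
        truth = rng.randrange(len(theta))
        obs = d[truth] + rng.gauss(0.0, noise_sd)
        w = [math.exp(-0.5 * ((obs - di) / noise_sd) ** 2) for di in d]
        sw = sum(w)
        w = [wi / sw for wi in w]
        mean = sum(wi * ti for wi, ti in zip(w, theta))
        total += sum(wi * (ti - mean) ** 2 for wi, ti in zip(w, theta))
    return total / n_rep

rng0 = random.Random(0)
prior = [rng0.gauss(0.0, 1.0) for _ in range(500)]   # prior ensemble

# Design A measures theta directly; design B returns a constant (no info).
u_a = expected_posterior_variance(prior, lambda t: t, 0.3, random.Random(1))
u_b = expected_posterior_variance(prior, lambda t: 0.0, 0.3, random.Random(1))
print(u_a < u_b)   # True: the informative design shrinks expected variance
```

Because the utility is evaluated before any datum exists, the candidate design with the smaller expected posterior variance is preferred - the core idea the record describes, here without geostatistics or model averaging.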

  5. ANALYSIS OF SPATIAL STRUCTURE AND SOCIAL SIGNIFICANCE OF A SAMPLE OF HAMMĀMS IN MEDITERRANEAN CITIES

    Directory of Open Access Journals (Sweden)

    Roula Aboukhater

    2008-11-01

    Full Text Available The hammām is a public building which is traditionally closely linked to the socio-cultural norms of the society that it is supposed to serve. This paper seeks to answer questions about the logic by which such buildings respond to those complex socio-cultural relations and the potentials offered by their spatial structures. The hypothesis in analyzing the internal layout is based on the ability of forms to adapt to socio-cultural norms of certain societies and the idea that they could be shaped to respond to social needs and to produce appropriate behavior. This study is based on the analysis of the morphological characteristics of the internal layouts of several hammāms, the socio-historical information, the direct observation of the spaces, and face-to-face interviews with staff, especially those working in hammām Ammuna in Damascus. The main objective is to explore the following questions: (1) How are hammāms "designed" to fulfill users' social needs and their well-being in the internal spaces? (2) How are architectural settings in the internal spaces of the hammām "coded" or "structured" to produce appropriate social practice or behavior? This paper demonstrates that hammāms are the witnesses of a genius loci of adaptation of a building to socio-cultural norms.

  6. GIS-assisted spatial analysis for urban regulatory detailed planning: designer's dimension in the Chinese code system

    Science.gov (United States)

    Yu, Yang; Zeng, Zheng

    2009-10-01

    By discussing the causes behind the high ratio of amendments in the implementation of urban regulatory detailed plans in China, despite their law-ensured status, the study aims to reconcile the conflict between the legal authority of regulatory detailed planning and the insufficient scientific support in its decision-making and compilation by introducing spatial analysis based on GIS technology and 3D modeling into the process, thus presenting a more scientific and flexible approach to regulatory detailed planning in China. The study first points out that the current compilation process of the urban regulatory detailed plan in China employs a mainly empirical approach, which leaves it constantly subject to amendments; the study then discusses the need for and current utilization of GIS in the Chinese system and proposes the framework of a GIS-assisted 3D spatial analysis process from the designer's perspective, which can be regarded as an alternating process between descriptive codes and physical design in the compilation of regulatory detailed planning. With a case study of the processes and results from the application of the framework, the paper concludes that the proposed framework can be an effective instrument which provides more rationality, flexibility, and thus more efficiency to the compilation and decision-making process of the urban regulatory detailed plan in China.

  7. Finding Biomarker Signatures in Pooled Sample Designs: A Simulation Framework for Methodological Comparisons

    Directory of Open Access Journals (Sweden)

    Anna Telaar

    2010-01-01

    Full Text Available Detection of discriminating patterns in gene expression data can be accomplished by using various methods of statistical learning. It has been proposed that sample pooling in this context would have negative effects; however, pooling cannot always be avoided. We propose a simulation framework to explicitly investigate the parameters of patterns, experimental design, noise, and choice of method in order to find out which effects on classification performance are to be expected. We use a two-group classification task and simulated gene expression data with independent differentially expressed genes as well as bivariate linear patterns and the combination of both. Our results show a clear increase of prediction error with pool size. For pooled training sets, powered partial least squares discriminant analysis outperforms discriminant analysis, random forests, and support vector machines with linear or radial kernel for two of three simulated scenarios. The proposed simulation approach can be implemented to systematically investigate a number of additional scenarios of practical interest.

  8. Cluster designs to assess the prevalence of acute malnutrition by lot quality assurance sampling: a validation study by computer simulation.

    Science.gov (United States)

    Olives, Casey; Pagano, Marcello; Deitchler, Megan; Hedt, Bethany L; Egge, Kari; Valadez, Joseph J

    2009-04-01

    Traditional lot quality assurance sampling (LQAS) methods require simple random sampling to guarantee valid results. However, cluster sampling has been proposed to reduce the number of random starting points. This study uses simulations to examine the classification error of two such designs, a 67x3 (67 clusters of three observations) and a 33x6 (33 clusters of six observations) sampling scheme, to assess the prevalence of global acute malnutrition (GAM). Further, we explore the use of a 67x3 sequential sampling scheme for LQAS classification of GAM prevalence. Results indicate that, for independent clusters with moderate intracluster correlation for the GAM outcome, the three sampling designs maintain approximate validity for LQAS analysis. Sequential sampling can substantially reduce the average sample size required for data collection. The presence of intercluster correlation can dramatically impact the classification error associated with LQAS analysis.
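
A simulation of the kind this record describes can be set up in a few lines. The sketch below induces intracluster correlation with a beta-binomial model and applies a simple threshold rule; the 5% prevalence, correlation of 0.05, and decision value d = 15 are illustrative choices, not the study's parameters:

```python
import random

def simulate_lqas_error(p_true, rho, d, n_clusters=67, m=3,
                        n_sims=2000, seed=42):
    """Fraction of simulated surveys classified as 'high prevalence'
    (total cases >= d) under a clustered design, e.g. 67 clusters of 3.
    Intracluster correlation rho is induced by drawing each cluster's
    prevalence from a Beta distribution with mean p_true."""
    rng = random.Random(seed)
    a = p_true * (1.0 - rho) / rho
    b = (1.0 - p_true) * (1.0 - rho) / rho
    high = 0
    for _ in range(n_sims):
        cases = 0
        for _ in range(n_clusters):
            pc = rng.betavariate(a, b)       # cluster-level prevalence
            cases += sum(rng.random() < pc for _ in range(m))
        if cases >= d:
            high += 1
    return high / n_sims

# An area with 5% true GAM prevalence should rarely be flagged as high.
err = simulate_lqas_error(p_true=0.05, rho=0.05, d=15)
print(err)
```

Raising rho inflates the variance of the total count (design effect 1 + (m - 1)·rho for equal clusters), which is the mechanism behind the classification-error increase the authors report.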

  9. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    Science.gov (United States)

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
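
The two ideas in this record - forming an image by summing a per-photon probability density rather than binning counts, and localizing a particle from the photon positions themselves - can be sketched as follows. The frame is synthetic (a 200-photon Gaussian cloud; all sizes and widths are arbitrary illustrative values, and the mean stands in for the multivariate normal fit):

```python
import math
import random

def localize(photons):
    """Estimate the particle position as the sample mean of the photon
    positions (a simplified stand-in for the multivariate-normal method)."""
    n = len(photons)
    return (sum(x for x, _ in photons) / n,
            sum(y for _, y in photons) / n)

def render(photons, size=32, sigma=1.5):
    """Sum one Gaussian probability density per photon instead of
    incrementing a pixel count, as in photon event distribution sampling."""
    img = [[0.0] * size for _ in range(size)]
    norm = 1.0 / (2.0 * math.pi * sigma ** 2)
    for px, py in photons:
        for i in range(size):
            for j in range(size):
                dx, dy = j + 0.5 - px, i + 0.5 - py
                img[i][j] += norm * math.exp(-(dx * dx + dy * dy)
                                             / (2.0 * sigma ** 2))
    return img

rng = random.Random(7)
truth = (16.0, 16.0)
photons = [(rng.gauss(truth[0], 2.0), rng.gauss(truth[1], 2.0))
           for _ in range(200)]                 # ~200 photons per frame
est = localize(photons)
img = render(photons)
print(est)   # near (16, 16); error scales as spread / sqrt(n_photons)
```

With 200 photons and a spread of 2 units, the standard error of the mean is about 0.14 units - the same square-root-of-counts scaling that underlies the nanometre-scale precisions quoted in the record.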

  10. Failure Models of Thin-walled Steel Sheeting and Structural-spatial Design Process

    NARCIS (Netherlands)

    Hofmeyer, H.

    2009-01-01

    This presentation is the first on 20 years of research on the failure mechanisms of sheeting subjected to combined concentrated load and bending moment, performed at Technische Universiteit Eindhoven. The aim of this research is to develop accurate, insight-providing design rules using simple

  11. Performance Based Envelopes: A Theory of Spatialized Skins and the Emergence of the Integrated Design Professional

    Directory of Open Access Journals (Sweden)

    Franca Trubiano

    2013-10-01

    Full Text Available Realigning the design of building envelopes within the measures of air, light and heat has rendered possible an inventive form of practice whose benefits are far in excess of the metrics of data and analysis. For many of its most advanced practitioners, the contemporary design of facades engages the true potential of “performance” when it deepens, broadens and complicates the theoretical dimension of this most liminal of surfaces. Of particular interest to this paper is a discussion of new theoretical paradigms associated with the design and operation of high performance envelopes of which four characteristics of this emergent sub-discipline are herein examined. To begin with, the way in which building envelopes are no longer separators, dividers and barriers between a building’s interior and exterior conditions, but rather, “spatially” defined environments that fully engage the totality of a building’s engineering systems, is discussed. Cantilevered Louvers, Double Skin Facades and Hybrid Conditioned Atria are representative of this new paradigm as is the use of Responsive Technologies to optimize their behaviors. Lastly, the paper examines the rise of the new integrated design building envelope professional called upon to deliver ever-better performing skins, whether in the guise of energy modeler, climate engineer or façade construction specialist. Hence, this paper develops a theoretical structure within which to describe, analyze and interpret the values made possible by this new and expanding field of performance based envelopes.

  12. On store design and consumer motivation: spatial control and arousal in the retail context

    NARCIS (Netherlands)

    van Rompay, T.J.L.; Tanja-Dijkstra, K.; Verhoeven, J.W.M.; van Es, A.F.

    2012-01-01

    Research testifies to the influence of environmental factors in shopping environments. However, few studies examine effects of store design in interaction with shoppers’ motivations. The authors propose that task-oriented shoppers prefer stores that are spacious, whereas recreational shoppers enjoy

  15. Hyper-Morphology : Experimentations with bio-inspired design processes for adaptive spatial re-use

    NARCIS (Netherlands)

    Biloria, N.; Chang, J.R.

    2013-01-01

    Hyper-Morphology is an on-going research outlining a bottom-up evolutionary design process based on autonomous cellular building components. The research interfaces critical operational traits of the natural world (Evolutionary Development Biology, Embryology and Cellular Differentiation) with

  16. Hyper-morphology : Experimentations with bio-inspired design processes for adaptive spatial re-use

    NARCIS (Netherlands)

    Chang, J.R.

    2014-01-01

    This article is a newer version of a paper originally published in the eCAADe 2013 Conference Proceedings Computation & Performance. Hyper-Morphology is an on-going research outlining a bottom-up evolutionary design process based on autonomous cellular building components. The research interfaces

  17. Bayesian Spatial Design of Optimal Deep Tubewell Locations in Matlab, Bangladesh.

    Science.gov (United States)

    Warren, Joshua L; Perez-Heydrich, Carolina; Yunus, Mohammad

    2013-09-01

    We introduce a method for statistically identifying the optimal locations of deep tubewells (dtws) to be installed in Matlab, Bangladesh. Dtw installations serve to mitigate exposure to naturally occurring arsenic found at groundwater depths less than 200 meters, a serious environmental health threat for the population of Bangladesh. We introduce an objective function, which incorporates both arsenic level and nearest town population size, to identify optimal locations for dtw placement. Assuming complete knowledge of the arsenic surface, we then demonstrate how minimizing the objective function over a domain favors dtws placed in areas with high arsenic values and close to largely populated regions. Given only a partial realization of the arsenic surface over a domain, we use a Bayesian spatial statistical model to predict the full arsenic surface and estimate the optimal dtw locations. The uncertainty associated with these estimated locations is correctly characterized as well. The new method is applied to a dataset from a village in Matlab and the estimated optimal locations are analyzed along with their respective 95% credible regions.
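
The objective-function idea - trade off local arsenic level against distance to population - can be made concrete with a small grid search. The surface, towns, and weighting below are hypothetical stand-ins; the paper's actual objective and its Bayesian prediction of the arsenic surface are not reproduced here:

```python
import math

def objective(site, arsenic, towns):
    """Score a candidate dtw location: high nearby arsenic is rewarded
    (those users most need a safe source) and distance to the nearest
    town is penalized, discounted by that town's population.  The
    weight 100.0 is an arbitrary illustrative choice."""
    x, y = site
    dist_term = min(math.hypot(x - tx, y - ty) / pop
                    for tx, ty, pop in towns)
    return arsenic(x, y) - 100.0 * dist_term

def best_site(grid, arsenic, towns):
    """Grid-search maximizer; the paper instead optimizes over a
    Bayesian spatial prediction of the full arsenic surface."""
    return max(grid, key=lambda s: objective(s, arsenic, towns))

# Hypothetical arsenic surface (ppb) peaking at (3, 7), and two towns.
surface = lambda x, y: 500.0 * math.exp(-((x - 3) ** 2 + (y - 7) ** 2) / 8.0)
towns = [(2.0, 6.0, 5000), (9.0, 1.0, 800)]
grid = [(0.5 * i, 0.5 * j) for i in range(21) for j in range(21)]
print(best_site(grid, surface, towns))
```

Given complete knowledge of the surface, the maximizer lands on the high-arsenic area closest to the large town, which is exactly the behavior the abstract describes; replacing the known surface with a posterior predictive draw yields the uncertainty-aware version.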

  18. Spatial distributions of the red palm mite, Raoiella indica (Acari: Tenuipalpidae) on coconut and their implications for development of efficient sampling plans

    DEFF Research Database (Denmark)

    Roda, A.; Nachman, G.; Hosein, F.

    2012-01-01

    The red palm mite (Raoiella indica), an invasive pest of coconut, entered the Western hemisphere in 2004, then rapidly spread through the Caribbean and into Florida, USA. Developing effective sampling methods may aid in the timely detection of the pest in a new area. Studies were conducted...... to provide and compare intra tree spatial distribution of red palm mite populations on coconut in two different geographical areas, Trinidad and Puerto Rico, recently invaded by the mite. The middle stratum of a palm hosted significantly more mites than fronds from the upper or lower canopy and fronds from...

  19. Design and Use of a Full Flow Sampling System (FFS) for the Quantification of Methane Emissions.

    Science.gov (United States)

    Johnson, Derek R; Covington, April N; Clark, Nigel N

    2016-06-12

    The use of natural gas continues to grow with increased discovery and production of unconventional shale resources. At the same time, the natural gas industry faces continued scrutiny for methane emissions from across the supply chain, due to methane's relatively high global warming potential (25-84x that of carbon dioxide, according to the Energy Information Administration). Currently, a variety of techniques of varied uncertainties exists to measure or estimate methane emissions from components or facilities. Currently, only one commercial system is available for quantification of component level emissions and recent reports have highlighted its weaknesses. In order to improve accuracy and increase measurement flexibility, we have designed, developed, and implemented a novel full flow sampling system (FFS) for quantification of methane emissions and greenhouse gases based on transportation emissions measurement principles. The FFS is a modular system that consists of an explosive-proof blower(s), mass airflow sensor(s) (MAF), thermocouple, sample probe, constant volume sampling pump, laser based greenhouse gas sensor, data acquisition device, and analysis software. Dependent upon the blower and hose configuration employed, the current FFS is able to achieve a flow rate ranging from 40 to 1,500 standard cubic feet per minute (SCFM). Utilization of laser-based sensors mitigates interference from higher hydrocarbons (C2+). Co-measurement of water vapor allows for humidity correction. The system is portable, with multiple configurations for a variety of applications ranging from being carried by a person to being mounted in a hand drawn cart, on-road vehicle bed, or from the bed of utility terrain vehicles (UTVs). The FFS is able to quantify methane emission rates with a relative uncertainty of ± 4.4%. The FFS has proven, real world operation for the quantification of methane emissions occurring in conventional and remote facilities.
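
The quantification step itself reduces to multiplying the measured duct flow by the methane mole fraction, with the co-measured water vapour used to put the concentration on a dry basis. The function and numbers below are an illustrative sketch of that arithmetic, not the published FFS data-reduction procedure:

```python
def methane_rate_scfm(flow_scfm, ch4_ppm_wet, h2o_ppm):
    """Emission rate = captured flow x methane mole fraction.
    The wet-basis reading is first corrected using the water vapour
    measured alongside it (humidity correction)."""
    ch4_ppm_dry = ch4_ppm_wet / (1.0 - h2o_ppm / 1e6)
    return flow_scfm * ch4_ppm_dry / 1e6   # SCFM of methane

# 1,000 SCFM of captured air at 500 ppm methane (wet) and 2% water vapour:
rate = methane_rate_scfm(flow_scfm=1000.0, ch4_ppm_wet=500.0, h2o_ppm=20000.0)
print(round(rate, 4))   # 0.5102 SCFM of methane
```

The quoted ±4.4% relative uncertainty for the FFS then applies to rates computed from this product of measured flow and concentration.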

  1. The design of high-temperature thermal conductivity measurements apparatus for thin sample size

    Directory of Open Access Journals (Sweden)

    Hadi Syamsul

    2017-01-01

    Full Text Available This study presents the design, construction, and validation of a thermal conductivity apparatus using steady-state heat-transfer techniques, with the capability of testing a material at high temperatures. The design is an improvement on the ASTM D5470 standard, in which meter-bars of equal cross-sectional area are used to extrapolate surface temperatures and measure heat transfer across a sample. The apparatus has two meter-bars, each instrumented with three thermocouples. It uses a 1,000 W heater and cooling water to reach a stable condition. A pressure of 3.4 MPa was applied over the 113.09 mm² cross-sectional area of the meter-bar, and thermal grease was used to minimize interfacial thermal contact resistance. Performance was validated by comparing the results with thermal conductivities obtained with a LINSEIS THB 500. The tests gave thermal conductivities for stainless steel and bronze of 15.28 Wm⁻¹K⁻¹ and 38.01 Wm⁻¹K⁻¹, differing from the THB 500 results by −2.55% and 2.49%, respectively. Furthermore, the apparatus can measure thermal conductivity up to a temperature of 400°C, where the result for stainless steel is 19.21 Wm⁻¹K⁻¹ and the difference is 7.93%.
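
The data reduction behind this meter-bar approach is compact: fit each bar's thermocouple readings with a line, extrapolate both profiles to the sample faces, and apply Fourier's law k = qL/(AΔT). A minimal sketch with made-up readings follows; the 113.09 mm² area matches the abstract, but the heat flow, geometry, and temperatures are illustrative:

```python
def linear_fit(xs, ys):
    """Least-squares line y = a + b*x through one meter-bar's
    thermocouple positions (mm) and temperatures (deg C)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def conductivity(q_w, area_mm2, thickness_mm,
                 hot_pos, hot_temps, cold_pos, cold_temps):
    """Extrapolate each bar's profile to its sample face (taken as
    position 0), then apply Fourier's law k = q*L / (A*dT)."""
    t_hot = linear_fit(hot_pos, hot_temps)[0]     # intercept at the face
    t_cold = linear_fit(cold_pos, cold_temps)[0]
    return (q_w * thickness_mm / 1000.0) / ((area_mm2 / 1e6) * (t_hot - t_cold))

# Three thermocouples per bar; the profiles extrapolate to 120 C and 80 C.
k = conductivity(50.0, 113.09, 3.0,
                 [30.0, 20.0, 10.0], [132.0, 128.0, 124.0],
                 [10.0, 20.0, 30.0], [76.0, 72.0, 68.0])
print(round(k, 2))   # 33.16 W/(m K) for this made-up data
```

Extrapolating rather than measuring at the faces is what removes the thermocouple-to-face offset; the remaining systematic error is the interfacial contact resistance that the applied pressure and thermal grease are meant to minimize.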

  2. Changes in tar yields and cigarette design in samples of Chinese cigarettes, 2009 and 2012.

    Science.gov (United States)

    Schneller, Liane M; Zwierzchowski, Benjamin A; Caruso, Rosalie V; Li, Qiang; Yuan, Jiang; Fong, Geoffrey T; O'Connor, Richard J

    2015-11-01

    China is home to the greatest number of smokers as well as the greatest number of smoking-related deaths. An active and growing market of cigarettes marketed as 'light' or 'low tar' may keep health-concerned smokers from quitting, wrongly believing that such brands are less harmful. This study sought to observe changes in cigarette design characteristics and reported tar, nicotine and carbon monoxide (TNCO) levels in a sample of cigarette brands obtained in seven Chinese cities from 2009 to 2012. Cigarettes were purchased and shipped to Roswell Park Cancer Institute, where 91 pairs of packs were selected for physical cigarette design characteristic testing and recording of TNCO values. Data analysis was conducted using SPSS, and was initially characterised using descriptive statistics, correlations and generalised estimating equations to observe changes in brand varieties over time. Reported TNCO values on packs saw mean tar, nicotine and CO levels decrease from 2009 to 2012 by 7.9%, 4.5% and 6.0%, respectively. Ventilation was the only cigarette design feature that significantly changed over time (p<0.001), with an increase of 31.7%. Significant predictors of tar and CO yield overall were ventilation and per-cigarette tobacco weight, while for nicotine tobacco moisture was also an independent predictor of yield. The use of ventilation to decrease TNCO emissions is misleading smokers to believe that they are smoking a 'light/low' tar cigarette that is healthier, and is potentially forestalling the quitting behaviours that would begin to reduce the health burden of tobacco in China, and so should be prohibited.

  3. Sustainable Urban Development: Spatial Analyses as Novel Tools for Planning a Universally Designed City

    Directory of Open Access Journals (Sweden)

    Joanna Borowczyk

    2018-05-01

    Full Text Available The aim of the research was to analyze the "design for all" concept as a key strategy for creating social sustainability. The paper attempts to answer the question: how can universal design contribute to the rational development of the city space? The author has taken part in participatory experiments. The research took into account various criteria, including the level of the city space's adaptation to the needs and capabilities of persons with different disabilities. Analyses included qualitative studies concerning the possibilities of developing the social capital as well as creating and preserving a cohesive social structure. The analytic process made it possible to determine means of raising the quality of urban planning. Finding effective and reliable analytical tools enabling the development of healthy cities which are compatible with the principles of sustainability could become both a great opportunity and a great challenge for urban planners. Transition from the microplanning to the macroplanning scale and following the principles of universal design at the stage of the formation of urban concepts using spatiotemporal modelling methods will lead to the creation of harmonious accessible spaces adjusted to the needs of present and future users, which will generate sustainable development and lead to the healing of a city.

  4. Deserts and holy mountains of medieval Serbia: Written sources, spatial patterns, architectural designs

    Directory of Open Access Journals (Sweden)

    Popović Danica

    2007-01-01

    meaning and the function of the monastic locales labeled as deserts and holy mountains (and, in a limited number of cases, also known as caves). The most important conclusions that may be drawn would be the following: the terms are interchangeable and were used both in a broader and a narrower sense, but in either case in reference to the space intended for higher forms of monastic life. The term desert had a particularly broad range of meanings: it could refer to a distinct locale, as a rule a river gorge, or a mountain inhabited by hermits, but also to a cave hermitage, the hesychasterion of a coenobitic community. The distinct forms of monastic life in such areas were communities of two or three or a few monks, organized as a skete or as a cell. In the deserts and mountains hermits primarily pursued the practice of 'agon and hesychia', but were also engaged in manuscript copying - an important peculiarity of Serbian eremitic monasticism. Finally, such locales were thought of by their dwellers as spiritual cities and the narrow path leading to Heavenly Jerusalem. The other thematic focus is an analysis of spatial patterns and architectural structures based on the relevant examples studied so far. Different types of monastic communities functioning as deserts were considered from the point of view of their spatial situation and their relationship to the coenobia. In this context, field research identified examples of the so-called internal deserts, which was reconfirmed by the records in written sources. Special attention was given to the mechanism for creating a holy mountain in the Serbian environment according to the recognizable Athonite model. Also analyzed were architectural solutions characteristic of Serbian monastic deserts, from the simplest ones, such as wooden huts and walled-up caves, to monumental multi-storied edifices equipped with different features.
Finally, the conclusions that have been reached serve as a basis for defining future priorities in the

  5. Design and Demonstration of a Material-Plasma Exposure Target Station for Neutron Irradiated Samples

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, Juergen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Aaron, A. M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bell, Gary L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burgess, Thomas W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ellis, Ronald James [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Giuliano, D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Howard, R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kiggans, James O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lessard, Timothy L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ohriner, Evan Keith [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Perkins, Dale E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Varma, Venugopal Koikal [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-10-20

    Steady-state heat fluxes of 5–20 MW/m² and ion fluxes up to 10²⁴ m⁻²s⁻¹. Since PFCs will have to withstand neutron irradiation displacement damage up to 50 dpa, the target station design must accommodate radioactive specimens (materials to be irradiated in HFIR or at SNS) to enable investigations of the impact of neutron damage on materials. Therefore, the system will have to be able to install and extract irradiated specimens using equipment and methods to avoid sample modification, control contamination, and minimize worker dose. Included in the design considerations will be an assessment of all the steps between neutron irradiation and post-exposure materials examination/characterization, as well as an evaluation of the facility hazard categorization. In particular, the factors associated with the acquisition of radioactive specimens and their preparation, transportation, experimental configuration at the plasma-specimen interface, post-plasma-exposure sample handling, and specimen preparation will be evaluated. Neutronics calculations to determine the dose rates of the samples were carried out for a large number of potential plasma-facing materials.

  6. Design and Demonstration of a Material-Plasma Exposure Target Station for Neutron Irradiated Samples

    International Nuclear Information System (INIS)

    Rapp, Juergen; Aaron, A. M.; Bell, Gary L.; Burgess, Thomas W.; Ellis, Ronald James; Giuliano, D.; Howard, R.; Kiggans, James O.; Lessard, Timothy L.; Ohriner, Evan Keith; Perkins, Dale E.; Varma, Venugopal Koikal

    2015-01-01

    5–20 MW/m² and ion fluxes up to 10²⁴ m⁻²s⁻¹. Since PFCs will have to withstand neutron irradiation displacement damage up to 50 dpa, the target station design must accommodate radioactive specimens (materials to be irradiated in HFIR or at SNS) to enable investigations of the impact of neutron damage on materials. Therefore, the system will have to be able to install and extract irradiated specimens using equipment and methods to avoid sample modification, control contamination, and minimize worker dose. Included in the design considerations will be an assessment of all the steps between neutron irradiation and post-exposure materials examination/characterization, as well as an evaluation of the facility hazard categorization. In particular, the factors associated with the acquisition of radioactive specimens and their preparation, transportation, experimental configuration at the plasma-specimen interface, post-plasma-exposure sample handling, and specimen preparation will be evaluated. Neutronics calculations to determine the dose rates of the samples were carried out for a large number of potential plasma-facing materials.

  7. Two specialized delayed-neutron detector designs for assays of fissionable elements in water and sediment samples

    International Nuclear Information System (INIS)

    Balestrini, S.J.; Balagna, J.P.; Menlove, H.O.

    1976-01-01

    Two specialized neutron-sensitive detectors are described which are employed for rapid assays of fissionable elements by sensing the delayed neutrons emitted by samples after they have been irradiated in a nuclear reactor. The more sensitive of the two detectors, designed to assay uranium in water samples, is 40% efficient; the other, designed for sediment sample assays, is 27% efficient. These detectors are also designed to operate under water as inexpensive shielding against neutron leakage from the reactor and neutrons from cosmic rays. (Auth.)

  8. A spatially distributed isotope sampling network in a snow-dominated catchment for the quantification of snow meltwater

    Science.gov (United States)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James

    2017-04-01

    In mountainous catchments with seasonal snowpacks, river discharge in downstream valleys is largely sustained by snowmelt in spring and summer. Future climate warming will likely reduce snow volumes and lead to earlier and faster snowmelt in such catchments. This, in turn, may increase the risk of summer low flows and hydrological droughts. Improved runoff predictions are thus required in order to adapt water management to future climatic conditions and to assure the availability of fresh water throughout the year. However, a detailed understanding of the hydrological processes is crucial to obtain robust predictions of river streamflow. This in turn requires fingerprinting source areas of streamflow, tracing water flow pathways, and measuring timescales of catchment storage, using tracers such as stable water isotopes (18O, 2H). For this reason, we have established an isotope sampling network in the Alptal, a snowmelt-dominated catchment (46.4 km²) in central Switzerland, as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Precipitation and snow cores are analyzed for their isotopic signature at daily or weekly intervals. Three-week bulk samples of precipitation are also collected on a transect along the Alptal valley bottom, and along an elevational transect perpendicular to the Alptal valley axis. Streamwater samples are taken at the catchment outlet as well as in two small nested sub-catchments. An automatic snow lysimeter system was developed, which also facilitates real-time monitoring of snowmelt events, system status and environmental conditions (air and soil temperature). Three lysimeter systems were installed within the catchment, in one forested site and two open field sites at different elevations, and have been operational since November 2016. We will present the isotope time series from our regular sampling network, as well as initial results from our snowmelt lysimeter sites.
Our

  9. Design and Validation of a Cyclic Strain Bioreactor to Condition Spatially-Selective Scaffolds in Dual Strain Regimes

    Directory of Open Access Journals (Sweden)

    J. Matthew Goodhart

    2014-03-01

    Full Text Available The objective of this study was to design and validate a unique bioreactor for applying spatially selective, linear, cyclic strain to degradable and non-degradable polymeric fabric scaffolds. This system uses a novel three-clamp design to apply cyclic strain via a computer-controlled linear actuator to a specified zone of a scaffold while isolating the remainder of the scaffold from strain. Image analysis of polyethylene terephthalate (PET) woven scaffolds subjected to a 3% mechanical stretch demonstrated that the stretched portion of the scaffold experienced 2.97% ± 0.13% strain (mean ± standard deviation) while the unstretched portion experienced 0.02% ± 0.18% strain. NIH-3T3 fibroblast cells were cultured on the PET scaffolds and half of each scaffold was stretched 5% at 0.5 Hz for one hour per day for 14 days in the bioreactor. Cells were checked for viability and proliferation at the end of the 14-day period, and levels of glycosaminoglycan (GAG) and collagen (hydroxyproline) were measured as indicators of extracellular matrix production. Scaffolds in the bioreactor showed a seven-fold increase in cell number over scaffolds cultured statically in tissue culture plastic petri dishes (control). Bioreactor scaffolds showed a lower concentration of GAG deposition per cell compared to the control scaffolds, largely due to the great increase in cell number. A 75% increase in hydroxyproline concentration per cell was seen in the bioreactor-stretched scaffolds compared to the control scaffolds. Surprisingly, little difference was observed between the stretched and unstretched portions of the scaffolds in this study. This was largely attributed to the effect of the conditioned, shared media. Results indicate that the bioreactor system is capable of applying spatially-selective, linear, cyclic strain to cells growing on polymeric fabric scaffolds and evaluating the cellular and matrix responses to the applied strains.

  10. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
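The sampling scheme described here, a single random start followed by equidistant sites, can be sketched in a few lines of Python; the function name and the 100-position example are illustrative, not from the paper:

```python
import random

def systematic_random_sample(n_items, interval):
    """Systematic random sampling: pick a random start in [0, interval),
    then visit every `interval`-th position across the structure."""
    start = random.randrange(interval)   # the single random decision
    return list(range(start, n_items, interval))

# Example: 100 candidate positions sampled at an interval of 10.
random.seed(1)
sites = systematic_random_sample(100, 10)
print(sites)   # 10 equidistant sites, one per interval
```

Because only the start point is random, every position has the same inclusion probability while the sites remain evenly spread over the structure.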

  11. Comparison of Sampling Designs for Estimating Deforestation from Landsat TM and MODIS Imagery: A Case Study in Mato Grosso, Brazil

    Directory of Open Access Journals (Sweden)

    Shanyou Zhu

    2014-01-01

    Full Text Available Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.

  12. Comparison of sampling designs for estimating deforestation from landsat TM and MODIS imagery: a case study in Mato Grosso, Brazil.

    Science.gov (United States)

    Zhu, Shanyou; Zhang, Hailong; Liu, Ronggao; Cao, Yun; Zhang, Guixin

    2014-01-01

    Sampling designs are commonly used to estimate deforestation over large areas, but comparisons between different sampling strategies are required. Using PRODES deforestation data as a reference, deforestation in the state of Mato Grosso in Brazil from 2005 to 2006 is evaluated using Landsat imagery and a nearly synchronous MODIS dataset. The MODIS-derived deforestation is used to assist in sampling and extrapolation. Three sampling designs are compared according to the estimated deforestation of the entire study area based on simple extrapolation and linear regression models. The results show that stratified sampling for strata construction and sample allocation using the MODIS-derived deforestation hotspots provided more precise estimations than simple random and systematic sampling. Moreover, the relationship between the MODIS-derived and TM-derived deforestation provides a precise estimate of the total deforestation area as well as the distribution of deforestation in each block.
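A minimal sketch of the stratified estimation idea, under stated assumptions: MODIS-derived deforestation is known for every block, TM-derived deforestation only for sampled blocks, and a per-stratum ratio correction stands in for the paper's regression models. All names and the data layout are hypothetical:

```python
import random

def stratified_regression_estimate(blocks, n_per_stratum, seed=0):
    """Estimate total TM-derived deforestation from a stratified sample,
    using MODIS-derived deforestation (known for every block) as the
    auxiliary variable in a per-stratum ratio correction."""
    rng = random.Random(seed)
    total = 0.0
    for s in {b['stratum'] for b in blocks}:
        members = [b for b in blocks if b['stratum'] == s]
        sample = rng.sample(members, min(n_per_stratum, len(members)))
        tm_sum = sum(b['tm'] for b in sample)       # known only when sampled
        modis_sum = sum(b['modis'] for b in sample)
        ratio = tm_sum / modis_sum if modis_sum else 0.0
        # Extrapolate the sampled TM/MODIS ratio over the stratum's
        # full MODIS total.
        total += ratio * sum(b['modis'] for b in members)
    return total
```

Stratifying on MODIS hotspot intensity concentrates the sample where deforestation varies most, which is why it outperforms simple random and systematic sampling in the study.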

  13. Heavy metals in soils of Hechuan County in the upper Yangtze (SW China): Comparative pollution assessment using multiple indices with high-spatial-resolution sampling.

    Science.gov (United States)

    Ni, Maofei; Mao, Rong; Jia, Zhongmin; Dong, Ruozhu; Li, Siyue

    2018-02-01

    In order to assess heavy metals (HMs) in soils of the upper Yangtze Basin, very high-spatial-resolution sampling (582 soil samples) was conducted in Hechuan County, an important agricultural area in Southwest China. Multiple indices, including the geoaccumulation index (Igeo), enrichment factor (EF), sediment pollution index (SPI) and risk index (RI), as well as multivariate statistics, were employed for pollution assessment and source identification of HMs in soils. Our results demonstrated that the averages of the eight HMs decreased in the following order: Zn (82.8 ± 15.9) > Cr (71.6 ± 12.2) > Ni (32.1 ± 9.89) > Pb (27.6 ± 13.8) > Cu (25.9 ± 11.8) > As (5.48 ± 3.42) > Cd (0.30 ± 0.077) > Hg (0.082 ± 0.092). Averages of all HMs except Cd were below the threshold values of the Environmental Quality Standard for Soils, while 43% of samples exceeded the national standard for Cd, 1% for Hg and 5% for Ni; moreover, the Cd and Hg averages were much higher than their background levels. Igeo and EF indicated that enrichment decreased as follows: Cd > Hg > Zn > Pb > Ni > Cu > Cr > As, with moderate enrichment of Cd and Hg. RI indicated that 61.7% of all samples posed moderate risk, while the 6.5% of samples posing considerable or greater risk due to human activities deserve particular attention. Multivariate analysis showed a lithogenic source for Cu, Cr, Ni and Zn, while Cd and Hg were largely contributed by anthropogenic activities such as agricultural practices. Our study should help improve soil environmental quality in Southwest China and supplies approaches applicable to other areas with soil HM pollution. Copyright © 2017 Elsevier Inc. All rights reserved.
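The indices named above have standard closed forms (Igeo = log2(Cn / 1.5Bn); EF normalizes against a conservative reference element). A sketch with a hypothetical Cd background value, since the study's own background levels are not given here:

```python
import math

def igeo(conc, background):
    """Geoaccumulation index: Igeo = log2(Cn / (1.5 * Bn)); the factor
    1.5 compensates for natural fluctuations of the background."""
    return math.log2(conc / (1.5 * background))

def enrichment_factor(conc, ref_conc, background, ref_background):
    """EF = (Cn / Cref)_sample / (Bn / Bref)_background, normalizing by
    a conservative reference element such as Fe or Al."""
    return (conc / ref_conc) / (background / ref_background)

# Hypothetical Cd example: measured 0.30 mg/kg against an assumed
# background of 0.10 mg/kg.
print(round(igeo(0.30, 0.10), 2))  # -> 1.0, i.e. moderately polluted
```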

  14. Sampling designs for contaminant temporal trend analyses using sedentary species exemplified by the snails Bellamya aeruginosa and Viviparus viviparus.

    Science.gov (United States)

    Yin, Ge; Danielsson, Sara; Dahlberg, Anna-Karin; Zhou, Yihui; Qiu, Yanling; Nyberg, Elisabeth; Bignert, Anders

    2017-10-01

    Environmental monitoring typically assumes samples and sampling activities to be representative of the population being studied. Given a limited budget, an appropriate sampling strategy is essential to support detection of temporal trends of contaminants. In the present study, based on real chemical analysis data on polybrominated diphenyl ethers in snails collected from five subsites in Tianmu Lake, computer simulation was performed to evaluate three sampling strategies by estimating the sample size required to detect an annual change of 5% with a statistical power of 80% and 90% at a significance level of 5%. The results showed that sampling from an arbitrarily selected sampling spot is the worst strategy, requiring many more individual analyses to achieve the above criteria than the other two approaches. A fixed sampling site requires the lowest sample size but may not be representative of the intended study object, e.g. a lake, and is also sensitive to changes at that particular sampling site. In contrast, sampling at multiple sites along the shore each year, and using pooled samples when the cost to collect and prepare individual specimens is much lower than the cost of chemical analysis, would be the most robust and cost-efficient strategy in the long run. Using statistical power as the criterion, the results demonstrated quantitatively the consequences of various sampling strategies, and could guide users with respect to the sample sizes required by each sampling design for long-term monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
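The kind of power calculation described, simulating data under a 5% annual change and counting significant trend tests, can be sketched as follows; the lognormal error model, the normal critical value, and all parameter values are assumptions for illustration, not the study's:

```python
import math
import random

def trend_power(years=10, n_per_year=5, annual_change=0.05,
                cv=0.3, alpha_z=1.96, n_sim=500, seed=0):
    """Monte Carlo power to detect a log-linear trend: simulate lognormal
    concentrations changing by `annual_change` per year, fit an OLS line
    to log-concentration vs. year, and count significant slopes
    (two-sided; normal critical value as a crude stand-in for t)."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cv ** 2))   # lognormal sigma from CV
    hits = 0
    for _ in range(n_sim):
        xs, ys = [], []
        for t in range(years):
            mu = t * math.log(1.0 - annual_change)  # true log-level, year t
            for _ in range(n_per_year):
                xs.append(t)
                ys.append(rng.gauss(mu, sigma))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
        resid = [y - (my + slope * (x - mx)) for x, y in zip(xs, ys)]
        se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
        if abs(slope / se) > alpha_z:
            hits += 1
    return hits / n_sim

# Power for 5 specimens per year over 10 years at a 5% annual decline:
print(round(trend_power(n_sim=200), 2))
```

The required sample size per strategy falls out by increasing `n_per_year` until the returned power crosses the 80% or 90% target.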

  15. The effect of the configuration and the interior design of a virtual weightless space station on human spatial orientation

    Science.gov (United States)

    Aoki, Hirofumi; Ohno, Ryuzo; Yamaguchi, Takao

    2005-05-01

    In a virtual weightless environment, subjects' orientation skills were studied to examine what kind of cognitive errors people make when they moved through the interior space of virtual space stations and what kind of visual information effectively decreases those errors. Subjects wearing a head-mounted display moved from one end to the other end in space station-like routes constructed of rectangular and cubical modules, and did Pointing and Modeling tasks. In Experiment 1, configurations of the routes were changed with such variables as the number of bends, the number of embedding planes, and the number of planes with respect to the body posture. The results indicated that spatial orientation ability was relevant to the variables and that orientational errors were explained by two causes. One of these was that the place, the direction, and the sequence of turns were incorrect. The other was that subjects did not recognize the rotation of the frame of reference, especially when they turned in pitch direction rather than in yaw. In Experiment 2, the effect of the interior design was examined by testing three design settings. Wall colors that showed the allocentric frame of reference and the different interior design of vertical and horizontal modules were effective; however, there was a limit to the effectiveness in complicated configurations.

  16. The effect of the configuration and the interior design of a virtual weightless space station on human spatial orientation.

    Science.gov (United States)

    Aoki, Hirofumi; Ohno, Ryuzo; Yamaguchi, Takao

    2005-01-01

    In a virtual weightless environment, subjects' orientation skills were studied to examine what kind of cognitive errors people make when they moved through the interior space of virtual space stations and what kind of visual information effectively decreases those errors. Subjects wearing a head-mounted display moved from one end to the other end in space station-like routes constructed of rectangular and cubical modules, and did Pointing and Modeling tasks. In Experiment 1, configurations of the routes were changed with such variables as the number of bends, the number of embedding planes, and the number of planes with respect to the body posture. The results indicated that spatial orientation ability was relevant to the variables and that orientational errors were explained by two causes. One of these was that the place, the direction, and the sequence of turns were incorrect. The other was that subjects did not recognize the rotation of the frame of reference, especially when they turned in pitch direction rather than in yaw. In Experiment 2, the effect of the interior design was examined by testing three design settings. Wall colors that showed the allocentric frame of reference and the different interior design of vertical and horizontal modules were effective; however, there was a limit to the effectiveness in complicated configurations. © 2005 Published by Elsevier Ltd.

  17. Multiobjective Sampling Design for Calibration of Water Distribution Network Model Using Genetic Algorithm and Neural Network

    Directory of Open Access Journals (Sweden)

    Kourosh Behzadian

    2008-03-01

    Full Text Available In this paper, a novel multiobjective optimization model is presented for selecting optimal locations in the water distribution network (WDN) with the aim of installing pressure loggers. The pressure data collected at the optimal locations will later be used in the calibration of the proposed WDN model. The objective functions consist of maximizing calibrated model prediction accuracy and minimizing the total cost of the sampling design. In order to decrease the model run time, an optimization model has been developed using a multiobjective genetic algorithm and an adaptive neural network (MOGA-ANN). Neural networks (NNs) are initially trained after a number of initial GA generations, and are periodically retrained and updated after generation of a specified number of full model-analyzed solutions. Trained NNs then replace the full-model fitness evaluation for a portion of the chromosomes as the GA progresses. A cache prevents objective-function evaluation of repeated chromosomes within the GA. Optimal solutions are obtained as a Pareto-optimal front with respect to the two objective functions. Results show that using NNs in MOGA to approximate the fitness of a portion of the chromosomes in each generation leads to considerable savings in model run time and is promising for reducing run time in optimization models with significant computational effort.
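The caching idea mentioned above, never re-running the full model for a chromosome already evaluated, can be sketched as a memoizing wrapper; all names are hypothetical, and a real objective would be a hydraulic WDN model run rather than a toy sum:

```python
def make_cached_fitness(full_model_eval):
    """Wrap an expensive objective with a cache so that identical
    chromosomes are never re-evaluated by the full model (the caching
    idea in the MOGA-ANN scheme; names here are hypothetical)."""
    cache = {}
    def fitness(chromosome):
        key = tuple(chromosome)          # chromosomes must be hashable
        if key not in cache:
            cache[key] = full_model_eval(chromosome)
        return cache[key]
    fitness.cache = cache
    return fitness

calls = []
def expensive_eval(ch):
    """Stand-in for a full hydraulic model run."""
    calls.append(list(ch))
    return sum(ch)

f = make_cached_fitness(expensive_eval)
f([1, 0, 1]); f([1, 0, 1]); f([0, 1, 1])
print(len(calls))  # -> 2 full-model runs for 3 fitness calls
```

In the paper's scheme, a trained NN surrogate would additionally answer some of the cache misses, with only a fraction of chromosomes sent to the full model.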

  18. How Mobile App Design Impacts User Responses to Mixed Self-Tracking Outcomes: Randomized Online Experiment to Explore the Role of Spatial Distance for Hedonic Editing

    Science.gov (United States)

    Lorenz, Jana

    2018-01-01

    Background Goal setting is among the most common behavioral change techniques employed in contemporary self-tracking apps. For these techniques to be effective, it is relevant to understand how the visual presentation of goal-related outcomes employed in the app design affects users’ responses to their self-tracking outcomes. Objective This study examined whether a spatially close (vs distant) presentation of mixed positive and negative self-tracking outcomes from multiple domains (ie, activity, diet) on a digital device’s screen can provide users the opportunity to hedonically edit their self-tracking outcome profile (ie, to view their mixed self-tracking outcomes in the most positive light). Further, this study examined how the opportunity to hedonically edit one’s self-tracking outcome profile relates to users’ future health behavior intentions. Methods To assess users’ responses to a spatially close (vs distant) presentation of a mixed-gain (vs mixed-loss) self-tracking outcome profile, a randomized 2×2 between-subjects online experiment with a final sample of 397 participants (mean age 27.4, SD 7.2 years; 71.5%, 284/397 female) was conducted in Germany. The experiment started with a cover story about a fictitious self-tracking app. Thereafter, participants saw one of four manipulated self-tracking outcome profiles. Variables of interest measured were health behavior intentions, compensatory health beliefs, health motivation, and recall of the outcome profile. We analyzed data using chi-square tests (SPSS version 23) and moderated mediation analyses with the PROCESS macro 2.16.1. Results Spatial distance facilitated hedonic editing, which was indicated by systematic memory biases in users’ recall of positive and negative self-tracking outcomes. In the case of a mixed-gain outcome profile, a spatially close (vs distant) presentation tended to increase the underestimation of the negative outcome (P=.06). In the case of a mixed-loss outcome profile, a

  19. Novel Ordered Stepped-Wedge Cluster Trial Designs for Detecting Ebola Vaccine Efficacy Using a Spatially Structured Mathematical Model.

    Directory of Open Access Journals (Sweden)

    Ibrahim Diakite

    2016-08-01

    Full Text Available During the 2014 Ebola virus disease (EVD) outbreak, policy-makers were confronted with difficult decisions on how best to test the efficacy of EVD vaccines. On one hand, many were reluctant to withhold a vaccine that might prevent a fatal disease from study participants randomized to a control arm. On the other, regulatory bodies called for rigorous placebo-controlled trials to permit direct measurement of vaccine efficacy prior to approval of the products. A stepped-wedge cluster study (SWCT) was proposed as an alternative to a more traditional randomized controlled vaccine trial to address these concerns. Here, we propose novel "ordered stepped-wedge cluster trial" (OSWCT) designs to further mitigate tradeoffs between ethical concerns, logistics, and statistical rigor. We constructed a spatially structured mathematical model of the EVD outbreak in Sierra Leone. We used the output of this model to simulate and compare a series of stepped-wedge cluster vaccine studies. Our model reproduced the observed order of first case occurrence within districts of Sierra Leone. Depending on the infection risk within the trial population and the trial start dates, the statistical power to detect a vaccine efficacy of 90% varied from 14% to 32% for a standard SWCT, and from 67% to 91% for OSWCTs, for an alpha error of 5%. The model's projection of first case occurrence was robust to changes in disease natural history parameters. Ordering clusters in a stepped-wedge trial based on each cluster's underlying risk of infection as predicted by a spatial model can increase the statistical power of a SWCT. In the event of another hemorrhagic fever outbreak, implementation of our proposed OSWCT designs could improve statistical power when a stepped-wedge study is desirable based on either ethical concerns or logistical constraints.

  20. Rational Design of Thermally Stable Novel Biocatalytic Nanomaterials: Enzyme Stability in Restricted Spatial Dimensions

    Science.gov (United States)

    Mudhivarthi, Vamsi K.

    Enzyme stability is of intense interest in biomaterials science, both for biocatalysts and for sensing platforms. This is essentially because the unique properties of DNA, RNA and PAA can be coupled with the interesting and novel properties of proteins to produce systems with unprecedented control over their properties. In this article, the very first examples of enzyme/NA/inorganic hybrid nanomaterials and enzyme-polyacrylic acid conjugates will be presented. The basic principles of design, synthesis and control of properties of these hybrid materials will be presented first, followed by a discussion of selected examples from our recent research findings. Data show that key properties of biological catalysts are improved by the inorganic framework, especially when the catalyst is co-embedded with DNA. Several examples of such studies with various enzymes and proteins, including horseradish peroxidase (HRP), glucose oxidase (GO), cytochrome c (Cyt c), met-hemoglobin (Hb) and met-myoglobin (Mb), will be discussed. Additionally, key insights obtained by the standard methods of materials science, including XRD, SEM and TEM, as well as by biochemical, calorimetric and spectroscopic methods, will be discussed. Furthermore, improved structure and enhanced activities of the biocatalysts in specific cases will be demonstrated, along with the potential stabilization mechanisms. Our hypothesis is that nucleic acids provide excellent control over enzyme-solid interactions as well as rational assembly of nanomaterials. These novel nanobiohybrid materials may aid in engineering more effective synthetic materials for gene-delivery, RNA-delivery and drug-delivery applications.

  1. A two-stage Bayesian design with sample size reestimation and subgroup analysis for phase II binary response trials.

    Science.gov (United States)

    Zhong, Wei; Koopmeiners, Joseph S; Carlin, Bradley P

    2013-11-01

    Frequentist sample size determination for binary outcome data in a two-arm clinical trial requires initial guesses of the event probabilities for the two treatments. Misspecification of these event rates may lead to a poor estimate of the necessary sample size. In contrast, the Bayesian approach, which considers the treatment effect to be a random variable having some distribution, may offer a better, more flexible approach. The Bayesian sample size proposed by Whitehead et al. (2008) for exploratory studies on efficacy justifies the acceptable minimum sample size by a "conclusiveness" condition. In this work, we introduce a new two-stage Bayesian design with sample size reestimation at the interim stage. Our design inherits the properties of good interpretation and easy implementation from Whitehead et al. (2008), generalizes their method to a two-sample setting, and uses a fully Bayesian predictive approach to reduce an overly large initial sample size when necessary. Moreover, our design can be extended to allow patient-level covariates via logistic regression, now adjusting sample size within each subgroup based on interim analyses. We illustrate the benefits of our approach with a design in non-Hodgkin lymphoma with a simple binary covariate (patient gender), offering an initial step toward within-trial personalized medicine. Copyright © 2013 Elsevier Inc. All rights reserved.
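The predictive step of such a design, asking how likely a further cohort is to yield a conclusive posterior, can be sketched with a beta-binomial posterior predictive distribution; the Beta prior, the decision rule P(p > p0) > 0.9, and all numbers below are illustrative assumptions, not Whitehead et al.'s exact criterion:

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binom_pmf(k, n, a, b):
    """Posterior predictive (beta-binomial) probability of k responses
    among n future patients under a Beta(a, b) posterior."""
    return comb(n, k) * exp(log_beta(a + k, b + n - k) - log_beta(a, b))

def beta_cdf(x, a, b, steps=20000):
    """Crude midpoint-rule CDF of Beta(a, b); adequate for a sketch."""
    total = sum(((i + 0.5) / steps * x) ** (a - 1)
                * (1 - (i + 0.5) / steps * x) ** (b - 1)
                for i in range(steps))
    return total * x / steps / exp(log_beta(a, b))

def predictive_power(a, b, n_future, p0=0.2, conf=0.9):
    """Probability, under the current Beta(a, b) posterior, that after
    n_future more patients the trial concludes P(p > p0) > conf
    (a hypothetical conclusiveness rule)."""
    return sum(beta_binom_pmf(k, n_future, a, b)
               for k in range(n_future + 1)
               if 1 - beta_cdf(p0, a + k, b + n_future - k) > conf)

# Posterior Beta(16, 6) after 15 responses in 20 patients (Beta(1, 1) prior):
print(round(predictive_power(16, 6, n_future=20), 3))
```

Scanning `n_future` for the smallest value whose predictive power clears a target is one way to reestimate the stage-two sample size at the interim analysis.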

  2. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
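The binomial reasoning behind such a sampling design reduces to the complement rule: the chance that a sample of n kernels contains at least one tracer is 1 - (1 - p)^n. A sketch with a hypothetical tracer rate:

```python
import math

def p_detect(n, p):
    """P(at least one tracer in a sample of n kernels), each kernel
    being a tracer independently with probability p."""
    return 1.0 - (1.0 - p) ** n

def sample_size_for_confidence(p, conf=0.95):
    """Smallest n giving detection probability >= conf:
    n = ceil(log(1 - conf) / log(1 - p))."""
    return math.ceil(math.log(1.0 - conf) / math.log(1.0 - p))

# Hypothetical tracer rate: one tracer per 10,000 kernels.
n = sample_size_for_confidence(1e-4, 0.95)
print(n, round(p_detect(n, 1e-4), 3))
```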

  3. Multi-saline sample distillation apparatus for hydrogen isotope analyses: design and accuracy. Water-resources investigations

    International Nuclear Information System (INIS)

    Hassan, A.A.

    1981-04-01

    A distillation apparatus for saline water samples was designed and tested. Six samples may be distilled simultaneously. The temperature was maintained at 400 degrees C to ensure complete dehydration of the precipitating salts. Consequently, the error in the measured ratio of stable hydrogen isotopes resulting from incomplete dehydration of hydrated salts during distillation was eliminated.

  4. Design Review Report for formal review of safety class features of exhauster system for rotary mode core sampling

    International Nuclear Information System (INIS)

    JANICEK, G.P.

    2000-01-01

    Report documenting Formal Design Review conducted on portable exhausters used to support rotary mode core sampling of Hanford underground radioactive waste tanks with focus on Safety Class design features and control requirements for flammable gas environment operation and air discharge permitting compliance.

  5. Design Review Report for formal review of safety class features of exhauster system for rotary mode core sampling

    Energy Technology Data Exchange (ETDEWEB)

    JANICEK, G.P.

    2000-06-08

    Report documenting Formal Design Review conducted on portable exhausters used to support rotary mode core sampling of Hanford underground radioactive waste tanks with focus on Safety Class design features and control requirements for flammable gas environment operation and air discharge permitting compliance.

  6. Designing a spatial decision-support system to improve urban resilience to floods

    Science.gov (United States)

    Heinzlef, Charlotte; Ganz, François; Becue, Vincent; Serre, Damien

    2017-04-01

    increase of these observatories (Dolique, 2013), observatories which focus on different fields such as risk observation (PACA regional risks observatory), environmental observation (Environmental virtual observatory), ecological observation (National ecological observatory), etc. Usually, an observatory focuses either on a scale (generally national or regional) or on a theme (risks, environment, energy, economy, etc.). Our objective is to develop an observatory, tested on the territory of Avignon, to design a tool for analyzing resilience according to indicators that measure technical resilience (urban and suburban networks), urban resilience (buildings and critical infrastructures) and social resilience (knowledge of risk, memory of the disaster, perception of vulnerability). The tool would be designed with the help of our socio-economic partner, the city of Avignon, and would provide a clearer picture of resilience for managers and inhabitants. It would be participatory and social insofar as, after the existing resilience has been assessed through the indicators, it would help make the territory more resilient through expert advice and participatory workshops for inhabitants and managers.

  7. Spatial variation of contaminant elements of roadside dust samples from Budapest (Hungary) and Seoul (Republic of Korea), including Pt, Pd and Ir.

    Science.gov (United States)

    Sager, Manfred; Chon, Hyo-Taek; Marton, Laszlo

    2015-02-01

    Roadside dusts were studied to explain the spatial variation and present levels of contaminant elements, including Pt, Pd and Ir, in the urban environment in and around Budapest (Hungary) and Seoul (Republic of Korea). The samples were collected from six sites of high traffic volume in the Seoul metropolitan city and, for comparison, from two control sites within the suburbs of Seoul. Similarly, road dust samples were obtained twice from traffic focal points in Budapest, from the large bridges across the River Danube, from Margitsziget (an island in the Danube in the northern part of Budapest, used for recreation), as well as from main roads (not highways) outside Budapest. The samples were analysed for contaminant elements by ICP-AES and for Pt, Pd and Ir by ICP-MS. The highest Pt, Pd and Ir levels in road dusts were found on major roads with high traffic volume; correlations with other contaminant elements were low, however, which indicates automobile catalytic converters as an important source. To summarize the multi-element results, a pollution index, a contamination index and the geo-accumulation index were calculated. Finally, the obtained data were compared with total concentrations encountered in dust samples from Madrid, Oslo, Tokyo and Muscat (Oman). Dust samples from Seoul reached top-level concentrations for Cd-Zn-As-Co-Cr-Cu-Mo-Ni-Sn; only Pb was rather low, because unleaded gasoline became compulsory in 1993. Concentrations in Budapest dust samples were lower than in those from Seoul, except for Pb and Mg. Compared with Madrid, another continental site, Budapest was higher in Co-V-Zn. Dust from Oslo, a smaller city, contained more Mn-Na-Sr than dust from the other towns, but less of the other metals.
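
    The geo-accumulation index mentioned in this record is conventionally computed from a measured concentration and a geochemical background value. A minimal sketch (the concentrations below are hypothetical, not taken from this study):

```python
import math

def igeo(c_sample, c_background):
    # Mueller geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn)),
    # where the factor 1.5 absorbs natural background fluctuations.
    return math.log2(c_sample / (1.5 * c_background))

# Hypothetical example: 150 mg/kg of a metal in dust against a 25 mg/kg
# background gives Igeo = log2(150 / 37.5) = 2.0 (moderately polluted).
result = igeo(150.0, 25.0)
```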

  8. The significance of Sampling Design on Inference: An Analysis of Binary Outcome Model of Children’s Schooling Using Indonesian Large Multi-stage Sampling Data

    OpenAIRE

    Ekki Syamsulhakim

    2008-01-01

    This paper aims to exercise a rather recent trend in applied microeconometrics, namely the effect of sampling design on statistical inference, especially on binary outcome models. Much theoretical research in econometrics has shown the inappropriateness of applying i.i.d.-assumed statistical analysis to non-i.i.d. data. This research has provided proofs showing that applying i.i.d.-assumed analysis to non-i.i.d. observations results in inflated standard errors, which could make the esti...

  9. Design and use of the IR gas-cloud scanner for measurement and imaging of the spatial distribution of gases at workplaces

    Science.gov (United States)

    ter Kuile, Willem M.; van Veen, J. J.; Knoll, Bas

    1995-02-01

    Usual sampling methods and instruments for checking compliance with 'threshold limit values' (TLV) of gaseous components do not provide much information on the mechanism which caused the measured workday average concentration. In the case of noncompliance this information is indispensable for the design of cost-effective measures. The infrared gas cloud (IGC) scanner visualizes the spatial distribution of specific gases at a workplace in a quantitative image with a calibrated gray-value scale. This helps to find the cause of an overexposure, and so it permits effective abatement of high exposures in the working environment. This paper deals with the technical design of the IGC scanner, and its use is illustrated by some real-world problems. The measuring principle and the technical operation of the IGC scanner are described. Special attention is given to the pros and cons of retro-reflector screens, the noise reduction methods, and image presentation and interpretation; the latter is illustrated by the images produced by the measurements. Essentially, the IGC scanner can be used for selective open-path measurement of all gases with a concentration in the ppm range and sufficiently strong, distinct absorption lines in the infrared region between 2.5 micrometers and 14.0 micrometers. Further, it could be useful for testing the efficiency of ventilation systems and the remote detection of gas leaks. We conclude that a new powerful technique has been added to the industrial hygiene facilities for controlling and improving the work environment.

  10. Osiris-Rex and Hayabusa2 Sample Cleanroom Design and Construction Planning at NASA-JSC

    Science.gov (United States)

    Righter, Kevin; Pace, Lisa F.; Messenger, Keiko

    2018-01-01

    Final Paper and not the abstract is attached. The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu September 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch-and-go (TAG) sampling maneuver in July 2020. After confirmation of successful sample stowage, the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah and then transported and stored in a new sample cleanroom at NASA Johnson Space Center in Houston. All curation-specific examination and documentation activities related to Bennu samples will be conducted in the dedicated OSIRIS-REx sample cleanroom to be built at NASA-JSC.

  11. Spatial scale and sampling resolution affect measures of gap disturbance in a lowland tropical forest: implications for understanding forest regeneration and carbon storage

    Science.gov (United States)

    Lobo, Elena; Dalling, James W.

    2014-01-01

    Treefall gaps play an important role in tropical forest dynamics and in determining above-ground biomass (AGB). However, our understanding of gap disturbance regimes is largely based either on surveys of forest plots that are small relative to spatial variation in gap disturbance, or on satellite imagery, which cannot accurately detect small gaps. We used high-resolution light detection and ranging data from a 1500 ha forest in Panama to: (i) determine how gap disturbance parameters are influenced by study area size, and the criteria used to define gaps; and (ii) evaluate how accurately previous ground-based canopy height sampling can determine the size and location of gaps. We found that plot-scale disturbance parameters frequently differed significantly from those measured at the landscape level, and that canopy height thresholds used to define gaps strongly influenced the gap-size distribution, an important metric influencing AGB. Furthermore, simulated ground surveys of canopy height frequently misrepresented the true location of gaps, which may affect conclusions about how relatively small canopy gaps affect successional processes and contribute to the maintenance of diversity. Across-site comparisons need to consider how gap definition, scale and spatial resolution affect characterizations of gap disturbance, and its inferred importance for carbon storage and community composition. PMID:24452032
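
    The threshold-based gap definition described here can be sketched as a connected-component labelling of low-canopy cells in a canopy height model. The grid, threshold, and minimum-area values below are illustrative assumptions, not the study's parameters:

```python
import numpy as np
from scipy import ndimage

def gap_size_distribution(canopy_height, height_thresh=5.0, min_area=4):
    """Label contiguous low-canopy cells as gaps; return gap areas in cells."""
    gap_mask = canopy_height < height_thresh
    labels, n_gaps = ndimage.label(gap_mask)
    sizes = ndimage.sum(gap_mask, labels, index=np.arange(1, n_gaps + 1))
    return sizes[sizes >= min_area]   # drop specks below the minimum gap area

# Synthetic 1 m resolution canopy height model: 30 m forest with two gaps.
chm = np.full((50, 50), 30.0)
chm[5:10, 5:15] = 2.0    # 5 x 10 cells = 50 m^2 gap
chm[30:33, 30:33] = 1.0  # 3 x 3 cells = 9 m^2 gap
sizes = gap_size_distribution(chm)
```

Raising `height_thresh` merges and enlarges gaps, which is one way the canopy height threshold shifts the gap-size distribution.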

  12. Fitting by a pearson II function of the spatial deposited energy distribution in superconducting YBaCuO samples calculated by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Cruz Inclan, Carlos M.; Leyva Fabelo, Antonio; Alfonso Vazquez, Onexis

    2001-01-01

    The spatial deposited energy distribution inside YBa2Cu3O7 superconducting ceramics irradiated with gamma rays was simulated using the EGS4 code system, based on the Monte Carlo method. The obtained distributions show a notable inhomogeneity, which may be one of the possible sources of inconsistent results in irradiation studies. The profiles of these distributions show asymmetrical behavior, which may be fitted satisfactorily by a Pearson II Gamma-type function. These fits are presented in the paper, and the behavior of the fitting parameters with the incident photon energy, the number of photons, and the experimental geometry was studied. The physical significance of each fitting parameter is discussed in the text. The exponent is related to a certain mass absorption coefficient when the sample is sufficiently thick
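
    As an illustration of the kind of fit described, a symmetric Pearson type II profile can be fitted with SciPy. The parameter values below are synthetic, not those reported for the YBaCuO samples:

```python
import numpy as np
from scipy.optimize import curve_fit

def pearson_ii(x, amp, center, half_width, exponent):
    # Symmetric Pearson type II profile:
    # f(x) = amp * (1 - ((x - center)/half_width)^2)^exponent, zero outside.
    u = 1.0 - ((x - center) / half_width) ** 2
    return amp * np.clip(u, 0.0, None) ** exponent

# Synthetic deposited-energy profile across the sample (arbitrary units).
depth = np.linspace(-1.5, 1.5, 121)
profile = pearson_ii(depth, 1.0, 0.1, 2.0, 3.0)

# Recover the parameters from the profile, starting from a nearby guess.
popt, _ = curve_fit(pearson_ii, depth, profile, p0=[0.8, 0.0, 1.8, 2.5])
```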

  13. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  14. Are quantitative trait-dependent sampling designs cost-effective for analysis of rare and common variants?

    Science.gov (United States)

    Yilmaz, Yildiz E; Bull, Shelley B

    2011-11-29

    Use of trait-dependent sampling designs in whole-genome association studies of sequence data can reduce total sequencing costs with modest losses of statistical efficiency. In a quantitative trait (QT) analysis of data from the Genetic Analysis Workshop 17 mini-exome for unrelated individuals in the Asian subpopulation, we investigate alternative designs that sequence only 50% of the entire cohort. In addition to a simple random sampling design, we consider extreme-phenotype designs that are of increasing interest in genetic association analysis of QTs, especially in studies concerned with the detection of rare genetic variants. We also evaluate a novel sampling design in which all individuals have a nonzero probability of being selected into the sample but in which individuals with extreme phenotypes have a proportionately larger probability. We take differential sampling of individuals with informative trait values into account by inverse probability weighting using standard survey methods which thus generalizes to the source population. In replicate 1 data, we applied the designs in association analysis of Q1 with both rare and common variants in the FLT1 gene, based on knowledge of the generating model. Using all 200 replicate data sets, we similarly analyzed Q1 and Q4 (which is known to be free of association with FLT1) to evaluate relative efficiency, type I error, and power. Simulation study results suggest that the QT-dependent selection designs generally yield greater than 50% relative efficiency compared to using the entire cohort, implying cost-effectiveness of 50% sample selection and worthwhile reduction of sequencing costs.
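
    The inverse-probability-weighting idea used in this design can be sketched on synthetic data (all numbers below are illustrative, not GAW17 values): each sampled individual's record is weighted by the reciprocal of its selection probability, which removes the bias introduced by trait-dependent sampling.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cohort = 50_000
trait = rng.normal(0.0, 1.0, n_cohort)   # quantitative trait in the full cohort

# Everyone has a nonzero chance of being sequenced; extreme phenotypes
# (|trait| > 1.5) are selected with proportionately larger probability.
p_select = np.where(np.abs(trait) > 1.5, 0.9, 0.3)
selected = rng.random(n_cohort) < p_select

# Target: the population mean of trait^2 (true value 1 for a standard normal).
naive = np.mean(trait[selected] ** 2)            # biased upward by oversampling
weights = 1.0 / p_select[selected]               # Horvitz-Thompson weights
ipw = np.sum(weights * trait[selected] ** 2) / np.sum(weights)
```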

  15. Adaptive clinical trial designs with pre-specified rules for modifying the sample size: understanding efficient types of adaptation.

    Science.gov (United States)

    Levin, Gregory P; Emerson, Sarah C; Emerson, Scott S

    2013-04-15

    Adaptive clinical trial design has been proposed as a promising new approach that may improve the drug discovery process. Proponents of adaptive sample size re-estimation promote its ability to avoid 'up-front' commitment of resources, better address the complicated decisions faced by data monitoring committees, and minimize accrual to studies having delayed ascertainment of outcomes. We investigate aspects of adaptation rules, such as timing of the adaptation analysis and magnitude of sample size adjustment, that lead to greater or lesser statistical efficiency. Owing in part to the recent Food and Drug Administration guidance that promotes the use of pre-specified sampling plans, we evaluate alternative approaches in the context of well-defined, pre-specified adaptation. We quantify the relative costs and benefits of fixed sample, group sequential, and pre-specified adaptive designs with respect to standard operating characteristics such as type I error, maximal sample size, power, and expected sample size under a range of alternatives. Our results build on others' prior research by demonstrating in realistic settings that simple and easily implemented pre-specified adaptive designs provide only very small efficiency gains over group sequential designs with the same number of analyses. In addition, we describe optimal rules for modifying the sample size, providing efficient adaptation boundaries on a variety of scales for the interim test statistic for adaptation analyses occurring at several different stages of the trial. We thus provide insight into what are good and bad choices of adaptive sampling plans when the added flexibility of adaptive designs is desired. Copyright © 2012 John Wiley & Sons, Ltd.
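
    The fixed-versus-group-sequential comparison discussed here can be illustrated by Monte Carlo. The effect size, sample sizes, and O'Brien-Fleming-type boundaries below are illustrative choices, not the paper's scenarios; even a single interim look already reduces the expected sample size under the alternative:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.3                    # assumed true standardized effect size
n_interim, n_final = 44, 88    # interim look at half the fixed-design size
c_interim, c_final = 2.797, 1.977   # two-look O'Brien-Fleming-type bounds

def run_trial(rng):
    """One-sample z-test trial with a single interim efficacy look."""
    x = rng.normal(theta, 1.0, n_final)
    z1 = x[:n_interim].mean() * np.sqrt(n_interim)
    if abs(z1) >= c_interim:              # stop early for efficacy
        return n_interim, True
    z2 = x.mean() * np.sqrt(n_final)
    return n_final, abs(z2) >= c_final

outcomes = [run_trial(rng) for _ in range(20_000)]
expected_n = np.mean([n for n, _ in outcomes])
power = np.mean([rejected for _, rejected in outcomes])
```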

  16. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

    The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay is as follows: an overnight fecal sample is collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receiving the sample, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible, and ashed inside a muffle furnace at 450 °C. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for the initial handling of fecal samples, intended to automate the above procedure. Once developed, the system will eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples

  17. Human factors issues and approaches in the spatial layout of a space station control room, including the use of virtual reality as a design analysis tool

    Science.gov (United States)

    Hale, Joseph P., II

    1994-01-01

    Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.

  18. System design specification for rotary mode core sample trucks No. 2, 3, and 4 programmable logic controller

    International Nuclear Information System (INIS)

    Dowell, J.L.; Akers, J.C.

    1995-01-01

    The system this document describes controls several functions of the Core Sample Trucks used to obtain nuclear waste samples from various underground storage tanks at Hanford. The system will monitor the sampling process and provide alarms and other feedback to ensure the sampling process is performed within the prescribed operating envelope. The intended audience for this document is anyone associated with rotary or push mode core sampling. This document describes the alarm and control logic installed on Rotary Mode Core Sample Trucks (RMCST) #2, 3, and 4. It is intended to define the particular requirements of the RMCST alarm and control operation (not defined elsewhere) sufficiently for a detailed design to be implemented on a Programmable Logic Controller (PLC)

  19. Representativeness-based sampling network design for the State of Alaska

    Science.gov (United States)

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  20. Design of a radioactive gas sampling system for NESHAP compliance measurements of 41Ar

    International Nuclear Information System (INIS)

    Newton, G.J.; McDonald, M.J.; Ghanbari, F.; Hoover, M.D.; Barr, E.B.

    1994-01-01

    United States Department of Energy facilities are required to comply with the U.S. Environmental Protection Agency National Emission Standard for Hazardous Air Pollutants (NESHAP), 40 CFR part 61, subpart H. Compliance generally requires confirmatory measurements of emitted radionuclides. Although a number of standard procedures exist for extractive sampling of particle-associated radionuclides, sampling approaches for radioactive gases are less well defined. Real-time, flow-through sampling of radioactive gases can be done when concentrations are high compared to interferences from background radiation. Cold traps can be used to collect and concentrate condensible effluents in applications where cryogenic conditions can be established and maintained. Commercially available gas-sampling cylinders can be used to capture grab samples of contaminated air under ambient or compressed conditions, if suitable sampling and control hardware are added to the cylinders. The purpose of the current study was to develop an efficient and compact set of sampling and control hardware for use with commercially available gas-sampling cylinders, and to demonstrate its use in NESHAP compliance testing of 41Ar at two experimental research reactors

  1. Effects of sampling design on age ratios of migrants captured at stopover sites

    Science.gov (United States)

    Jeffrey F. Kelly; Deborah M. Finch

    2000-01-01

    Age classes of migrant songbirds often differ in migration timing. This difference creates the potential for age-ratios recorded at stopover sites to vary with the amount and distribution of sampling effort used. To test for these biases, we sub-sampled migrant capture data from the Middle Rio Grande Valley of New Mexico. We created data sets that reflected the age...

  2. Exploiting H infinity sampled-data control theory for high-precision electromechanical servo control design

    NARCIS (Netherlands)

    Oomen, T.A.E.; Wal, van de M.M.J.; Bosgra, O.H.

    2006-01-01

    Optimal design of digital controllers for industrial electromechanical servo systems using an H∞ criterion is considered. Present industrial practice is to perform the control design in the continuous time domain and to discretize the controller a posteriori. This procedure involves unnecessary

  3. SYNTHESIS OF ACTIVE SCREENING SYSTEM OF MAGNETIC FIELD OF HIGH VOLTAGE POWER LINES OF DIFFERENT DESIGN TAKING INTO ACCOUNT SPATIAL AND TEMPORAL DISTRIBUTION OF MAGNETIC FIELD

    Directory of Open Access Journals (Sweden)

    B.I. Kuznetsov

    2017-04-01

    Purpose. To analyze the spatial and temporal distribution of the magnetic field of high voltage power lines of different designs, and to develop recommendations for the design of systems for active screening of the magnetic field of such lines. Methodology. The analysis of the spatial and temporal distribution of the magnetic field of high voltage power lines of different designs is based on solutions of Maxwell's equations in the quasi-stationary approximation. Determination of the number, configuration, spatial arrangement and currents of the compensation coils is formulated as a multiobjective optimization problem, solved by multi-agent multiswarm stochastic optimization based on Pareto-optimal solutions. Results. Active screening systems were synthesized for various types of transmission lines with different numbers of controlled windings, demonstrating the possibility of a significant reduction in the flux density of the magnetic field source within a given region of space. Originality. For the first time, the spatial and temporal distribution of the magnetic field of power lines of different types was analyzed and, based on the findings, recommendations for the design of active screening systems for the magnetic field of high voltage power lines were developed. Practical value. Practical recommendations are given on a reasonable choice of the number and spatial arrangement of the compensating windings of an active screening system, allowing for the spatial and temporal distribution of the magnetic field. Results are presented for the synthesis of an active screening system for the industrial-frequency magnetic field generated by a single-circuit 110 kV power line on 330-1T «triangle» supports, producing a rotating magnetic field with full polarization, in a residential five-storey building located near the power line. The system contains three compensating coils and reduces
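
    The Pareto-optimal selection step at the heart of such a multiobjective synthesis can be sketched as a simple non-dominated filter. The candidate objective pairs below are hypothetical placeholders, not results from this paper:

```python
import numpy as np

def pareto_front(costs):
    """Boolean mask of non-dominated rows; all objectives are minimized."""
    costs = np.asarray(costs, dtype=float)
    keep = np.ones(len(costs), dtype=bool)
    for i in range(len(costs)):
        # Row i is dominated if some row is <= everywhere and < somewhere.
        dominates_i = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if dominates_i.any():
            keep[i] = False
    return keep

# Hypothetical (residual flux density, total winding current) pairs for five
# candidate coil layouts -- values are illustrative only.
candidates = [(0.9, 1.0), (0.5, 2.0), (0.6, 2.5), (1.2, 0.8), (0.5, 2.1)]
front = pareto_front(candidates)
```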

  4. The PowerAtlas: a power and sample size atlas for microarray experimental design and research

    Directory of Open Access Journals (Sweden)

    Wang Jelai

    2006-02-01

    Background. Microarrays permit biologists to simultaneously measure the mRNA abundance of thousands of genes. An important issue facing investigators planning microarray experiments is how to estimate the sample size required for good statistical power. What is the projected sample size, or number of replicate chips, needed to address the multiple hypotheses with acceptable accuracy? Statistical methods exist for calculating power based upon a single hypothesis, using estimates of the variability in data from pilot studies. There is, however, a need for methods to estimate power and/or required sample sizes in situations where multiple hypotheses are being tested, such as in microarray experiments. In addition, investigators frequently do not have pilot data to estimate the sample sizes required for microarray studies. Results. To address this challenge, we have developed the Microarray PowerAtlas. The atlas enables estimation of statistical power by allowing investigators to appropriately plan studies by building upon previous studies that have similar experimental characteristics. Currently, there are sample sizes and power estimates based on 632 experiments from the Gene Expression Omnibus (GEO). The PowerAtlas also permits investigators to upload their own pilot data and derive power and sample size estimates from these data. This resource will be updated regularly with new datasets from GEO and other databases such as The Nottingham Arabidopsis Stock Center (NASC). Conclusion. This resource provides a valuable tool for investigators who are planning efficient microarray studies and estimating required sample sizes.
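
    The single-hypothesis power calculation that tools like this generalize can be sketched with the standard normal approximation for a two-sided two-sample comparison:

```python
import math
from scipy.stats import norm

def n_per_group(delta, sigma=1.0, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group, two-sided two-sample test.

    delta is the difference in means to detect, sigma the common SD:
    n = 2 * sigma^2 * (z_{1-alpha/2} + z_{power})^2 / delta^2
    """
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return math.ceil(2 * (sigma * (z_alpha + z_beta) / delta) ** 2)

# Detecting a standardized difference of 0.5 with 80% power needs about
# 63 subjects per group under this approximation.
n = n_per_group(0.5)
```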

  5. Design and characterization of poly(dimethylsiloxane)-based valves for interfacing continuous-flow sampling to microchip electrophoresis.

    Science.gov (United States)

    Li, Michelle W; Huynh, Bryan H; Hulvey, Matthew K; Lunte, Susan M; Martin, R Scott

    2006-02-15

    This work describes the fabrication and evaluation of a poly(dimethylsiloxane) (PDMS)-based device that enables the discrete injection of a sample plug from a continuous-flow stream into a microchannel for subsequent analysis by electrophoresis. Devices were fabricated by aligning valving and flow channel layers, followed by plasma-sealing the combined layers onto a glass plate that contained fittings for the introduction of liquid sample and nitrogen gas. The design incorporates a reduced-volume pneumatic valve that actuates (on the order of hundreds of milliseconds) to allow analyte from a continuously flowing sampling channel to be injected into a separation channel for electrophoresis. The injector design was optimized to include a pushback channel to flush away stagnant sample associated with the injector dead volume. The effects of the valve actuation time, the pushback voltage, and the sampling stream flow rate on the performance of the device were characterized. Using the optimized design and an injection frequency of 0.64 Hz showed that the injection process is reproducible (RSD of 1.77%, n = 15). Concentration change experiments using fluorescein as the analyte showed that the device could achieve a lag time as small as 14 s. Finally, to demonstrate the potential uses of this device, the microchip was coupled to a microdialysis probe to monitor a concentration change and sample a fluorescein dye mixture.

  6. Gaseous and Freely-Dissolved PCBs in the Lower Great Lakes Based on Passive Sampling: Spatial Trends and Air-Water Exchange.

    Science.gov (United States)

    Liu, Ying; Wang, Siyao; McDonough, Carrie A; Khairy, Mohammed; Muir, Derek C G; Helm, Paul A; Lohmann, Rainer

    2016-05-17

    Polyethylene passive sampling was performed to quantify gaseous and freely dissolved polychlorinated biphenyls (PCBs) in the air and water of Lakes Erie and Ontario during 2011-2012. In view of the differing physical characteristics of these lakes and the impacts of historical contamination by PCBs, spatial variation of PCB concentrations and air-water exchange across the lakes may be expected. Both lakes displayed statistically similar aqueous and atmospheric PCB concentrations. Total aqueous concentrations of 29 PCBs ranged from 1.5 pg L⁻¹ in the open lake of Lake Erie (site E02) in spring 2011 to 105 pg L⁻¹ in Niagara (site On05) in summer 2012, while total atmospheric concentrations were 7.7-634 pg m⁻³ across both lakes. A west-to-east gradient was observed for aqueous PCBs in Lake Erie. River discharge and localized influences (e.g., sediment resuspension and regional alongshore transport) likely dominated spatial trends of aqueous PCBs in both lakes. Air-water exchange fluxes of Σ7PCBs ranged from -2.4 (±1.9) ng m⁻² day⁻¹ (deposition) in Sheffield (site E03) to 9.0 (±3.1) ng m⁻² day⁻¹ (volatilization) in Niagara (site On05). Net volatilization of PCBs was the primary trend across most sites and periods. Almost half of the variation in air-water exchange fluxes was attributed to differences in the aqueous concentrations of PCBs. Uncertainty analysis of the fugacity ratios and mass fluxes in air-water exchange of PCBs indicated that PCBs have reached or approached equilibrium only at the eastern Lake Erie sites and along the Canadian shore of Lake Ontario, where air-water exchange fluxes dominated atmospheric concentrations.
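
    The fugacity-ratio and flux calculations underlying such air-water exchange estimates can be sketched with a simple two-film model. All parameter values below are illustrative assumptions, not data from this study:

```python
def air_water_exchange(c_w_pg_per_L, c_a_pg_per_m3, k_aw, v_aw_m_per_day):
    """Two-film air-water exchange sketch for a hydrophobic compound.

    k_aw is the dimensionless Henry's law constant (C_air,eq / C_water);
    v_aw is the overall mass-transfer velocity. Returns
    (fugacity_ratio, flux in pg m^-2 day^-1); a ratio > 1 and a positive
    flux both indicate net volatilization out of the water.
    """
    c_w = c_w_pg_per_L * 1000.0               # pg/L -> pg/m^3
    fugacity_ratio = c_w * k_aw / c_a_pg_per_m3
    flux = v_aw_m_per_day * (c_w - c_a_pg_per_m3 / k_aw)
    return fugacity_ratio, flux

# Illustrative values only: 10 pg/L freely dissolved, 50 pg/m^3 gaseous,
# k_aw = 0.01, v_aw = 0.2 m/day -> net volatilization of 1 ng m^-2 day^-1.
ratio, flux = air_water_exchange(10.0, 50.0, 0.01, 0.2)
```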

  7. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    Science.gov (United States)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal

  8. Optimizing trial design in pharmacogenetics research: comparing a fixed parallel group, group sequential, and adaptive selection design on sample size requirements.

    Science.gov (United States)

    Boessen, Ruud; van der Baan, Frederieke; Groenwold, Rolf; Egberts, Antoine; Klungel, Olaf; Grobbee, Diederick; Knol, Mirjam; Roes, Kit

    2013-01-01

Two-stage clinical trial designs may be efficient in pharmacogenetics research when there is some but inconclusive evidence of effect modification by a genomic marker. Two-stage designs allow early stopping for efficacy or futility and can offer the additional opportunity to enrich the study population to a specific patient subgroup after an interim analysis. This study compared sample size requirements for fixed parallel group, group sequential, and adaptive selection designs with equal overall power and control of the family-wise type I error rate. The designs were evaluated across scenarios that defined the effect sizes in the marker positive and marker negative subgroups and the prevalence of marker positive patients in the overall study population. Effect sizes were chosen to reflect realistic planning scenarios, where at least some effect is present in the marker negative subgroup. In addition, scenarios were considered in which the assumed 'true' subgroup effects (i.e., the postulated effects) differed from those hypothesized at the planning stage. As expected, both two-stage designs generally required fewer patients than a fixed parallel group design, and the advantage increased as the difference between subgroups increased. The adaptive selection design added little further reduction in sample size, as compared with the group sequential design, when the postulated effect sizes were equal to those hypothesized at the planning stage. However, when the postulated effects deviated strongly in favor of enrichment, the comparative advantage of the adaptive selection design increased, which precisely reflects the adaptive nature of the design. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Design Sensitivity Method for Sampling-Based RBDO with Fixed COV

    Science.gov (United States)

    2015-04-29

Contours of the input model at the initial design d0 and the RBDO optimum design dopt are shown. As the limit state functions are not linear and some input...

  10. Report: Independent Environmental Sampling Shows Some Properties Designated by EPA as Available for Use Had Some Contamination

    Science.gov (United States)

    Report #15-P-0221, July 21, 2015. Some OIG sampling results showed contamination was still present at sites designated by the EPA as ready for reuse. This was unexpected and could signal a need to implement changes to ensure human health protection.

  11. Design and building of a homemade sample changer for automation of the irradiation in neutron activation analysis technique

    International Nuclear Information System (INIS)

    Gago, Javier; Hernandez, Yuri; Baltuano, Oscar; Bedregal, Patricia; Lopez, Yon; Urquizo, Rafael

    2014-01-01

Because the RP-10 research reactor operates during weekends, it was necessary to design and build a sample changer for irradiation as part of the automation of the neutron activation analysis technique. The device consists of an aluminum turntable disk that can accommodate 19 polyethylene capsules containing samples to be sent, using the pneumatic transfer system, from the laboratory to the irradiation position. The system is operated from a control switchboard that sends and returns capsules after a variable preset time and by two different routes, allowing the determination of short-, medium- and long-lived radionuclides. Another mechanism, called an 'exchange valve', was designed for changing travel paths (pipelines), allowing the irradiated samples to be stored for a longer time in the reactor hall. The system design has allowed complete automation of this technique, enabling the irradiation of samples without the presence of an analyst. The design, construction and operation of the device are described in this article. (authors).

  12. Statistical properties of mean stand biomass estimators in a LIDAR-based double sampling forest survey design.

    Science.gov (United States)

    H.E. Anderson; J. Breidenbach

    2007-01-01

    Airborne laser scanning (LIDAR) can be a valuable tool in double-sampling forest survey designs. LIDAR-derived forest structure metrics are often highly correlated with important forest inventory variables, such as mean stand biomass, and LIDAR-based synthetic regression estimators have the potential to be highly efficient compared to single-stage estimators, which...
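The synthetic regression estimator used in such two-phase (double sampling) designs has a simple classical form: adjust the field-sample mean by the regression slope times the difference between the large-sample and small-sample means of the auxiliary LIDAR metric. A minimal sketch (variable names are illustrative, not the authors' notation):

```python
import numpy as np

def regression_estimator(y_field, x_field, x_large_mean):
    """Two-phase (double sampling) regression estimator of mean stand biomass.

    y_field: field-measured biomass on the small phase-2 sample of plots
    x_field: LIDAR-derived metric on the same phase-2 plots
    x_large_mean: mean of the LIDAR metric over the large phase-1 sample
    """
    y_field = np.asarray(y_field, dtype=float)
    x_field = np.asarray(x_field, dtype=float)
    # slope of the linear regression of biomass on the LIDAR metric
    b = np.cov(x_field, y_field, ddof=1)[0, 1] / np.var(x_field, ddof=1)
    # adjust the field mean by the phase-1 vs. phase-2 mean difference
    return y_field.mean() + b * (x_large_mean - x_field.mean())
```

When the LIDAR metric is highly correlated with biomass, the adjustment term removes much of the sampling error of the small field sample, which is the efficiency gain the abstract refers to.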

  13. MCMC-ODPR: Primer design optimization using Markov Chain Monte Carlo sampling

    Directory of Open Access Journals (Sweden)

    Kitchen James L

    2012-11-01

Full Text Available Abstract Background Next generation sequencing technologies often require numerous primer designs with good target coverage, which can be financially costly. We aimed to develop a system that implements primer reuse to design degenerate primers around SNPs, thus finding the fewest necessary primers at the lowest cost whilst maintaining acceptable coverage, providing a cost-effective solution. We have implemented Metropolis-Hastings Markov Chain Monte Carlo for optimizing primer reuse, and call the method the Markov Chain Monte Carlo Optimized Degenerate Primer Reuse (MCMC-ODPR) algorithm. Results After repeating the program 1020 times to assess the variance, an average of 17.14% fewer primers were found to be necessary using MCMC-ODPR for an equivalent coverage, compared with no primer reuse. The algorithm was able to reuse primers up to five times. We compared MCMC-ODPR with the single-sequence primer design programs Primer3 and Primer-BLAST and achieved lower primer costs per amplicon base covered of 0.21, 0.19 and 0.18 primer nucleotides on three separate gene sequences, respectively. With multiple sequences, MCMC-ODPR achieved a lower cost per base covered of 0.19 than the programs BatchPrimer3 and PAMPS, which achieved 0.25 and 0.64 primer nucleotides, respectively. Conclusions MCMC-ODPR is a useful tool for designing primers at various melting temperatures with good target coverage. By combining degeneracy with optimal primer reuse the user may increase coverage of sequences amplified by the designed primers at significantly lower costs. Our analyses showed that overall MCMC-ODPR outperformed the other primer-design programs in our study in terms of cost per covered base.

  14. MCMC-ODPR: primer design optimization using Markov Chain Monte Carlo sampling.

    Science.gov (United States)

    Kitchen, James L; Moore, Jonathan D; Palmer, Sarah A; Allaby, Robin G

    2012-11-05

Next generation sequencing technologies often require numerous primer designs with good target coverage, which can be financially costly. We aimed to develop a system that implements primer reuse to design degenerate primers around SNPs, thus finding the fewest necessary primers at the lowest cost whilst maintaining acceptable coverage, providing a cost-effective solution. We have implemented Metropolis-Hastings Markov Chain Monte Carlo for optimizing primer reuse, and call the method the Markov Chain Monte Carlo Optimized Degenerate Primer Reuse (MCMC-ODPR) algorithm. After repeating the program 1020 times to assess the variance, an average of 17.14% fewer primers were found to be necessary using MCMC-ODPR for an equivalent coverage, compared with no primer reuse. The algorithm was able to reuse primers up to five times. We compared MCMC-ODPR with the single-sequence primer design programs Primer3 and Primer-BLAST and achieved lower primer costs per amplicon base covered of 0.21, 0.19 and 0.18 primer nucleotides on three separate gene sequences, respectively. With multiple sequences, MCMC-ODPR achieved a lower cost per base covered of 0.19 than the programs BatchPrimer3 and PAMPS, which achieved 0.25 and 0.64 primer nucleotides, respectively. MCMC-ODPR is a useful tool for designing primers at various melting temperatures with good target coverage. By combining degeneracy with optimal primer reuse the user may increase coverage of sequences amplified by the designed primers at significantly lower costs. Our analyses showed that overall MCMC-ODPR outperformed the other primer-design programs in our study in terms of cost per covered base.
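The Metropolis-Hastings machinery behind this kind of combinatorial optimization can be sketched generically. The toggle move and the cost function below are illustrative stand-ins, not the published MCMC-ODPR algorithm; in practice the cost would combine synthesized nucleotides, degeneracy, and a coverage penalty:

```python
import math
import random

def metropolis_primer_search(candidates, cost, n_iter=10000, temp=1.0, seed=0):
    """Toy Metropolis-Hastings search over subsets of candidate primers.

    candidates: list of primer identifiers; cost: function mapping a
    frozenset of chosen primers to a penalty to be minimized.
    Returns the lowest-cost subset visited and its cost.
    """
    rng = random.Random(seed)
    state = frozenset(rng.sample(candidates, k=max(1, len(candidates) // 2)))
    cur_cost = cost(state)
    best, best_cost = state, cur_cost
    for _ in range(n_iter):
        p = rng.choice(candidates)
        proposal = state ^ {p}          # toggle one primer in or out
        if not proposal:                # never allow an empty primer set
            continue
        c = cost(proposal)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if c <= cur_cost or rng.random() < math.exp((cur_cost - c) / temp):
            state, cur_cost = proposal, c
            if c < best_cost:
                best, best_cost = state, c
    return best, best_cost
```

Accepting occasional uphill moves is what lets the chain escape the local optima that trap greedy primer selection.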

  15. Optimal sampling designs for estimation of Plasmodium falciparum clearance rates in patients treated with artemisinin derivatives

    Science.gov (United States)

    2013-01-01

Background The emergence of Plasmodium falciparum resistance to artemisinins in Southeast Asia threatens the control of malaria worldwide. The pharmacodynamic hallmark of artemisinin derivatives is rapid parasite clearance (a short parasite half-life), therefore, the in vivo phenotype of slow clearance defines the reduced susceptibility to the drug. Measurement of parasite counts every six hours during the first three days after treatment has been recommended to measure the parasite clearance half-life, but it remains unclear whether simpler sampling intervals and frequencies might also be sufficient to reliably estimate this parameter. Methods A total of 2,746 parasite density-time profiles were selected from 13 clinical trials in Thailand, Cambodia, Mali, Vietnam, and Kenya. In these studies, parasite densities were measured every six hours until negative after treatment with an artemisinin derivative (alone or in combination with a partner drug). The WWARN Parasite Clearance Estimator (PCE) tool was used to estimate “reference” half-lives from these six-hourly measurements. The effect of four alternative sampling schedules on half-life estimation was investigated, and compared to the reference half-life (time zero, 6, 12, 24 (A1); zero, 6, 18, 24 (A2); zero, 12, 18, 24 (A3) or zero, 12, 24 (A4) hours and then every 12 hours). Statistical bootstrap methods were used to estimate the sampling distribution of half-lives for parasite populations with different geometric mean half-lives. A simulation study was performed to investigate a suite of 16 potential alternative schedules and half-life estimates generated by each of the schedules were compared to the “true” half-life. The candidate schedules in the simulation study included (among others) six-hourly sampling, schedule A1, schedule A4, and a convenience sampling schedule at six, seven, 24, 25, 48 and 49 hours. Results The median (range) parasite half-life for all clinical studies combined was 3.1 (0
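At its core, the half-life being compared across schedules comes from a log-linear fit to the parasite clearance phase. A toy sketch of that calculation, with a synthetic noise-free profile; the real WWARN PCE additionally detects and removes lag and tail phases, which are omitted here:

```python
import numpy as np

def clearance_half_life(times_h, counts):
    """Estimate parasite clearance half-life (hours) from a log-linear fit.

    Simplification of the WWARN Parasite Clearance Estimator: fit
    log(count) = a + b*t on positive counts, then half-life = -ln(2)/b.
    """
    t = np.asarray(times_h, float)
    c = np.asarray(counts, float)
    keep = c > 0                          # the log fit uses positive counts only
    slope, _ = np.polyfit(t[keep], np.log(c[keep]), 1)
    return -np.log(2) / slope

# compare the six-hourly reference schedule with a sparser A4-style schedule
t6 = np.arange(0, 48, 6)                          # 0, 6, ..., 42 h
counts = 1e5 * 0.5 ** (t6 / 3.0)                  # synthetic 3 h half-life
a4 = np.isin(t6, [0, 12, 24, 36])                 # zero, 12, 24 h, then 12-hourly
ref = clearance_half_life(t6, counts)
sparse = clearance_half_life(t6[a4], counts[a4])
```

For a noise-free exponential decay both schedules recover the same half-life; the bootstrap and simulation analyses in the study quantify how much measurement noise and lag/tail phases degrade the sparser schedules.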

  16. Data-driven soft sensor design with multiple-rate sampled data

    DEFF Research Database (Denmark)

    Lin, Bao; Recke, Bodil; Knudsen, Jørgen K.H.

    2007-01-01

    Multi-rate systems are common in industrial processes where quality measurements have slower sampling rate than other process variables. Since inter-sample information is desirable for effective quality control, different approaches have been reported to estimate the quality between samples......, including numerical interpolation, polynomial transformation, data lifting and weighted partial least squares (WPLS). Two modifications to the original data lifting approach are proposed in this paper: reformulating the extraction of a fast model as an optimization problem and ensuring the desired model...... properties through Tikhonov Regularization. A comparative investigation of the four approaches is performed in this paper. Their applicability, accuracy and robustness to process noise are evaluated on a single-input single output (SISO) system. The regularized data lifting and WPLS approaches...
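Tikhonov regularization, used above to enforce the desired model properties when extracting the fast-rate model, is in its generic form a ridge-penalized least-squares problem. A minimal sketch (a generic formulation, not the paper's exact regularized data-lifting scheme):

```python
import numpy as np

def tikhonov_lsq(X, y, lam=1e-2):
    """Solve min_w ||X w - y||^2 + lam * ||w||^2 (Tikhonov / ridge).

    The penalty lam trades a small bias for a well-conditioned,
    noise-robust solution; w = (X'X + lam I)^(-1) X'y.
    """
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
```

As lam approaches zero this reduces to ordinary least squares; increasing lam shrinks the coefficients, which is what stabilizes model identification from the sparse, slow-rate quality measurements.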

  17. Conceptual design and sampling procedures of the biological programme of NuukBasic

    DEFF Research Database (Denmark)

    Aastrup, Peter; Nymand, Josephine; Raundrup, Katrine

fluorescence in three series of plots. Arthropods are sampled by means of yellow pitfall traps as well as in window traps. Microarthropods are sampled in soil cores and extracted in an extractor by gradually heating up soil. The avifauna is monitored with special emphasis on passerine birds. Only few... Vegetation Index (NDVI). The flux of CO2 is measured in natural conditions as well as in manipulations simulating increased temperature, increased cloud cover, shorter growing season, and longer growing season. The effect of increased UV-B radiation on plant stress is studied by measuring chlorophyll fluorescence...

  18. Diurnal Variation and Spatial Distribution Effects on Sulfur Speciation in Aerosol Samples as Assessed by X-Ray Absorption Near-Edge Structure (XANES)

    Directory of Open Access Journals (Sweden)

    Siwatt Pongpiachan

    2012-01-01

Full Text Available This paper focuses on providing new results relating to the impacts of diurnal variation, vertical distribution, and emission source on the sulfur K-edge XANES spectrum of aerosol samples. All aerosol samples used in the diurnal variation experiment were preserved using anoxic preservation stainless cylinders (APSCs) and pressure-controlled glove boxes (PCGBs), which were specially designed to prevent oxidation of the sulfur states in PM10. Further investigation of sulfur K-edge XANES spectra revealed that PM10 samples were dominated by S(VI), even when preserved in anoxic conditions. The “Emission source effect” on the sulfur oxidation state of PM10 was examined by comparing sulfur K-edge XANES spectra collected from various emission sources in southern Thailand, while “Vertical distribution effects” on the sulfur oxidation state of PM10 were examined using samples collected at three different altitudes, from rooftops of the highest buildings in three major cities in Thailand. The analytical results have demonstrated that neither “Emission source” nor “Vertical distribution” appreciably contributes to the characteristic fingerprint of the sulfur K-edge XANES spectrum in PM10.

  19. Algorithm/Architecture Co-design of the Generalized Sampling Theorem Based De-Interlacer.

    NARCIS (Netherlands)

    Beric, A.; Haan, de G.; Sethuraman, R.; Meerbergen, van J.

    2005-01-01

De-interlacing is a major determinant of image quality in a modern display processing chain. The de-interlacing method based on the generalized sampling theorem (GST) applied to motion estimation and motion compensation provides the best de-interlacing results. With HDTV interlaced input material

  20. Sample requirements and design of an inter-laboratory trial for radiocarbon laboratories

    NARCIS (Netherlands)

Bryant, C.; Carmi, Israel; Cook, G.; Gulliksen, S.; Harkness, D.; Heinemeier, J.; McGee, E.; Naysmith, P.; Possnert, G.; van der Plicht, H.; van Strydonck, M.

    2000-01-01

    An on-going inter-comparison programme which is focused on assessing and establishing consensus protocols to be applied in the identification, selection and sub-sampling of materials for subsequent C-14 analysis is described. The outcome of the programme will provide a detailed quantification of the

  1. Modified sampling design for age-0 fish electrofishing at beach habitats

    Czech Academy of Sciences Publication Activity Database

    Janáč, Michal; Jurajda, Pavel

    2010-01-01

    Roč. 30, č. 5 (2010), s. 1210-1220 ISSN 0275-5947 R&D Projects: GA MŠk LC522 Institutional research plan: CEZ:AV0Z60930519 Keywords : young-of-the-year * electrofishing * sampling Subject RIV: EH - Ecology, Behaviour Impact factor: 1.203, year: 2010

  2. Sample design and gamma-ray counting strategy of neutron activation system for triton burnup measurements in KSTAR

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Jungmin [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Cheon, Mun Seong [ITER Korea, National Fusion Research Institute, Daejeon (Korea, Republic of); Chung, Kyoung-Jae, E-mail: jkjlsh1@snu.ac.kr [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of); Hwang, Y.S. [Department of Energy System Engineering, Seoul National University, Seoul (Korea, Republic of)

    2016-11-01

Highlights: • Sample design for triton burnup ratio measurement is carried out. • Samples for 14.1 MeV neutron measurements are selected for KSTAR. • Si and Cu are the most suitable materials for d-t neutron measurements. • Appropriate γ-ray counting strategies for each selected sample are established. - Abstract: For the purpose of triton burnup measurements in Korea Superconducting Tokamak Advanced Research (KSTAR) deuterium plasmas, appropriate neutron activation system (NAS) samples for 14.1 MeV d-t neutron measurements have been designed and a gamma-ray counting strategy established. Neutronics calculations are performed with the MCNP5 neutron transport code for the KSTAR neutral beam heated deuterium plasma discharges. Based on those calculations and the assumed d-t neutron yield, the activities induced by d-t neutrons are estimated with the inventory code FISPACT-2007 for candidate sample materials: Si, Cu, Al, Fe, Nb, Co, Ti, and Ni. It is found that Si, Cu, Al, and Fe are suitable for the KSTAR NAS in terms of the minimum detectable activity (MDA) calculated based on the standard deviation of blank measurements. Considering background gamma-rays radiated from surrounding structures activated by thermalized fusion neutrons, an appropriate gamma-ray counting strategy for each selected sample is established.

  3. Time‐of‐flight secondary ion mass spectrometry imaging of biological samples with delayed extraction for high mass and high spatial resolutions

    Science.gov (United States)

    Vanbellingen, Quentin P.; Elie, Nicolas; Eller, Michael J.; Della‐Negra, Serge; Touboul, David

    2015-01-01

    Rationale In Time‐of‐Flight Secondary Ion Mass Spectrometry (TOF‐SIMS), pulsed and focused primary ion beams enable mass spectrometry imaging, a method which is particularly useful to map various small molecules such as lipids at the surface of biological samples. When using TOF‐SIMS instruments, the focusing modes of the primary ion beam delivered by liquid metal ion guns can provide either a mass resolution of several thousand or a sub‐µm lateral resolution, but the combination of both is generally not possible. Methods With a TOF‐SIMS setup, a delayed extraction applied to secondary ions has been studied extensively on rat cerebellum sections in order to compensate for the effect of long primary ion bunches. Results The use of a delayed extraction has been proven to be an efficient solution leading to unique features, i.e. a mass resolution up to 10000 at m/z 385.4 combined with a lateral resolution of about 400 nm. Simulations of ion trajectories confirm the experimental determination of optimal delayed extraction and allow understanding of the behavior of ions as a function of their mass‐to‐charge ratio. Conclusions Although the use of a delayed extraction has been well known for many years and is very popular in MALDI, it is much less used in TOF‐SIMS. Its full characterization now enables secondary ion images to be recorded in a single run with a submicron spatial resolution and with a mass resolution of several thousand. This improvement is very useful when analyzing lipids on tissue sections, or rare, precious, or very small size samples. © 2015 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons Ltd. PMID:26395603

  4. Spatial trends of polybrominated diphenyl ethers in avian species: Utilization of stored samples in the Environmental Specimen Bank of Ehime University (es-Bank)

    International Nuclear Information System (INIS)

    Kunisue, Tatsuya; Higaki, Yumi; Isobe, Tomohiko; Takahashi, Shin; Subramanian, Annamalai; Tanabe, Shinsuke

    2008-01-01

    The present study determined concentrations and patterns of polybrominated diphenyl ethers (PBDEs) and polychlorinated biphenyls (PCBs) in specimens of open sea, and Japanese coastal and inland avian species, which have been stored in the Environmental Specimen Bank of Ehime University (es-Bank), to examine the spatial trends. PBDEs and PCBs were detected in all the muscle samples analyzed, suggesting that PBDE pollution has spread even to the remote open sea areas, as in the case of PCBs. Japanese coastal and inland birds accumulated higher concentrations of PBDEs than open sea birds. In addition, higher PBDE/PCB concentration ratios were observed in Japanese coastal and inland birds than in open sea birds, indicating the input of PBDEs into the Japanese terrestrial environment. Compositions of PBDEs varied among avian species with a predominance of BDE47 or BDE153. This could be due to differences in their habitat, food habit and/or biotransformation capacity of PBDEs. - Open sea birds have been exposed to PBDEs, but the accumulation levels were lower than those in Japanese coastal and inland birds

  5. A SAMPLE STUDY ON THE IMPORTANCE AND THE EVALUATION OF THREE DIMENSIONAL EXPRESSION TECHNIQUES IN THE EDUCATION OF PLANTING DESIGN

    Directory of Open Access Journals (Sweden)

    Banu Çiçek Kurdoğlu

    2008-04-01

Full Text Available Drafts developed with graphical expression techniques, and models that begin in abstract form and gradually become concrete, are used to exhibit the targeted images in the design process, which is also a process of mental development. Among the biggest difficulties beginner architecture students face in the design process are commenting on the products they design and their spatial relationships, and expressing them in two- or three-dimensional models. The expression and modelling techniques used in this process are therefore very important. In this study, a lesson programme enriched with two- and three-dimensional model expression techniques was developed and applied for planting design education, which is of vital significance in landscape architecture departments. Advantages and disadvantages of the programme were evaluated and some suggestions were offered. Consequently, the importance of and need for three-dimensional expression techniques were re-emphasized, and the efficiency of the modelling technique used in the study was assessed under present-day conditions in Turkey.

  6. Data-driven soft sensor design with multiple-rate sampled data: a comparative study

    DEFF Research Database (Denmark)

    Lin, Bao; Recke, Bodil; Schmidt, Torben M.

    2009-01-01

    to design quality soft sensors for cement kiln processes using data collected from a simulator and a plant log system. Preliminary results reveal that the WPLS approach is able to provide accurate one-step-ahead prediction. The regularized data lifting technique predicts the product quality of cement kiln...

  7. Addressing Underrepresentation in Sex Work Research: Reflections on Designing a Purposeful Sampling Strategy.

    Science.gov (United States)

    Bungay, Vicky; Oliffe, John; Atchison, Chris

    2016-06-01

    Men, transgender people, and those working in off-street locales have historically been underrepresented in sex work health research. Failure to include all sections of sex worker populations precludes comprehensive understandings about a range of population health issues, including potential variations in the manifestation of such issues within and between population subgroups, which in turn can impede the development of effective services and interventions. In this article, we describe our attempts to define, determine, and recruit a purposeful sample for a qualitative study examining the interrelationships between sex workers' health and the working conditions in the Vancouver off-street sex industry. Detailed is our application of ethnographic mapping approaches to generate information about population diversity and work settings within distinct geographical boundaries. Bearing in mind the challenges and the overwhelming discrimination sex workers experience, we scope recommendations for safe and effective purposeful sampling inclusive of sex workers' heterogeneity. © The Author(s) 2015.

  8. Statistical Methods and Sampling Design for Estimating Step Trends in Surface-Water Quality

    Science.gov (United States)

    Hirsch, Robert M.

    1988-01-01

    This paper addresses two components of the problem of estimating the magnitude of step trends in surface water quality. The first is finding a robust estimator appropriate to the data characteristics expected in water-quality time series. The J. L. Hodges-E. L. Lehmann class of estimators is found to be robust in comparison to other nonparametric and moment-based estimators. A seasonal Hodges-Lehmann estimator is developed and shown to have desirable properties. Second, the effectiveness of various sampling strategies is examined using Monte Carlo simulation coupled with application of this estimator. The simulation is based on a large set of total phosphorus data from the Potomac River. To assure that the simulated records have realistic properties, the data are modeled in a multiplicative fashion incorporating flow, hysteresis, seasonal, and noise components. The results demonstrate the importance of balancing the length of the two sampling periods and balancing the number of data values between the two periods.
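The non-seasonal Hodges-Lehmann step-trend estimator described above is simply the median of all between-period pairwise differences; a minimal sketch:

```python
import numpy as np

def hodges_lehmann_step(before, after):
    """Hodges-Lehmann estimate of a step trend: the median of all
    pairwise differences (after_j - before_i). Robust to the outliers
    and skewed distributions typical of water-quality records."""
    before = np.asarray(before, float)
    after = np.asarray(after, float)
    return float(np.median(after[:, None] - before[None, :]))
```

The seasonal variant developed in the paper applies the same pairwise-difference idea within each season and combines the per-season differences before taking the median, so that seasonal cycles are not mistaken for a step change.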

  9. Control sample design using a geodemographic discriminator: An application of Super Profiles

    Science.gov (United States)

    Brown, Peter J. B.; McCulloch, Peter G.; Williams, Evelyn M. I.; Ashurst, Darren C.

    The development and application of an innovative sampling framework for use in a British study of the early detection of gastric cancer are described. The Super Profiles geodemographic discriminator is used in the identification of geographically distinct control and contrast areas from which samples of cancer registry case records may be drawn for comparison with the records of patients participating in the gastric cancer intervention project. Preliminary results of the application of the framework are presented and confirm its effectiveness in satisfactorily reflecting known patterns of variation in cancer occurrence by age, gender and social class. The method works well for cancers with a known and clear social gradient, such as lung and breast cancer, moderately well for gastric cancer and somewhat less well for oesophageal cancer, where the social class gradient is less clear.

  10. Sampling plan design and analysis for a low level radioactive waste disposal program

    International Nuclear Information System (INIS)

    Hassig, N.L.; Wanless, J.W.

    1989-01-01

Low-level wastes that are candidates for BRC (below regulatory concern) disposal must be subjected to an extensive monitoring program to ensure the wastes meet (potential) bulk property and contamination concentration BRC criteria for disposal. This paper addresses the statistical implications of using various methods to verify BRC criteria. While surface and volumetric monitoring each have their advantages and disadvantages, a dual, sequential monitoring process is the preferred choice from a statistical reliability perspective. With dual monitoring, measurements on the contamination are verifiable, and sufficient to allow for a complete characterization of the wastes. As these characterizations become more reliable and stable, something less than 100% sampling may be possible for release of wastes for BRC disposal. This paper provides a survey of the issues involved in the selection of a monitoring and sampling program for the disposal of BRC wastes

  11. A weighted sampling algorithm for the design of RNA sequences with targeted secondary structure and nucleotide distribution.

    Science.gov (United States)

    Reinharz, Vladimir; Ponty, Yann; Waldispühl, Jérôme

    2013-07-01

The design of RNA sequences folding into predefined secondary structures is a milestone for many synthetic biology and gene therapy studies. Most current software uses similar local search strategies (i.e. a random seed is progressively adapted to acquire the desired folding properties) and, more importantly, does not allow the user to explicitly control the nucleotide distribution, such as the GC-content, of the sequences. However, the latter is an important criterion for large-scale applications, as it could presumably be used to design sequences with better transcription rates and/or structural plasticity. In this article, we introduce IncaRNAtion, a novel algorithm to design RNA sequences folding into target secondary structures with a predefined nucleotide distribution. IncaRNAtion uses a global sampling approach and weighted sampling techniques. We show that our approach is fast (i.e. running time comparable to or better than local search methods), seedless (we remove the bias of the seed in local search heuristics) and successfully generates high-quality sequences (i.e. thermodynamically stable) for any GC-content. To complete this study, we develop a hybrid method combining our global sampling approach with local search strategies. Remarkably, our glocal methodology overcomes both local and global approaches for sampling sequences with a specific GC-content and target structure. IncaRNAtion is available at csb.cs.mcgill.ca/incarnation/. Supplementary data are available at Bioinformatics online.
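The principle of weighting nucleotide choices to steer GC-content, which IncaRNAtion embeds in a global Boltzmann sampling scheme, can be illustrated with a much simpler structure-compatible sampler. This is illustrative only, not the published algorithm; it ignores thermodynamic stability entirely:

```python
import random

# canonical and wobble base pairs allowed at paired positions
PAIRS = ["GC", "CG", "AU", "UA", "GU", "UG"]

def sample_sequence(structure, gc_weight=1.0, seed=None):
    """Sample a sequence compatible with a dot-bracket structure, biasing
    G/C choices by gc_weight (>1 raises the expected GC-content)."""
    rng = random.Random(seed)
    seq = [None] * len(structure)
    stack = []
    for i, s in enumerate(structure):
        if s == "(":
            stack.append(i)
        elif s == ")":
            j = stack.pop()
            # weight GC/CG pairs more heavily than AU/UA/GU/UG
            w = [gc_weight if set(p) <= {"G", "C"} else 1.0 for p in PAIRS]
            p = rng.choices(PAIRS, weights=w)[0]
            seq[j], seq[i] = p[0], p[1]
        else:
            # unpaired position: weight G and C individually
            w = [gc_weight if b in "GC" else 1.0 for b in "ACGU"]
            seq[i] = rng.choices("ACGU", weights=w)[0]
    return "".join(seq)
```

Every sample is compatible with the target structure by construction; raising gc_weight shifts the GC-content distribution upward. IncaRNAtion's contribution is to choose such weights adaptively so the sampled ensemble hits an exact target GC-content while also being weighted by folding energy.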

  12. Baseline Design Compliance Matrix for the Type 4 In Situ Vapor Samplers and Supernate and Sludge and Soft Saltcake Grab Sampling

    International Nuclear Information System (INIS)

    BOGER, R.M.

    2000-01-01

The DOE has identified a need to sample vapor space, exhaust ducts, supernate, sludge, and soft saltcake in waste tanks that store radioactive waste. This document provides the Design Compliance Matrix (DCM) for the Type 4 In-Situ Vapor Sampling (ISVS) system and the Grab Sampling System that are used for completing this type of sampling function. The DCM identifies the design requirements and the source of the requirements for the Type 4 ISVS system and the Grab Sampling system. The DCM is a single-source compilation of design requirements for sampling and sampling support equipment and supports the configuration management of these systems.

  13. Heteronuclear Micro-Helmholtz Coil Facilitates µm-Range Spatial and Sub-Hz Spectral Resolution NMR of nL-Volume Samples on Customisable Microfluidic Chips.

    Directory of Open Access Journals (Sweden)

    Nils Spengler

    Full Text Available We present a completely revised generation of a modular micro-NMR detector, featuring an active sample volume of ∼ 100 nL, and an improvement of 87% in probe efficiency. The detector is capable of rapidly screening different samples using exchangeable, application-specific, MEMS-fabricated, microfluidic sample containers. In contrast to our previous design, the sample holder chips can be simply sealed with adhesive tape, with excellent adhesion due to the smooth surfaces surrounding the fluidic ports, and so withstand pressures of ∼2.5 bar, while simultaneously enabling high spectral resolution up to 0.62 Hz for H2O, due to its optimised geometry. We have additionally reworked the coil design and fabrication processes, replacing liquid photoresists by dry film stock, whose final thickness does not depend on accurate volume dispensing or precise levelling during curing. We further introduced mechanical alignment structures to avoid time-intensive optical alignment of the chip stacks during assembly, while we exchanged the laser-cut, PMMA spacers by diced glass spacers, which are not susceptible to melting during cutting. Doing so led to an overall simplification of the entire fabrication chain, while simultaneously increasing the yield, due to an improved uniformity of thickness of the individual layers, and in addition, due to more accurate vertical positioning of the wirebonded coils, now delimited by a post base plateau. We demonstrate the capability of the design by acquiring a 1H spectrum of ∼ 11 nmol sucrose dissolved in D2O, where we achieved a linewidth of 1.25 Hz for the TSP reference peak. Chemical shift imaging experiments were further recorded from voxel volumes of only ∼ 1.5 nL, which corresponded to amounts of just 1.5 nmol per voxel for a 1 M concentration. To extend the micro-detector to other nuclei of interest, we have implemented a trap circuit, enabling heteronuclear spectroscopy, demonstrated by two 1H/13C 2D HSQC

  14. Exploring the utility of quantitative network design in evaluating Arctic sea ice thickness sampling strategies

    OpenAIRE

    Kaminski, T.; Kauker, F.; Eicken, H.; Karcher, M.

    2015-01-01

    We present a quantitative network design (QND) study of the Arctic sea ice-ocean system using a software tool that can evaluate hypothetical observational networks in a variational data assimilation system. For a demonstration, we evaluate two idealised flight transects derived from NASA's Operation IceBridge airborne ice surveys in terms of their potential to improve ten-day to five-month sea-ice forecasts. As target regions for the forecasts we select the Chukchi Sea, a...
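    Quantitative network design ranks hypothetical observational networks by the forecast-uncertainty reduction they deliver. The authors use a full variational assimilation system; the following is only a minimal linear-Gaussian sketch of the underlying idea, in which each candidate network is summarised by the Jacobian of its observations with respect to the control variables (all matrices here are illustrative assumptions).

```python
import numpy as np

def posterior_covariance(prior_cov, jacobian, obs_cov):
    """Posterior covariance of the control variables after assimilating
    observations whose sensitivity to the controls is `jacobian`
    (one row per observation), in the linear-Gaussian approximation."""
    prior_inv = np.linalg.inv(prior_cov)
    obs_inv = np.linalg.inv(obs_cov)
    return np.linalg.inv(prior_inv + jacobian.T @ obs_inv @ jacobian)

def uncertainty_reduction(prior_cov, jacobian, obs_cov):
    """Fractional reduction in the standard deviation of each control
    variable achieved by a hypothetical network; larger is better."""
    post = posterior_covariance(prior_cov, jacobian, obs_cov)
    return 1.0 - np.sqrt(np.diag(post)) / np.sqrt(np.diag(prior_cov))
```

Two candidate flight transects would be compared by evaluating `uncertainty_reduction` for their respective Jacobians against the same prior.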

  15. Design and performance of a multi-channel, multi-sampling, PSD-enabling integrated circuit

    International Nuclear Information System (INIS)

    Engel, G.L.; Hall, M.J.; Proctor, J.M.; Elson, J.M.; Sobotka, L.G.; Shane, R.; Charity, R.J.

    2009-01-01

    This paper presents the design and test results of an eight-channel prototype integrated circuit chip intended to greatly simplify the pulse-processing electronics needed for large arrays of scintillation detectors. Because the chip design employs (user-controlled) multi-region charge integration, particle identification is incorporated into the basic design. Each channel on the chip also contains a time-to-voltage converter which provides relative time information. The pulse-height integrals and the relative time are all stored on capacitors and are either reset, after a user-controlled time, or sequentially read out if acquisition of the event is desired. Each of the three pulse-height sub-channels consists of a gated integrator with eight programmable charging rates and an externally programmable gate generator that defines the start (with four time ranges) and width (with four time ranges) of the gate relative to an external discriminator signal. The chip supports three triggering modes, two time ranges, two power modes, and produces four sparsified analog pulse trains (three for the integrators and another for the time) with synchronized addresses for off-chip digitization with a pipelined ADC. The eight-channel prototype chip occupies an area of 2.8 mm × 5.7 mm, dissipates 60 mW (low-power mode), and was fabricated in the AMI 0.5-μm process (C5N).
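    The multi-region charge integration that gives the chip its particle-identification (PSD) capability can be illustrated in software. The sketch below is not the chip's analog circuitry: gate positions, gate widths, and the tail-to-total metric are illustrative choices, but they show how integrating a pulse over separate regions separates particle species with different scintillation decay components.

```python
def gated_integral(waveform, start, width):
    """Sum of samples in the gate [start, start + width): the digital
    analogue of one gated-integrator sub-channel."""
    return float(sum(waveform[start:start + width]))

def psd_ratio(waveform, fast_gate, slow_gate):
    """Pulse-shape discrimination by multi-region charge integration:
    fraction of the collected charge arriving in a delayed ('slow')
    gate. Gates are (start, width) tuples in sample units."""
    fast = gated_integral(waveform, *fast_gate)
    slow = gated_integral(waveform, *slow_gate)
    total = fast + slow
    return slow / total if total > 0 else 0.0
```

A pulse with a long scintillation tail yields a larger ratio than a prompt pulse of the same height, which is the basis for separating particle types.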

  16. Design and performance of a multi-channel, multi-sampling, PSD-enabling integrated circuit

    Energy Technology Data Exchange (ETDEWEB)

    Engel, G.L., E-mail: gengel@siue.ed [Department of Electrical and Computer Engineering, VLSI Design Research Laboratory, Southern Illinois University Edwardsville, Engineering Building, Room 3043 Edwardsville, IL 62026 1081 (United States); Hall, M.J.; Proctor, J.M. [Department of Electrical and Computer Engineering, VLSI Design Research Laboratory, Southern Illinois University Edwardsville, Engineering Building, Room 3043 Edwardsville, IL 62026 1081 (United States); Elson, J.M.; Sobotka, L.G.; Shane, R.; Charity, R.J. [Departments of Chemistry and Physics, Washington University, Saint Louis, MO 63130 (United States)

    2009-12-21

    This paper presents the design and test results of an eight-channel prototype integrated circuit chip intended to greatly simplify the pulse-processing electronics needed for large arrays of scintillation detectors. Because the chip design employs (user-controlled) multi-region charge integration, particle identification is incorporated into the basic design. Each channel on the chip also contains a time-to-voltage converter which provides relative time information. The pulse-height integrals and the relative time are all stored on capacitors and are either reset, after a user-controlled time, or sequentially read out if acquisition of the event is desired. Each of the three pulse-height sub-channels consists of a gated integrator with eight programmable charging rates and an externally programmable gate generator that defines the start (with four time ranges) and width (with four time ranges) of the gate relative to an external discriminator signal. The chip supports three triggering modes, two time ranges, two power modes, and produces four sparsified analog pulse trains (three for the integrators and another for the time) with synchronized addresses for off-chip digitization with a pipelined ADC. The eight-channel prototype chip occupies an area of 2.8 mm × 5.7 mm, dissipates 60 mW (low-power mode), and was fabricated in the AMI 0.5-μm process (C5N).

  17. High-resolution space-time characterization of convective rain cells: implications on spatial aggregation and temporal sampling operated by coarser resolution instruments

    Science.gov (United States)

    Marra, Francesco; Morin, Efrat

    2017-04-01

    Forecasting the occurrence of flash floods and debris flows is fundamental to saving lives and protecting infrastructure and property. These natural hazards are generated by high-intensity convective storms, on space-time scales that cannot be properly monitored by conventional instrumentation. Consequently, a number of early-warning systems are nowadays based on remote sensing precipitation observations, e.g. from weather radars or satellites, which have proved effective in a wide range of situations. However, the uncertainty affecting rainfall estimates remains an important issue undermining the operational use of early-warning systems. The uncertainty in remote sensing estimates results from (a) an instrumental component, intrinsic to the measurement operation, and (b) a discretization component, caused by the discretization of the continuous rainfall process. Improved understanding of these sources of uncertainty will provide crucial information to modelers and decision makers. This study aims at advancing knowledge of the discretization component (b). To do so, we take advantage of an extremely high resolution X-band weather radar (60 m, 1 min) recently installed in the Eastern Mediterranean. The instrument monitors a semiarid-to-arid transition area also covered by an accurate C-band weather radar and by a relatively sparse rain gauge network (roughly 1 gauge per 450 km2). Radar quantitative precipitation estimation includes corrections reducing the errors due to ground echoes, orographic beam blockage and attenuation of the signal in heavy rain. Intense, convection-rich flooding events that recently occurred in the area serve as study cases. We (i) describe in very high detail the spatiotemporal characteristics of the convective cores, and (ii) quantify the uncertainty due to spatial aggregation (spatial discretization) and temporal sampling (temporal discretization) operated by coarser resolution remote sensing instruments. We show that instantaneous rain intensity
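    The spatial-aggregation and temporal-sampling effects described above can be emulated numerically from a fine-resolution field. The following is a minimal illustration, not the authors' processing chain: block-averaging stands in for a coarser instrument's pixel, subsampling for a longer revisit interval, and the peak-loss metric is an assumed diagnostic.

```python
import numpy as np

def aggregate_space(field, k):
    """Block-average a fine-resolution rain field over k-by-k blocks,
    mimicking the spatial discretization of a coarser instrument."""
    ny, nx = field.shape
    f = field[:ny - ny % k, :nx - nx % k]
    return f.reshape(ny // k, k, nx // k, k).mean(axis=(1, 3))

def sample_time(series, step):
    """Keep one instantaneous record every `step` scans, mimicking a
    coarser temporal sampling interval."""
    return series[::step]

def peak_underestimation(field, k):
    """Relative loss of peak intensity caused by spatial aggregation
    alone (0 for a spatially uniform field)."""
    coarse = aggregate_space(field, k)
    return 1.0 - coarse.max() / field.max()
```

For a convective core much smaller than the coarse pixel, block-averaging can remove most of the peak intensity, which is the discretization effect the study quantifies.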

  18. Design and Development Computer-Based E-Learning Teaching Material for Improving Mathematical Understanding Ability and Spatial Sense of Junior High School Students

    Science.gov (United States)

    Nurjanah; Dahlan, J. A.; Wibisono, Y.

    2017-02-01

    This paper aims to design and develop computer-based e-learning teaching materials for improving the mathematical understanding ability and spatial sense of junior high school students. The particular aims are (1) producing a teaching material design, an evaluation model, and an instrument to measure the mathematical understanding ability and spatial sense of junior high school students; (2) conducting trials of the computer-based e-learning teaching material model, assessment, and instrument; (3) completing the computer-based e-learning teaching material models, assessment, and instrument; and (4) delivering, as the research product, computer-based e-learning teaching materials in the form of an interactive learning disc. The research method used in this study is developmental research, conducted through thought experiments and instruction experiments. The results showed that the teaching materials could be used very well. This is based on the validation of the computer-based e-learning teaching materials by 5 multimedia experts. The five validators gave consistent judgements of the face and content validity of each test item for mathematical understanding ability and spatial sense. The reliability coefficients of the mathematical understanding ability and spatial sense tests are 0.929 and 0.939, respectively, which is very high, while the validity of both tests meets high and very high criteria.
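    The abstract reports test reliabilities of 0.929 and 0.939 without naming the estimator; a common choice for multi-item tests is Cronbach's alpha, sketched below under that assumption (the score matrix is invented for illustration).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    using sample (ddof=1) variances."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)
```

Values above roughly 0.9, as reported here, are conventionally read as very high internal consistency.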

  19. Design and relevant sample calculations for a neutral particle energy diagnostic based on time of flight

    Energy Technology Data Exchange (ETDEWEB)

    Cecconello, M

    1999-05-01

    Extrap T2 will be equipped with a neutral particle energy diagnostic based on the time-of-flight technique. In this report, the expected neutral fluxes for Extrap T2 are estimated and discussed in order to determine the feasibility and the limits of such a diagnostic. These estimates are based on a 1D model of the plasma. The input parameters of the model are the radial density and temperature profiles of electrons and ions and the density of neutrals at the edge and in the centre of the plasma. The atomic processes included in the model are charge exchange and electron-impact ionization. The results indicate that the plasma attenuation length varies from a/5 to a, a being the minor radius. Differential neutral fluxes, as well as the estimated power losses due to CX processes (2% of the input power), are in agreement with experimental results obtained in similar devices. The expected impurity influxes vary from 10{sup 14} to 10{sup 11} cm{sup -2}s{sup -1}. The neutral particle detection and acquisition systems are discussed. The maximum detectable energy varies from 1 to 3 keV depending on the flight distance d. The time resolution is 0.5 ms. Output signals from the waveform recorder are foreseen in the range 0-200 mV. An 8-bit waveform recorder with a 2 MHz sampling frequency and 100K samples of memory capacity is the minimum requirement for the acquisition system. 20 refs, 19 figs
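    The link between flight distance and maximum detectable energy follows from non-relativistic kinematics. The sketch below assumes hydrogen neutrals and standard constant values; the actual 1-3 keV ceiling depends on the diagnostic's timing resolution and geometry, which this sketch only parameterises.

```python
from math import sqrt

PROTON_MASS = 1.6726e-27  # kg; hydrogen neutrals assumed
EV = 1.602e-19            # J per eV

def energy_from_tof(distance_m, flight_time_s, mass_kg=PROTON_MASS):
    """Kinetic energy (eV) of a neutral inferred from its time of
    flight over a known distance: E = m (d/t)^2 / 2."""
    v = distance_m / flight_time_s
    return 0.5 * mass_kg * v * v / EV

def max_detectable_energy(distance_m, min_resolvable_time_s,
                          mass_kg=PROTON_MASS):
    """The fastest resolvable particle sets the energy ceiling: a
    shorter flight path or coarser timing lowers it."""
    return energy_from_tof(distance_m, min_resolvable_time_s, mass_kg)
```

Lengthening the flight distance d raises the maximum detectable energy for a fixed minimum resolvable flight time, consistent with the report's statement that the ceiling varies with d.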

  20. Spreading Design of Radioactivity in Sea Water, Algae and Fish Samples in the Coastal of Muria Peninsula Area

    International Nuclear Information System (INIS)

    Sutjipto; Muryono; Sumining

    2000-01-01

    The spreading pattern of radioactivity in sea water, brown algae (Phaeophyceae) and kerapu fish (Epinephelus) samples in the coastal area of the Muria peninsula has been studied. This research was designed not only to map the spread of each type of radioactivity, but also to relate that spread to the content of Pu-239 and Cs-137. Sample collection, preparation and analysis were based on the standard procedures for environmental radioactivity analysis. The instruments used for the radioactivity analysis were an alpha counter with a ZnS detector, a low-level beta counter (modified at P3TM-BATAN) with a GM detector, and a gamma spectrometer with a Ge(Li) detector. The alpha radioactivities obtained for sea water, algae and fish fluctuated at the level of the natural background. The radionuclide Pu-239 was not detected in the samples, because its concentration was below the minimum detectable value, which was 1.10 Bq/g for algae and fish and 0.07 Bq/mL for sea water. The highest alpha radioactivity, 1.56 x 10{sup -3} Bq/g, was obtained in kerapu fish; the highest beta radioactivity, 1.75 x 10{sup 2} mBq/L, in sea water; the highest gamma radioactivity of K-40, 3.72 x 10{sup -2} Bq/g, in brown algae; and the gamma radioactivity of Tl-208 in the fish mentioned above was 1.35 x 10{sup -2} Bq/g. No gamma peaks of Cs-137 were detected with the gamma counter, so the radionuclide Cs-137 was not present in the samples. In summary, across the coastal area of the Muria peninsula the highest alpha radioactivity was found in kerapu fish, the highest beta radioactivity in sea water, and the highest gamma radioactivity in brown algae and kerapu fish. (author)

  1. Interior-exterior connection in architectural design based on the incorporation of spatial in between layers. Study of four architectural projects

    Directory of Open Access Journals (Sweden)

    Krstić Hristina

    2016-01-01

    Full Text Available Different spatial layers in the architectural structure of a building can create particular spatial relations and an architectural space that cannot be defined as an inner space or as an outer space, but one which has the characteristics of both. This space, which can be called “in between space”, appears as the result of a specific design concept in which the architectural composition is created by gradual insertion of volumes one inside another, like a box that is placed inside a box, inside of which is placed another smaller box and so on. The incorporation of various layers in the spatial arrangement of volumes in certain architectural compositions can be conceived as a possible approach in connecting the interior and exterior. This kind of conceptual design distinguishes itself from the common approach by its specific architecture that offers richness, variety, complexity and unique perception of space, thereby increasing its value. The paper investigates this particular concept through the analysis of four residential houses (Villa Le Lac by Le Corbusier, Solar House by Oswald Mathias Ungers, House N by Sou Fujimoto and Guerrero House by Alberto Campo Baeza, and it strives to find out the concept’s use and advantages, all with the aim of opening up new possibilities in the design of buildings and enriching the design process.

  2. Sampling for quality assurance of grading decisions in diabetic retinopathy screening: designing the system to detect errors.

    Science.gov (United States)

    Slattery, Jim

    2005-01-01

    To evaluate various designs for a quality assurance system to detect and control human errors in a national screening programme for diabetic retinopathy. A computer simulation was performed of some possible ways of sampling the referral decisions made during grading and of different criteria for initiating more intensive QA investigations. The effectiveness of QA systems was assessed by the ability to detect a grader making occasional errors in referral. Substantial QA sample sizes are needed to ensure against inappropriate failure to refer. Detection of a grader who failed to refer one in ten cases can be achieved with a probability of 0.58 using an annual sample size of 300 and 0.77 using a sample size of 500. An unmasked verification of a sample of non-referrals by a specialist is the most effective method of internal QA for the diabetic retinopathy screening programme. Preferential sampling of those with some degree of disease may improve the efficiency of the system.
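    The reported detection probabilities (0.58 at a sample of 300, 0.77 at 500) come from the paper's simulation, which encodes programme-specific assumptions about error rates and disease prevalence. The qualitative dependence of detection on sample size can be sketched with a simple binomial model; `p_error`, the probability that a sampled non-referral is an error, is a stand-in parameter, not a figure from the study.

```python
from math import ceil, log

def detection_probability(sample_size, p_error):
    """Probability that an annual QA sample contains at least one
    erroneous non-referral, assuming each sampled grading decision is
    independently wrong with probability p_error."""
    return 1.0 - (1.0 - p_error) ** sample_size

def required_sample_size(p_error, target_detection):
    """Smallest annual QA sample achieving the target detection
    probability under the same independence assumption."""
    return ceil(log(1.0 - target_detection) / log(1.0 - p_error))
```

The model reproduces the abstract's qualitative finding: detection rises steeply with sample size, and substantial samples are needed when per-case error probabilities are small.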

  3. Spatial and seasonal variations of pesticide contamination in agricultural soils and crop samples from an intensive horticulture area of Hohhot, North-West China.

    Science.gov (United States)

    Zhang, Fujin; He, Jiang; Yao, Yiping; Hou, Dekun; Jiang, Cai; Zhang, Xinxin; Di, Caixia; Otgonbayar, Khureldavaa

    2013-08-01

    The spatial variability and temporal trend in concentrations of the organochlorine pesticides (OCPs) hexachlorocyclohexane (HCH) and dichlorodiphenyltrichloroethane (DDT) in soils and agricultural crops were investigated in an intensive horticulture area of Hohhot, North-West China, from 2008 to 2011. The most frequently found and abundant pesticides were DDT and its metabolites (p,p'-DDE, p,p'-DDT, o,p'-DDT and p,p'-DDD). Total DDT concentrations ranged from ND (not detectable) to 507.41 ng/g and were higher than the total HCH concentrations, measured in the range of 4.84-281.44 ng/g. There were significantly positive correlations between the ∑DDT and ∑HCH concentrations (r² > 0.74) in soils, but no significant correlation was found between the concentrations of OCPs in soils and clay content, while a relatively strong correlation was found between total OCP concentrations and total organic carbon (TOC). β-HCH was the main isomer of the HCHs and was detected in all samples; the maximum proportion of β-HCH relative to ∑HCHs (mean value 54%) suggests its persistence. The α/γ-HCH ratio was between 0.89 and 5.39, which signifies the combined influence of technical HCHs and lindane. Low p,p'-DDE/p,p'-DDT ratios at sites N1, N3 and N9 reflect fresh input of DDTs, while the relatively high o,p'-DDT/p,p'-DDT ratios indicate the agricultural application of dicofol. Ratios of DDT/(DDE+DDD) in soils do not indicate recent inputs of DDT into the Hohhot farmland soil environment. Seasonal variations of OCPs featured higher concentrations in autumn and lower concentrations in spring, likely associated with their temperature-driven re-volatilization and the application of dicofol in late spring.
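    The source apportionment above rests on diagnostic isomer ratios. A minimal sketch of that logic follows; the thresholds are approximate values commonly cited in the OCP literature, not figures calibrated by this study.

```python
def hch_source(alpha_hch, gamma_hch):
    """alpha/gamma-HCH ratio: technical HCH mixtures typically give
    roughly 4-7, while lindane (enriched gamma-HCH) pushes the ratio
    toward zero. Thresholds are approximate literature values."""
    r = alpha_hch / gamma_hch
    if r < 4:
        return r, "lindane influence (or mixed sources)"
    if r <= 7:
        return r, "technical HCH"
    return r, "aged residues / long-range transport"

def ddt_input_is_fresh(ddt, dde, ddd, threshold=1.0):
    """DDT/(DDE+DDD): parent compound dominating its degradation
    products (ratio above ~1) is commonly read as recent DDT input."""
    return ddt / (dde + ddd) > threshold
```

The study's reported α/γ-HCH range of 0.89-5.39 spans both regimes, which is exactly why the authors infer a combined influence of technical HCH and lindane.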

  4. Effects of physical activity calorie expenditure (PACE) labeling: study design and baseline sample characteristics.

    Science.gov (United States)

    Viera, Anthony J; Tuttle, Laura; Olsson, Emily; Gras-Najjar, Julie; Gizlice, Ziya; Hales, Derek; Linnan, Laura; Lin, Feng-Chang; Noar, Seth M; Ammerman, Alice

    2017-09-12

    Obesity and physical inactivity are responsible for more than 365,000 deaths per year and contribute substantially to rising healthcare costs in the US, making clear the need for effective public health interventions. Calorie labeling on menus has been implemented to guide consumer ordering behaviors, but its effects on calories purchased have been minimal. In this project, we tested the effect of physical activity calorie expenditure (PACE) food labels on actual point-of-decision food purchasing behavior as well as physical activity. Using a two-group interrupted time series cohort study design in three worksite cafeterias, one cafeteria was assigned to the intervention condition, and the other two served as controls. Calories from food purchased in the cafeteria were assessed from photographs of meals (accompanied by notes made on-site) using a standardized calorie database and portion size-estimation protocol. Primary outcomes will be average calories purchased and minutes of moderate to vigorous physical activity (MVPA) by individuals in the cohorts. We will compare pre-post changes in study outcomes between study groups using piecewise generalized linear mixed model regressions (segmented regressions) with a single change point in our interrupted time series study. The results of this project will provide evidence of the effectiveness of worksite cafeteria menu labeling, which could potentially inform policy intervention approaches. Labels that convey information in a more readily understandable manner may be more effective at motivating behavior change. Strengths of this study include its cohort design and its robust data capture methods using food photographs and accelerometry.
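    The single-change-point segmented regression named above estimates four quantities: baseline level, baseline trend, the level shift at the intervention, and the change in trend afterwards. The study fits generalized linear mixed models; the ordinary-least-squares skeleton below shows only the change-point design matrix, with synthetic data standing in for the cafeteria outcomes.

```python
import numpy as np

def its_design(t, change_point):
    """Design matrix for a single-change-point interrupted time series:
    columns are intercept, baseline trend, level shift at the change
    point, and post-change slope change."""
    t = np.asarray(t, dtype=float)
    post = (t >= change_point).astype(float)
    return np.column_stack([np.ones_like(t), t, post,
                            (t - change_point) * post])

def fit_its(y, t, change_point):
    """OLS fit of the segmented model; returns
    [intercept, pre-slope, level shift, slope change]."""
    X = its_design(t, change_point)
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta
```

On real data, the mixed-model machinery adds cafeteria- and individual-level random effects and appropriate error structure on top of this fixed-effect skeleton.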

  5. High-resolution delineation of chlorinated volatile organic compounds in a dipping, fractured mudstone: depth- and strata-dependent spatial variability from rock-core sampling

    Science.gov (United States)

    Goode, Daniel J.; Imbrigiotta, Thomas E.; Lacombe, Pierre J.

    2014-01-01

    dipping mudstones. Despite more than 18 years of pump and treat (P&T) remediation, and natural attenuation processes, CVOC concentrations in aqueous samples pumped from these deeper strata remain elevated in isolated intervals. DNAPL was detected in one borehole during coring at a depth of 27 m. In contrast to core samples from the weathered zone, concentrations in core samples from deeper unweathered and unfractured strata are typically below detection. However, high CVOC concentrations were found in isolated samples from fissile black carbon-rich strata and fractured gray laminated strata. Aqueous-phase concentrations were correspondingly high in samples pumped from these strata via short-interval wells or packer-isolated zones in long boreholes. A refined conceptual site model considers that prior to P&T remediation groundwater flow was primarily subhorizontal in the higher-permeability near surface strata, and the bulk of contaminant mass was shallow. CVOCs diffused into these fractured and weathered mudstones. DNAPL and high concentrations of CVOCs migrated slowly down in deeper unweathered strata, primarily along isolated dipping bedding-plane fractures. After P&T began in 1995, using wells open to both shallow and deep strata, downward transport of dissolved CVOCs accelerated. Diffusion of TCE and other CVOCs from deeper fractures penetrated only a few centimeters into the unweathered rock matrix, likely due to sorption of CVOCs on rock organic carbon. Remediation in the deep, unweathered strata may benefit from the relatively limited migration of CVOCs into the rock matrix. Synthesis of rock core sampling from closely spaced boreholes with geophysical logging and hydraulic testing improves understanding of the controls on CVOC delineation and informs remediation design and monitoring.
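    The few-centimetre matrix penetration reported above is consistent with a diffusion length scale retarded by sorption. A back-of-envelope sketch follows; the parameter values in the test are purely illustrative, not measured at this site.

```python
from math import sqrt

def penetration_depth_m(eff_diffusivity_m2_s, time_s, retardation=1.0):
    """Characteristic diffusion penetration depth, sqrt(4 D_e t / R).
    Sorption onto rock organic carbon (retardation R > 1) slows the
    advancing concentration front, as invoked in the text."""
    return sqrt(4.0 * eff_diffusivity_m2_s * time_s / retardation)
```

With an effective diffusivity of order 1e-10 m2/s, decades of contact, and strong sorption (R of tens to hundreds), the depth comes out at centimetres to a decimetre, matching the qualitative observation.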

  6. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...
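    The design criterion sketched here, efficient spatial prediction, is classically scored by the average kriging variance of candidate sampling designs. The snippet below is a non-Bayesian simple-kriging sketch with an assumed exponential covariance; the paper's Bayesian criterion additionally averages over parameter uncertainty, which this omits.

```python
import numpy as np

def exp_cov(d, sill=1.0, range_=1.0):
    """Exponential covariance model (an illustrative assumption)."""
    return sill * np.exp(-d / range_)

def avg_kriging_variance(sample_xy, pred_xy, sill=1.0, range_=1.0,
                         nugget=1e-9):
    """Simple-kriging prediction variance averaged over prediction
    locations: a classical spatial design score; smaller is better."""
    S = np.asarray(sample_xy, dtype=float)
    P = np.asarray(pred_xy, dtype=float)
    C = exp_cov(np.linalg.norm(S[:, None, :] - S[None, :, :], axis=2),
                sill, range_)
    C += nugget * np.eye(len(S))
    c = exp_cov(np.linalg.norm(P[:, None, :] - S[None, :, :], axis=2),
                sill, range_)
    w = c @ np.linalg.inv(C)
    var = sill - np.einsum('ij,ij->i', w, c)
    return float(var.mean())
```

Adding or deleting sampling locations, the retrospective and prospective design moves discussed above, is then evaluated by recomputing this score for each candidate design.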

  7. Design and Development of a Robot-Based Automation System for Cryogenic Crystal Sample Mounting at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Shu, D.; Preissner, C.; Nocher, D.; Han, Y.; Barraza, J.; Lee, P.; Lee, W.-K.; Cai, Z.; Ginell, S.; Alkire, R.; Lazarski, K.; Schuessler, R.; Joachimiak, A.

    2004-01-01

    X-ray crystallography is the primary method to determine the 3D structures of complex macromolecules at high resolution. In the years to come, the Advanced Photon Source (APS) and similar 3rd-generation synchrotron sources elsewhere will become the most powerful tools for studying atomic structures of biological molecules. One of the major bottlenecks in the x-ray data collection process is the constant need to change and realign the crystal sample. This is a very time- and manpower-consuming task. An automated sample mounting system will help to solve this bottleneck problem. We have developed a novel robot-based automation system for cryogenic crystal sample mounting at the APS. Design of the robot-based automation system, as well as its on-line test results at the Argonne Structural Biology Center (SBC) 19-BM experimental station, are presented in this paper

  8. Relative Efficiencies of a Three-Stage Versus a Two-Stage Sample Design For a New NLS Cohort Study. 22U-884-38.

    Science.gov (United States)

    Folsom, R. E.; Weber, J. H.

    Two sampling designs were compared for the planned 1978 national longitudinal survey of high school seniors with respect to statistical efficiency and cost. The 1972 survey used a stratified two-stage sample of high schools and seniors within schools. In order to minimize interviewer travel costs, an alternate sampling design was proposed,…
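    Comparisons of multistage designs like the one above are conventionally made through the design effect, the variance inflation of a clustered sample relative to simple random sampling. The sketch below uses the standard Kish approximation; it is generic textbook algebra, not necessarily the report's exact cost-efficiency model.

```python
def design_effect(avg_cluster_size, icc):
    """Kish approximation for cluster sampling:
    deff = 1 + (m - 1) * rho, where m is the average number of
    sampled units per cluster (e.g. seniors per school) and rho is
    the intraclass correlation."""
    return 1.0 + (avg_cluster_size - 1.0) * icc

def effective_sample_size(n, avg_cluster_size, icc):
    """Number of simple-random-sample observations giving the same
    precision as a clustered sample of size n."""
    return n / design_effect(avg_cluster_size, icc)
```

The trade-off the report weighs follows directly: taking more seniors per school (larger m) cuts interviewer travel costs but inflates deff, so statistical efficiency must be bought back with a larger total sample.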

  9. Assessment of habitat representation across a network of marine protected areas with implications for the spatial design of monitoring.

    Science.gov (United States)

    Young, Mary; Carr, Mark

    2015-01-01

    Networks of marine protected areas (MPAs) are being adopted globally to protect ecosystems and supplement fisheries management. The state of California recently implemented a coast-wide network of MPAs, a statewide seafloor mapping program, and ecological characterizations of species and ecosystems targeted for protection by the network. The main goals of this study were to use these data to evaluate how well seafloor features, as proxies for habitats, are represented and replicated across an MPA network and how well ecological surveys representatively sampled fish habitats inside MPAs and adjacent reference sites. Seafloor data were classified into broad substrate categories (rock and sediment) and finer scale geomorphic classifications standard to marine classification schemes using surface analyses (slope, ruggedness, etc.) done on the digital elevation model derived from multibeam bathymetry data. These classifications were then used to evaluate the representation and replication of seafloor structure within the MPAs and across the ecological surveys. Both the broad substrate categories and the finer scale geomorphic features were proportionately represented for many of the classes with deviations of 1-6% and 0-7%, respectively. Within MPAs, however, representation of seafloor features differed markedly from original estimates, with differences ranging up to 28%. Seafloor structure in the biological monitoring design had mismatches between sampling in the MPAs and their corresponding reference sites and some seafloor structure classes were missed entirely. The geomorphic variables derived from multibeam bathymetry data for these analyses are known determinants of the distribution and abundance of marine species and of coastal marine biodiversity. Thus, analyses like those performed in this study can be a valuable initial method of evaluating and predicting the conservation value of MPAs across a regional network.
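    The surface analyses named above (slope, ruggedness) can be derived directly from a gridded bathymetry model. The sketch below uses one common definition of the terrain ruggedness index and a purely illustrative rock/sediment threshold; GIS packages and the study's own calibration differ in detail.

```python
import numpy as np

def slope_deg(dem, cell_size=1.0):
    """Slope in degrees from a gridded elevation/bathymetry model,
    via finite-difference gradients."""
    dzdy, dzdx = np.gradient(dem.astype(float), cell_size)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def ruggedness(dem):
    """Terrain ruggedness index: mean absolute elevation difference
    between each cell and its eight neighbours (one common
    definition; edges use replicated padding)."""
    dem = dem.astype(float)
    p = np.pad(dem, 1, mode="edge")
    acc = np.zeros_like(dem)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            acc += np.abs(dem - p[1 + di:1 + di + dem.shape[0],
                                  1 + dj:1 + dj + dem.shape[1]])
    return acc / 8.0

def classify_substrate(ruggedness_map, rock_threshold=0.5):
    """Crude rock/sediment proxy: rugged cells classed as 'rock'.
    The threshold is illustrative, not the study's calibrated value."""
    return np.where(ruggedness_map > rock_threshold, "rock", "sediment")
```

Tabulating the resulting class areas inside each MPA against the region-wide totals gives the representation deviations (the 1-6% and up-to-28% figures) that the study reports.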

  10. Assessment of habitat representation across a network of marine protected areas with implications for the spatial design of monitoring.

    Directory of Open Access Journals (Sweden)

    Mary Young

    Full Text Available Networks of marine protected areas (MPAs are being adopted globally to protect ecosystems and supplement fisheries management. The state of California recently implemented a coast-wide network of MPAs, a statewide seafloor mapping program, and ecological characterizations of species and ecosystems targeted for protection by the network. The main goals of this study were to use these data to evaluate how well seafloor features, as proxies for habitats, are represented and replicated across an MPA network and how well ecological surveys representatively sampled fish habitats inside MPAs and adjacent reference sites. Seafloor data were classified into broad substrate categories (rock and sediment and finer scale geomorphic classifications standard to marine classification schemes using surface analyses (slope, ruggedness, etc. done on the digital elevation model derived from multibeam bathymetry data. These classifications were then used to evaluate the representation and replication of seafloor structure within the MPAs and across the ecological surveys. Both the broad substrate categories and the finer scale geomorphic features were proportionately represented for many of the classes with deviations of 1-6% and 0-7%, respectively. Within MPAs, however, representation of seafloor features differed markedly from original estimates, with differences ranging up to 28%. Seafloor structure in the biological monitoring design had mismatches between sampling