WorldWideScience

Sample records for fragment size statistical

  1. Fragment size distribution statistics in dynamic fragmentation of laser shock-loaded tin

    Directory of Open Access Journals (Sweden)

    Weihua He

    2017-06-01

Full Text Available This work investigates a geometric statistics method to characterize the size distribution of tin fragments produced in laser shock-loaded dynamic fragmentation. In the shock experiments, the ejecta from a tin sample with a V-shaped groove etched into its free surface are collected by a soft recovery technique. The produced fragments are then automatically detected with fine post-shot analysis techniques, including X-ray micro-tomography and an improved watershed method. To characterize the size distributions of the fragments, a theoretical random geometric statistics model based on Poisson mixtures is derived for the dynamic heterogeneous fragmentation problem; it yields a linear combination of exponential distributions. The experimental fragment size distributions of the laser shock-loaded tin sample are examined with the proposed theoretical model, and its fitting performance is compared with that of other state-of-the-art fragment size distribution models. The comparison shows that the proposed model provides a markedly more reasonable fit for the laser shock-loaded tin.
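
    A linear combination of exponentials, as described above, can be fitted to measured fragment sizes in a few lines. The sketch below is a minimal illustration on synthetic data with a two-component mixture; the component count, size scales and the choice of fitting the empirical complementary CDF are assumptions for the example, not details taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Synthetic "fragment sizes" drawn from a two-component exponential mixture.
sizes = np.concatenate([rng.exponential(5.0, 4000),    # fine fragments
                        rng.exponential(40.0, 1000)])   # coarse fragments

def mix_ccdf(s, w, lam1, lam2):
    """Complementary CDF of a two-component exponential mixture."""
    return w * np.exp(-s / lam1) + (1.0 - w) * np.exp(-s / lam2)

# Empirical complementary CDF.
s_sorted = np.sort(sizes)
ccdf = 1.0 - np.arange(1, len(s_sorted) + 1) / len(s_sorted)

popt, _ = curve_fit(mix_ccdf, s_sorted, ccdf, p0=[0.5, 1.0, 10.0],
                    bounds=([0, 1e-3, 1e-3], [1, np.inf, np.inf]))
w, lam1, lam2 = popt
print(f"weight={w:.2f}, mean sizes: {lam1:.1f}, {lam2:.1f}")
```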

  2. Coagulation-Fragmentation Model for Animal Group-Size Statistics

    Science.gov (United States)

    Degond, Pierre; Liu, Jian-Guo; Pego, Robert L.

    2017-04-01

    We study coagulation-fragmentation equations inspired by a simple model proposed in fisheries science to explain data for the size distribution of schools of pelagic fish. Although the equations lack detailed balance and admit no H-theorem, we are able to develop a rather complete description of equilibrium profiles and large-time behavior, based on recent developments in complex function theory for Bernstein and Pick functions. In the large-population continuum limit, a scaling-invariant regime is reached in which all equilibria are determined by a single scaling profile. This universal profile exhibits power-law behavior crossing over from exponent -2/3 for small size to -3/2 for large size, with an exponential cutoff.
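
    The universal profile described above crosses over from a -2/3 power law at small sizes to a -3/2 power law with an exponential cutoff at large sizes. The sketch below evaluates one hedged interpolation with those limiting behaviours; the specific crossover form and the scales x_c and x_cut are illustrative assumptions, not the exact profile derived in the paper.

```python
import numpy as np

def group_size_profile(x, x_c=1.0, x_cut=50.0):
    """Illustrative crossover: ~x^(-2/3) for x << x_c, ~x^(-3/2) * exp(-x/x_cut) for x >> x_c."""
    return x**(-2.0 / 3.0) * (1.0 + x / x_c)**(-5.0 / 6.0) * np.exp(-x / x_cut)

x = np.logspace(-2, 3, 400)
f = group_size_profile(x)

# The local log-log slope should move from about -2/3 to about -3/2 before the cutoff.
slope = np.gradient(np.log(f), np.log(x))
print("slope at small x:", slope[0])
print("slope near x = 5:", slope[np.searchsorted(x, 5.0)])
```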

  3. Coagulation-Fragmentation Model for Animal Group-Size Statistics

    Science.gov (United States)

    Degond, Pierre; Liu, Jian-Guo; Pego, Robert L.

    2016-10-01

    We study coagulation-fragmentation equations inspired by a simple model proposed in fisheries science to explain data for the size distribution of schools of pelagic fish. Although the equations lack detailed balance and admit no H-theorem, we are able to develop a rather complete description of equilibrium profiles and large-time behavior, based on recent developments in complex function theory for Bernstein and Pick functions. In the large-population continuum limit, a scaling-invariant regime is reached in which all equilibria are determined by a single scaling profile. This universal profile exhibits power-law behavior crossing over from exponent -2/3 for small size to -3/2 for large size, with an exponential cutoff.

  4. Tracing a phase transition with fluctuations of the largest fragment size: Statistical multifragmentation models and the ALADIN S254 data

    CERN Document Server

    Pietrzak, T; Aumann, T; Bacri, C O; Barczyk, T; Bassini, R; Bianchin, S; Boiano, C; Botvina, A S; Boudard, A; Brzychczyk, J; Chbihi, A; Cibor, J; Czech, B; De Napoli, M; Ducret, J -E; Emling, H; Frankland, J D; Hellstrom, M; Henzlova, D; Imme, G; Iori, I; Johansson, H; Kezzar, K; Lafriakh, A; Le Fèvre, A; Gentil, E Le; Leifels, Y; Luhning, J; Lukasik, J; Lynch, W G; Lynen, U; Majka, Z; Mocko, M; Muller, W F J; Mykulyak, A; Orth, H; Otte, A N; Palit, R; Pawlowski, P; Pullia, A; Raciti, G; Rapisarda, E; Sann, H; Schwarz, C; Sfienti, C; Simon, H; Summerer, K; Trautmann, W; Tsang, M B; Verde, G; Volant, C; Wallace, M; Weick, H; Wiechula, J; Wieloch, A; Zwieglinski, B

    2010-01-01

    A phase transition signature associated with cumulants of the largest fragment size distribution has been identified in statistical multifragmentation models and examined in analysis of the ALADIN S254 data on fragmentation of neutron-poor and neutron-rich projectiles. Characteristics of the transition point indicated by this signature are weakly dependent on the A/Z ratio of the fragmenting spectator source. In particular, chemical freeze-out temperatures are estimated within the range 5.9 to 6.5 MeV. The experimental results are well reproduced by the SMM model.
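
    The signature discussed above is built from cumulants of the event-by-event distribution of the largest fragment size. A minimal sketch of estimating such cumulants from event samples is given below; the toy event generator is a placeholder and is neither the SMM model nor the ALADIN analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def largest_fragment(z_source=70, n_events=20000, p_break=0.3):
    """Toy events: split the source charge into pieces and keep the largest (illustrative only)."""
    z_max = []
    for _ in range(n_events):
        remaining, pieces = z_source, []
        while remaining > 0:
            z = 1 + rng.binomial(remaining - 1, 1.0 - p_break)
            pieces.append(z)
            remaining -= z
        z_max.append(max(pieces))
    return np.asarray(z_max, dtype=float)

zmax = largest_fragment()
# k-statistics: unbiased estimators of the first four cumulants kappa_1..kappa_4.
kappa = [stats.kstat(zmax, n) for n in (1, 2, 3, 4)]
print("kappa1..4:", kappa)
# One example of a normalized cumulant ratio that could serve as a transition indicator.
print("kappa2 / kappa1^2 =", kappa[1] / kappa[0]**2)
```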

  5. Statistical study of auroral fragmentation into patches

    Science.gov (United States)

    Hashimoto, Ayumi; Shiokawa, Kazuo; Otsuka, Yuichi; Oyama, Shin-ichiro; Nozawa, Satonori; Hori, Tomoaki; Lester, Mark; Johnsen, Magnar Gullikstad

    2015-08-01

    The study of auroral dynamics is important when considering disturbances of the magnetosphere. Shiokawa et al. (2010, 2014) reported observations of finger-like auroral structures that cause auroral fragmentation. Those structures are probably produced by macroscopic instabilities in the magnetosphere, mainly of the Rayleigh-Taylor type. However, the statistical characteristics of these structures have not yet been investigated. Here based on observations by an all-sky imager at Tromsø (magnetic latitude = 67.1°N), Norway, over three winter seasons, we statistically analyzed the occurrence conditions of 14 large-scale finger-like structures that developed from large-scale auroral regions including arcs and 6 small-scale finger-like structures that developed in auroral patches. The large-scale structures were seen from midnight to dawn local time and usually appeared at the beginning of the substorm recovery phase, near the low-latitude boundary of the auroral region. The small-scale structures were primarily seen at dawn and mainly occurred in the late recovery phase of substorms. The sizes of these large- and small-scale structures mapped in the magnetospheric equatorial plane are usually larger than the gyroradius of 10 keV protons, indicating that the finger-like structures could be caused by magnetohydrodynamic instabilities. However, the scale of small structures is only twice the gyroradius of 10 keV protons, suggesting that finite Larmor radius effects may contribute to the formation of small-scale structures. The eastward propagation velocities of the structures are -40 to +200 m/s and are comparable with those of plasma drift velocities measured by the colocating Super Dual Auroral Radar Network radar.

  6. Three-dimensional Statistical Jet Fragmentation

    CERN Document Server

    Urmossy, Karoly

    2016-01-01

We reproduce the distribution of the longitudinal and transverse components of momenta of charged hadrons stemming from jets created in proton-proton collisions at $\sqrt{s}$ = 7 TeV by a statistical fragmentation model. Our hadronisation model is based on microcanonical statistics and negative binomial multiplicity fluctuations. We describe the scale dependence of the fit parameters of the model with formulas obtained by approximating the exact solution of the DGLAP equation in the $\phi^3$ theory with leading order splitting function and 1-loop coupling.

  7. Fragment Identification and Statistics Method of Hypervelocity Impact SPH Simulation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiaotian; JIA Guanghui; HUANG Hai

    2011-01-01

A comprehensive treatment of fragment identification and statistics for the smoothed particle hydrodynamics (SPH) simulation of hypervelocity impact is presented. The computation is performed with the SPH method combined with the finite element method (FEM). The fragments are identified by a new pre- and post-processing algorithm and then converted into a binary graph. The number of fragments and the SPH particles attached to each are determined by counting the connected domains on the binary graph. The size, velocity vector and mass of each fragment are calculated by summation and weighted averaging over the particles. The dependence of this method on finite element edge length and simulation terminal time is discussed. An example of tungsten rods impacting steel plates is given for calibration. The computation results match experiments well and demonstrate the effectiveness of this method.
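
    The identification step described above amounts to grouping SPH particles into connected clusters and counting them. Below is a minimal sketch using a linking-length criterion with a k-d tree and union-find; the linking length and particle positions are illustrative, and the paper's binary-graph bookkeeping may differ in detail.

```python
import numpy as np
from scipy.spatial import cKDTree

def identify_fragments(positions, linking_length):
    """Group particles whose separation is below linking_length; return a label per particle."""
    n = len(positions)
    parent = np.arange(n)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i, j in cKDTree(positions).query_pairs(r=linking_length):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    labels = np.array([find(i) for i in range(n)])
    _, labels = np.unique(labels, return_inverse=True)
    return labels

rng = np.random.default_rng(2)
pos = rng.uniform(0, 10, size=(500, 3))        # stand-in for post-impact SPH particle positions
labels = identify_fragments(pos, linking_length=0.8)
sizes = np.bincount(labels)                     # particles per fragment
print("fragments:", len(sizes), "largest fragment (particles):", sizes.max())
```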

  8. Size Effects in Heavy Ions Fragmentation

    CERN Document Server

    Barrañon, A; Dorso, C O

    2003-01-01

Rise-plateau caloric curves for different heavy-ion collisions have been obtained, in the range of experimental observations. The limiting temperature decreases as the residual size increases, in agreement with recent theoretical analyses of experimental results reported by other collaborations. In addition, the influence of promptly emitted particles on the temperature plateau is shown. The LATINO semiclassical binary-interaction model is used to reproduce the inter-nucleonic forces via the Pandharipande potential, and fragments are detected with an Early Cluster Recognition Algorithm.

  9. Fragmentation and exfoliation of 2-dimensional materials: a statistical approach

    Science.gov (United States)

    Kouroupis-Agalou, Konstantinos; Liscio, Andrea; Treossi, Emanuele; Ortolani, Luca; Morandi, Vittorio; Pugno, Nicola Maria; Palermo, Vincenzo

    2014-05-01

The main advantage for applications of graphene and related 2D materials is that they can be produced on large scales by liquid phase exfoliation. The exfoliation process can be regarded as a particular fragmentation process, in which the 2D character of the exfoliated objects significantly influences the fragmentation dynamics compared with standard materials. Here, we used automated image processing of Atomic Force Microscopy (AFM) data to measure, one by one, the exact shape and size of thousands of nanosheets obtained by exfoliation of an important 2D material, boron nitride, and used different statistical functions to model the asymmetric distribution of nanosheet sizes typically obtained. Because the resolution of AFM is much better than the average sheet size, the analysis could be performed directly at the nanoscale and at the single-sheet level. We find that the size distribution of the sheets at a given time follows a log-normal distribution, indicating that the exfoliation process has a ``typical'' scale length that changes with time and that exfoliation proceeds through the formation of a distribution of random cracks that follow Poisson statistics. The validity of this model implies that the size distribution does not depend on the different preparation methods used, but is a common feature in the exfoliation of this material and thus probably for other 2D materials.
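
    Fitting and testing a log-normal size distribution, as reported above, is straightforward; the sketch below does so on synthetic sheet areas standing in for the AFM measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
areas = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=2000)  # sheet areas, e.g. in um^2

shape, loc, scale = stats.lognorm.fit(areas, floc=0)   # fix loc=0 for a pure log-normal
print("log-std sigma =", shape, " median area =", scale)

# Kolmogorov-Smirnov check of the fitted distribution.
print(stats.kstest(areas, 'lognorm', args=(shape, loc, scale)))
```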

  10. Statistical ensembles and fragmentation of finite nuclei

    Science.gov (United States)

    Das, P.; Mallik, S.; Chaudhuri, G.

    2017-09-01

    Statistical models based on different ensembles are very commonly used to describe the nuclear multifragmentation reaction in heavy ion collisions at intermediate energies. Canonical model results are more appropriate for finite nuclei calculations while those obtained from the grand canonical ones are more easily calculable. A transformation relation has been worked out for converting results of finite nuclei from grand canonical to canonical and vice versa. The formula shows that, irrespective of the particle number fluctuation in the grand canonical ensemble, exact canonical results can be recovered for observables varying linearly or quadratically with the number of particles. This result is of great significance since the baryon and charge conservation constraints can make the exact canonical calculations extremely difficult in general. This concept developed in this work can be extended in future for transformation to ensembles where analytical solutions do not exist. The applicability of certain equations (isoscaling, etc.) in the regime of finite nuclei can also be tested using this transformation relation.

  11. Statistical universalities in fragmentation under scaling symmetry with a constant frequency of fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Gorokhovski, M A [Laboratoire de Mecanique des Fluides et d' Acoustique, CNRS - Ecole Centrale de Lyon - INSA Lyon - Universite Claude Bernard Lyon 1, 36 Avenue Guy de Collongue, 69134 Ecully Cedex (France); Saveliev, V L [Institut of Ionosphere, Kamenskoe Plato, 050020 Almaty (Kazakhstan)], E-mail: mikhael.gorokhovski@ec-lyon.fr, E-mail: saveliev@topmail.kz

    2008-04-21

    This paper analyses statistical universalities that arise over time during constant frequency fragmentation under scaling symmetry. The explicit expression of particle-size distribution obtained from the evolution kinetic equation shows that, with increasing time, the initial distribution tends to the ultimate steady-state delta function through at least two intermediate universal asymptotics. The earlier asymptotic is the well-known log-normal distribution of Kolmogorov (1941 Dokl. Akad. Nauk. SSSR 31 99-101). This distribution is the first universality and has two parameters: the first and the second logarithmic moments of the fragmentation intensity spectrum. The later asymptotic is a power function (stronger universality) with a single parameter that is given by the ratio of the first two logarithmic moments. At large times, the first universality implies that the evolution equation can be reduced exactly to the Fokker-Planck equation instead of making the widely used but inconsistent assumption about the smallness of higher than second order moments. At even larger times, the second universality shows evolution towards a fractal state with dimension identified as a measure of the fracture resistance of the medium.
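
    The first universality mentioned above is the Kolmogorov log-normal regime: under scaling symmetry, repeated independent multiplicative reductions make the logarithm of particle size approximately normal. The sketch below illustrates this numerically; the breakup-ratio distribution is an arbitrary choice for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

n_particles, n_steps = 20000, 40
size = np.ones(n_particles)
for _ in range(n_steps):
    # each breakup step multiplies the size by a random ratio in (0, 1)
    size *= rng.uniform(0.3, 0.9, n_particles)

log_size = np.log(size)
# After many scaling-symmetric steps, log(size) is a sum of i.i.d. terms,
# so the size distribution should be close to log-normal.
print("skewness of log-size :", stats.skew(log_size))
print("excess kurtosis      :", stats.kurtosis(log_size))
```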

  12. Statistical mechanics of fragmentation processes of ice and rock bodies

    Science.gov (United States)

    Bashkirov, A. G.; Vityazev, A. V.

    1996-09-01

It is a well-known experimental fact that impact fragmentation, specifically of ice and rock bodies, causes a two-step ("knee"-shaped) power distribution of fragment masses with exponent values within the limits -4 and -1.5 (here and henceforth the differential distribution is borne in mind). A new theoretical approach is proposed to determine the exponent values, a minimal fracture mass, and properties of the knee. As a basis for the construction of non-equilibrium statistical mechanics of condensed matter fragmentation, the maximum-entropy variational principle is used. In contrast to the usual approach founded on the Boltzmann entropy, the more general Tsallis entropy is invoked, which allows stationary solutions not only of the exponential Boltzmann-Gibbs form but also of the power-law (fractal) form. Relying on the analysis of many published experiments, a parameter β is introduced to describe an inhomogeneous distribution of the impact energy over the target. It varies from 0 (for an utterly inhomogeneous distribution of the impact energy) to 1 (for a homogeneous distribution). The lower limit of fragment masses is defined as a characteristic fragment mass for which the energy of fragment formation is minimal. This mass value depends crucially on the value of β. It is shown that for β≪1 only small fragments can be formed, and the maximal permitted fragment (of mass m1) is the upper boundary of the first stage of the fracture process and the point where the knee takes place. The second stage may be realized after a homogeneous redistribution of the remainder of the impact energy over the remainder of the target (when β→1). Here, only the formation of large fragments is permitted, and the smallest of them (of mass m2) determines a lower boundary of the second stage. Different forms of the knee can be observed depending on relations between m1 and m2.

  13. Imaging Systems for Size Measurements of Debrisat Fragments

    Science.gov (United States)

    Shiotani, B.; Scruggs, T.; Toledo, R.; Fitz-Coy, N.; Liou, J. C.; Sorge, M.; Huynh, T.; Opiela, J.; Krisko, P.; Cowardin, H.

    2017-01-01

    The overall objective of the DebriSat project is to provide data to update existing standard spacecraft breakup models. One of the key sets of parameters used in these models is the physical dimensions of the fragments (i.e., length, average-cross sectional area, and volume). For the DebriSat project, only fragments with at least one dimension greater than 2 mm are collected and processed. Additionally, a significant portion of the fragments recovered from the impact test are needle-like and/or flat plate-like fragments where their heights are almost negligible in comparison to their other dimensions. As a result, two fragment size categories were defined: 2D objects and 3D objects. While measurement systems are commercially available, factors such as measurement rates, system adaptability, size characterization limitations and equipment costs presented significant challenges to the project and a decision was made to develop our own size characterization systems. The size characterization systems consist of two automated image systems, one referred to as the 3D imaging system and the other as the 2D imaging system. Which imaging system to use depends on the classification of the fragment being measured. Both imaging systems utilize point-and-shoot cameras for object image acquisition and create representative point clouds of the fragments. The 3D imaging system utilizes a space-carving algorithm to generate a 3D point cloud, while the 2D imaging system utilizes an edge detection algorithm to generate a 2D point cloud. From the point clouds, the three largest orthogonal dimensions are determined using a convex hull algorithm. For 3D objects, in addition to the three largest orthogonal dimensions, the volume is computed via an alpha-shape algorithm applied to the point clouds. The average cross-sectional area is also computed for 3D objects. Both imaging systems have automated size measurements (image acquisition and image processing) driven by the need to quickly
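
    The dimension and volume measurements described above reduce to geometric operations on a point cloud. The sketch below shows one simplified way to obtain a characteristic length, a hull volume and a mean projected area with a convex hull; the orthogonal-dimension procedure here is an approximation for illustration, not the DebriSat measurement definition.

```python
import numpy as np
from scipy.spatial import ConvexHull
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(5)
cloud = rng.normal(size=(400, 3)) * np.array([5.0, 2.0, 0.5])  # stand-in fragment point cloud (mm)

hull = ConvexHull(cloud)
verts = cloud[hull.vertices]

# Characteristic length: the largest distance between any two hull vertices.
d = squareform(pdist(verts))
i, j = np.unravel_index(np.argmax(d), d.shape)
length = d[i, j]

# Second dimension (simplified): largest extent of the points projected onto the
# plane perpendicular to the first axis.
axis1 = (verts[j] - verts[i]) / length
perp = verts - np.outer(verts @ axis1, axis1)
width = squareform(pdist(perp)).max()

print(f"length ~ {length:.1f} mm, width ~ {width:.1f} mm")
# Cauchy's formula: mean projected area of a convex body = surface area / 4.
print(f"hull volume ~ {hull.volume:.1f} mm^3, mean projected area ~ {hull.area / 4:.1f} mm^2")
```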

  14. Cut Size Statistics of Graph Bisection Heuristics

    OpenAIRE

    Schreiber, G. R.; Martin, O. C.

    1998-01-01

We investigate the statistical properties of cut sizes generated by heuristic algorithms which approximately solve the graph bisection problem. On an ensemble of sparse random graphs, we find empirically that the distribution of the cut sizes found by ``local'' algorithms becomes peaked as the number of vertices in the graphs becomes large. Evidence is given that this distribution tends towards a Gaussian whose mean and variance scale linearly with the number of vertices of the graphs. Given...
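
    A minimal way to collect such cut-size statistics is to run a simple local heuristic on many sparse random graphs and record the resulting cuts, as sketched below; the swap heuristic and graph parameters are illustrative and not those used in the paper.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(6)

def cut_size(edges, side):
    return sum(1 for u, v in edges if side[u] != side[v])

def local_bisection(g):
    """Random balanced start, then greedy pair swaps until no swap lowers the cut."""
    n = g.number_of_nodes()
    side = np.zeros(n, dtype=bool)
    side[rng.choice(n, n // 2, replace=False)] = True
    edges = list(g.edges())
    improved = True
    while improved:
        improved = False
        for u in range(n):
            for v in range(n):
                if side[u] != side[v]:
                    before = cut_size(edges, side)
                    side[u], side[v] = side[v], side[u]
                    if cut_size(edges, side) < before:
                        improved = True
                    else:
                        side[u], side[v] = side[v], side[u]   # undo
    return cut_size(edges, side)

cuts = [local_bisection(nx.gnp_random_graph(40, 4 / 40, seed=int(s)))
        for s in rng.integers(0, 1 << 31, size=20)]
print("mean cut:", np.mean(cuts), " variance:", np.var(cuts))
```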

  15. Basic Statistical Concepts for Sample Size Estimation

    Directory of Open Access Journals (Sweden)

    Vithal K Dhulkhed

    2008-01-01

Full Text Available For grant proposals the investigator has to include an estimation of sample size. The size of the sample should be adequate so that there are sufficient data to reliably answer the research question being addressed by the study. At the very planning stage of the study, the investigator has to involve the statistician. To have a meaningful dialogue with the statistician, every research worker should be familiar with the basic concepts of statistics. This paper is concerned with simple principles of sample size calculation. Concepts are explained based on logic rather than rigorous mathematical calculations to help the reader assimilate the fundamentals.
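
    The standard calculation the abstract refers to, for comparing two means, is short enough to sketch directly; the effect size, standard deviation, alpha and power used below are illustrative values only.

```python
import math
from scipy.stats import norm

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided, two-sample comparison of means."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return 2 * ((z_alpha + z_beta) * sd / delta) ** 2

# Example: detect a 5 mmHg difference with a standard deviation of 12 mmHg.
print(math.ceil(n_per_group(delta=5, sd=12)))   # roughly 91 per group
```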

  16. Evaluation of eruptive energy of a pyroclastic deposit applying fractal geometry to fragment size distributions

    Science.gov (United States)

    Paredes Marino, Joali; Morgavi, Daniele; Di Vito, Mauro; de Vita, Sandro; Sansivero, Fabio; Perugini, Diego

    2016-04-01

Fractal fragmentation theory has been applied to characterize the particle size distribution of pyroclastic deposits generated by volcanic explosions. Recent works have demonstrated that the fractal dimension of grain size distributions can be used as a proxy for estimating the energy associated with volcanic eruptions. In this work we seek to establish a preliminary analytical protocol that can be applied to better characterize volcanic fall deposits and derive the potential energy for fragmentation that was stored in the magma prior to and during an explosive eruption. The methodology is based on two different techniques for determining the grain-size distribution of the pyroclastic samples: 1) dry manual sieving (particles larger than 297 μm), and 2) automatic grain size analysis via a CamSizer-P4® device; the latter measures the distribution of projected area, yielding a cumulative distribution based on volume fraction for particles up to 30 mm. Size distribution data have been analyzed by applying fractal fragmentation theory and estimating the value of Df, i.e., the fractal dimension of fragmentation. In order to test our protocol we studied the Cretaio eruption, Ischia Island, Italy. Results indicate that size distributions of pyroclastic fall deposits follow a fractal law, indicating that the fragmentation process of these deposits reflects a scale-invariant fragmentation mechanism. Matching the results from the manual and automated techniques allows us to obtain a value of the "fragmentation energy" of the explosive eruptive events that generated the Cretaio deposits. We highlight the importance of these results, based on fractal statistics, as an additional volcanological tool for addressing volcanic risk based on the analysis of grain size distributions of natural pyroclastic deposits. Keywords: eruptive energy, fractal dimension of fragmentation, pyroclastic fallout.
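
    The fractal dimension of fragmentation, Df, is commonly estimated from the slope of the cumulative number of particles larger than a given size on a log-log plot, N(>r) ∝ r^(-D). A minimal sketch of that fit is given below on synthetic sizes standing in for the sieve/CamSizer data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic particle sizes following a power-law (Pareto) number distribution, in mm.
D_true = 2.6
sizes = 0.3 * (1 + rng.pareto(D_true, 5000))

# Cumulative number of particles with size greater than r.
r = np.logspace(np.log10(0.3), np.log10(sizes.max()), 20)
N = np.array([(sizes > ri).sum() for ri in r])

mask = N > 10                          # avoid the poorly sampled tail
slope, _ = np.polyfit(np.log10(r[mask]), np.log10(N[mask]), 1)
print("estimated fractal dimension D =", -slope)
```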

  17. Source size scaling of fragment production in projectile breakup

    CERN Document Server

    Beaulieu, L; Fox, D; Das-Gupta, S; Pan, J; Ball, G C; Djerroud, B; Doré, D; Galindo-Uribarri, A; Guinet, D; Hagberg, E; Horn, D; Laforest, R; Larochelle, Y; Lautesse, P; Samri, M; Roy, R; Saint-Pierre, C

    1996-01-01

Fragment production has been studied as a function of the source mass and excitation energy in peripheral collisions of $^{35}$Cl+$^{197}$Au at 43 MeV/nucleon and $^{70}$Ge+$^{nat}$Ti at 35 MeV/nucleon. The results are compared to the Au+Au data at 600 MeV/nucleon obtained by the ALADIN collaboration. A mass scaling, for $A_{source} \sim 35$ to 190, strongly correlated with excitation energy per nucleon, is presented, suggesting a thermal fragment production mechanism. Comparisons to a standard sequential decay model and the lattice-gas model are made. Fragment emission from a hot, rotating source is unable to reproduce the experimental source size scaling.

  18. The fractal nature of fragment size distributions of pyroclastic fall deposits from Cretaio eruption, Ischia Island (Italy)

    Science.gov (United States)

    Paredes Marino, Joali; Morgavi, Daniele; Di Vito, Mauro; de Vita, Sandro; Sansivero, Fabio; Perugini, Diego

    2016-04-01

The principles of fractal theory have had a strong influence on the understanding of many geological processes. Combining laboratory experiments on natural deposits generated by explosive volcanic eruptions with statistical fractal analysis allows us to characterize pyroclastic deposits precisely and opens the possibility for substantial advances in the quantification of fragmentation processes during explosive volcanic events. A set of samples from the Cretaio eruption (1.86 ka B.P.) was analyzed using fractal geometry to characterize the particle size distribution (PSD) of pyroclastic fragments erupted during its fallout phase. PSD analyses were performed on ten samples corresponding to ten different explosive episodes during the eruption. Samples were divided into juvenile (JV) and lithic (LC) fractions, and each fraction was analyzed separately. The results for the investigated size range (3 mm to 300 μm) showed that the fragmentation process is well characterized by a fractal distribution exhibiting multi-fractal behavior, explained by different and sequential processes of fragmentation. The frequency-size distributions of the JV and LC fractions exhibit opposite behavior: for the JV fraction, smaller particles (<1 mm) show a higher dimension of fragmentation relative to the bigger particles, a feature that can be related to a secondary process of fragmentation; the opposite behavior is observed for the LC fraction (the smallest dimensions of fragmentation correspond to the smaller particle sizes). These differences can be explained by the different rheology of the fragmented materials and/or the occurrence of different fragmentation processes. These results highlight the importance of fractal statistics as a tool for addressing volcanic risk based on the analysis of natural grain size distributions and allow discriminating between different fragmentation processes occurring inside the conduit during volcanic explosions. Keywords: volcanic fragmentation; juvenile

  19. Fragment size does not matter when you are well connected: effects of fragmentation on fitness of coexisting gypsophiles.

    Science.gov (United States)

    Matesanz, S; Gómez-Fernández, A; Alcocer, I; Escudero, A

    2015-09-01

    Most habitat fragmentation studies have focused on the effects of population size on reproductive success of single species, but studies assessing the effects of both fragment size and connectivity, and their interaction, on several coexisting species are rare. In this study, we selected 20 fragments along two continuous gradients of size and degree of isolation in a gypsum landscape in central Spain. In each fragment, we selected 15 individuals of each of three dominant gypsophiles (Centaurea hyssopifolia, Lepidium subulatum and Helianthemum squamatum, 300 plants per species, 900 plants in total) and measured several reproductive traits: inflorescence number, fruit set, seed set and seed mass. We hypothesised that plant fitness would be lower on small and isolated fragments due to an interaction between fragment size and connectivity, and that response patterns would be species-specific. Overall, fragment size had very little effect on reproductive traits compared to that of connectivity. We observed a positive effect of fragment connectivity on C. hyssopifolia fitness, mediated by the increased seed predation in plants from isolated fragments, resulting in fewer viable seeds per capitulum and lower seed set. Furthermore, seed mass was lower in plants from isolated fragments for both C. hyssopifolia and L. subulatum. In contrast, few reproductive traits of H. squamatum were affected by habitat fragmentation. We discuss the implications of species-specific responses to habitat fragmentation for the dynamics and conservation of gypsum plant communities. Our results highlight the complex interplay among plants and their mutualistic and antagonistic visitors, and reinforce the often-neglected role of habitat connectivity as a key component of the fragmentation process.

  20. Fragment and particle size distribution of impacted ceramic tiles

    NARCIS (Netherlands)

    Carton, E.P.; Weerheijm, J.; Ditzhuijzen, C.; Tuinman, I.

    2014-01-01

The fragmentation of ceramic tiles under ballistic impact has been studied. Fragments and aerosol (respirable) particles were collected and analyzed to determine the total surface area generated by fracturing (macro-cracking and comminution) of armor grade ceramics. The larger fragments were collected

  1. Statistical methods for detecting periodic fragments in DNA sequence data

    Directory of Open Access Journals (Sweden)

    Ying Hua

    2011-04-01

    Full Text Available Abstract Background Period 10 dinucleotides are structurally and functionally validated factors that influence the ability of DNA to form nucleosomes, histone core octamers. Robust identification of periodic signals in DNA sequences is therefore required to understand nucleosome organisation in genomes. While various techniques for identifying periodic components in genomic sequences have been proposed or adopted, the requirements for such techniques have not been considered in detail and confirmatory testing for a priori specified periods has not been developed. Results We compared the estimation accuracy and suitability for confirmatory testing of autocorrelation, discrete Fourier transform (DFT, integer period discrete Fourier transform (IPDFT and a previously proposed Hybrid measure. A number of different statistical significance procedures were evaluated but a blockwise bootstrap proved superior. When applied to synthetic data whose period-10 signal had been eroded, or for which the signal was approximately period-10, the Hybrid technique exhibited superior properties during exploratory period estimation. In contrast, confirmatory testing using the blockwise bootstrap procedure identified IPDFT as having the greatest statistical power. These properties were validated on yeast sequences defined from a ChIP-chip study where the Hybrid metric confirmed the expected dominance of period-10 in nucleosome associated DNA but IPDFT identified more significant occurrences of period-10. Application to the whole genomes of yeast and mouse identified ~ 21% and ~ 19% respectively of these genomes as spanned by period-10 nucleosome positioning sequences (NPS. Conclusions For estimating the dominant period, we find the Hybrid period estimation method empirically to be the most effective for both eroded and approximate periodicity. The blockwise bootstrap was found to be effective as a significance measure, performing particularly well in the problem of
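
    A minimal sketch of the DFT-style confirmatory measure discussed above, scoring the strength of an a priori period (here 10 bp) in a dinucleotide indicator sequence, is given below; the indicator choice, synthetic sequence and enrichment are illustrative, and the IPDFT variant and blockwise bootstrap are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(8)

# Binary indicator of AA/TT dinucleotide starts along a synthetic 300 bp sequence,
# with an artificial period-10 enrichment added on top of background noise.
n = 300
signal = (rng.random(n) < 0.05).astype(float)
idx = np.arange(0, n, 10)
signal[idx] += (rng.random(idx.size) < 0.6)

def period_power(x, period):
    """Power of the discrete Fourier component closest to a given period."""
    x = x - x.mean()
    f = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0)
    k = np.argmin(np.abs(freqs - 1.0 / period))
    return (np.abs(f[k]) ** 2) / len(x)

print("power at period 10:", period_power(signal, 10))
print("power at period 7 :", period_power(signal, 7))
```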

  2. Estimation of the initial shape of meteoroids based on statistical distributions of fragment masses

    Science.gov (United States)

    Vinnikov, V. V.; Gritsevich, M. I.; Kuznetsova, D. V.; Turchak, L. I.

    2016-06-01

    An approach to the estimation of the initial shape of a meteoroid based on the statistical distributions of masses of its recovered fragments is presented. The fragment distribution function is used to determine the corresponding scaling index of the power law with exponential cutoff. The scaling index is related empirically to the shape parameter of a fragmenting body by a quadratic equation, and the shape parameter is expressed through the proportions of the initial object. This technique is used to study a representative set of fragments of the Bassikounou meteorite and compare the obtained data with the results of statistical analysis of other meteorites.
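
    A minimal sketch of extracting the scaling index of a power law with exponential cutoff from recovered fragment masses, in the spirit of the abstract, is given below; the fitted form, synthetic masses and starting values are illustrative, and the empirical quadratic relation to the shape parameter is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)

# Synthetic fragment masses (grams): power-law body with an exponential cutoff imposed by thinning.
m = 1.0 * (1 + rng.pareto(0.8, 20000))
m = m[rng.random(m.size) < np.exp(-m / 300.0)]

def log_ccdf(m, beta, m_c, a):
    # complementary cumulative distribution ~ exp(a) * m^(-beta) * exp(-m / m_c), in log space
    return a - beta * np.log(m) - m / m_c

m_sorted = np.sort(m)
ccdf = 1.0 - np.arange(1, m_sorted.size + 1) / (m_sorted.size + 1)

popt, _ = curve_fit(log_ccdf, m_sorted, np.log(ccdf), p0=[1.0, 100.0, 0.0])
print("scaling index beta =", popt[0], " cutoff mass =", popt[1])
```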

  3. A dearth of small particles in debris disks: An energy-constrained smallest fragment size

    CERN Document Server

    Krijt, Sebastiaan

    2014-01-01

    A prescription for the fragment size distribution resulting from dust grain collisions is essential when modelling a range of astrophysical systems, such as debris disks and planetary rings. While the slope of the fragment size distribution and the size of the largest fragment are well known, the behaviour of the distribution at the small size end is theoretically and experimentally poorly understood. This leads debris disk codes to generally assume a limit equal to, or below, the radiation blow-out size. We use energy conservation to analytically derive a lower boundary of the fragment size distribution for a range of collider mass ratios. Focussing on collisions between equal-sized bodies, we apply the method to debris disks. For a given collider mass, the size of the smallest fragments is found to depend on collision velocity, material parameters, and the size of the largest fragment. We provide a physically motivated recipe for the calculation of the smallest fragment, which can be easily implemented in c...

  4. Sizing of rock fragmentation modeling due to bench blasting using adaptive neuro-fuzzy inference system (ANFIS)

    Institute of Scientific and Technical Information of China (English)

    Karami Alireza; Afiuni-Zadeh Somaieh

    2013-01-01

One of the most important characteristics of blasting, a basic step of surface mining, is rock fragmentation, because it directly affects the costs of drilling and the economics of the subsequent operations of loading, hauling and crushing in mines. Adaptive neuro-fuzzy inference system (ANFIS) and radial basis function (RBF) methods show potential for modeling the behavior of complex nonlinear processes such as those involved in fragmentation due to blasting of rocks. We developed ANFIS and RBF models of the sizing of rock fragmentation due to bench blasting by estimating the 80% passing size (K80) at the Golgohar iron mine of Sirjan, Iran. Comparing the results of the ANFIS and RBF models shows that, although the statistical parameters of the RBF model are acceptable, the proposed ANFIS model is superior and also simpler, because the ANFIS model is constructed using only two input parameters while seven input parameters are used for construction of the RBF model.

  5. Simulating the particle size distribution of rockfill materials based on its statistical regularity

    Institute of Scientific and Technical Information of China (English)

    YAN Zongling; QIU Xiande; YU Yongqiang

    2003-01-01

The particle size distribution of rockfill is studied by using granular mechanics, mesomechanics and probability statistics to reveal the relationship of the distribution of particle size to that of the potential energy intensity before fragmentation. It is found that the potential energy density has a linear relation to the logarithm of particle size, and it is deduced that the logarithm of particle size follows a normal distribution because the potential energy density does so. Based on this finding, and by invoking the energy principle of rock fragmentation, the logarithm distribution model of particle size is formulated, which uncovers the natural statistical characteristics of particle sizes. Exploring the properties of the average value, the expectation, and the unbiased variance of particle size indicates that the expectation does not equal the average value, but increases with increasing particle size and its non-uniformity, and is always larger than the average value, and that the unbiased variance increases as the non-uniformity and geometric average value increase. A case study shows that the results simulated by the proposed logarithm distribution model accord with the actual data. It is concluded that the logarithm distribution model and the Kuz-Ram model can be used to forecast the particle-size distribution of natural (non-blasted) rockfill, while for blasted rockfill the Kuz-Ram model is an option; in combined application of the two models, it is necessary to do field tests to adjust some parameters of the models.
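
    The statement that the expectation exceeds the average (geometric mean) value follows directly from the log-normal moment formula, E[X] = exp(μ + σ²/2) > exp(μ); the short check below uses arbitrary parameter values.

```python
import numpy as np

mu, sigma = np.log(50.0), 0.6            # log-mean and log-std of particle size (mm)

geometric_mean = np.exp(mu)               # the "average value" on the log scale
expectation = np.exp(mu + sigma**2 / 2)   # E[X] for a log-normal variable

sample = np.random.default_rng(10).lognormal(mu, sigma, 100000)
print(geometric_mean, expectation, sample.mean())   # expectation > geometric mean
```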

  6. Size-selected genomic libraries: the distribution and size-fractionation of restricted genomic DNA fragments by gel electrophoresis.

    Science.gov (United States)

    Gondo, Y

    1995-02-01

By using one-dimensional genome scanning, it is possible to directly identify the restricted genomic DNA fragment that reflects the site of genetic change. The subsequent strategies to obtain molecular clones of the corresponding restriction fragment are usually as follows: (i) the restriction of a large quantity of appropriate genomic DNA, (ii) the size-fractionation of the restricted DNA on a preparative electrophoresis gel in order to enrich the corresponding restriction fragment, (iii) the construction of size-selected libraries from the fractionated genomic DNA, and (iv) the screening of the library to obtain an objective clone, which is identified on the analytical genome scanning gel. Knowledge of the size distribution pattern of restriction fragments of the genomic DNA makes it possible to calculate the heterogeneity or complexity of the restriction fragments in each size fraction. This manuscript first describes the distribution of the restriction fragments with respect to their length. Some examples of the practical application of this theory to genome scanning are then discussed using presumptive genome scanning gels. The way to calculate such DNA complexities in the prepared size-fractionated samples is also demonstrated. Such information should greatly facilitate the design of experimental strategies for the cloning of a certain size of genomic DNA after digestion with restriction enzyme(s), as is the case with genome scanning.
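
    For a random sequence, restriction fragment lengths are approximately exponentially distributed with mean 4^k for a k-bp recognition site, which is what allows the fragment complexity of a size fraction to be calculated. A minimal sketch of that calculation is given below; the genome size, enzyme and gel window are illustrative.

```python
import numpy as np

genome_size = 3.0e9          # haploid mammalian genome, bp
site_len = 6                 # six-cutter such as EcoRI
mean_frag = 4.0 ** site_len  # expected fragment length ~ 4096 bp for random sequence

n_fragments = genome_size / mean_frag

def fraction_in_window(lo, hi, mean=mean_frag):
    """Fraction of fragments with length in [lo, hi] under an exponential length model."""
    return np.exp(-lo / mean) - np.exp(-hi / mean)

lo, hi = 3000.0, 4000.0      # size-selected window from the preparative gel, bp
n_in_window = n_fragments * fraction_in_window(lo, hi)
print(f"~{n_fragments:.2e} fragments total, ~{n_in_window:.2e} in the {lo:.0f}-{hi:.0f} bp fraction")
```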

  7. Population size, habitat fragmentation, and the nature of adaptive variation in a stream fish.

    Science.gov (United States)

    Fraser, Dylan J; Debes, Paul V; Bernatchez, Louis; Hutchings, Jeffrey A

    2014-09-07

    Whether and how habitat fragmentation and population size jointly affect adaptive genetic variation and adaptive population differentiation are largely unexplored. Owing to pronounced genetic drift, small, fragmented populations are thought to exhibit reduced adaptive genetic variation relative to large populations. Yet fragmentation is known to increase variability within and among habitats as population size decreases. Such variability might instead favour the maintenance of adaptive polymorphisms and/or generate more variability in adaptive differentiation at smaller population size. We investigated these alternative hypotheses by analysing coding-gene, single-nucleotide polymorphisms associated with different biological functions in fragmented brook trout populations of variable sizes. Putative adaptive differentiation was greater between small and large populations or among small populations than among large populations. These trends were stronger for genetic population size measures than demographic ones and were present despite pronounced drift in small populations. Our results suggest that fragmentation affects natural selection and that the changes elicited in the adaptive genetic composition and differentiation of fragmented populations vary with population size. By generating more variable evolutionary responses, the alteration of selective pressures during habitat fragmentation may affect future population persistence independently of, and perhaps long before, the effects of demographic and genetic stochasticity are manifest. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  8. Particle Size Reduction in Geophysical Granular Flows: The Role of Rock Fragmentation

    Science.gov (United States)

    Bianchi, G.; Sklar, L. S.

    2016-12-01

Particle size reduction in geophysical granular flows is caused by abrasion and fragmentation, and can affect transport dynamics by altering the particle size distribution. While the Sternberg equation is commonly used to predict the mean abrasion rate in the fluvial environment, and can also be applied to geophysical granular flows, predicting the evolution of the particle size distribution requires a better understanding of the controls on the rate of fragmentation and the size distribution of resulting particle fragments. To address this knowledge gap we are using single-particle free-fall experiments to test for the influence of particle size, impact velocity, and rock properties on fragmentation and abrasion rates. Rock types tested include granodiorite, basalt, and serpentinite. Initial particle masses and drop heights range from 20 to 1000 grams and 0.1 to 3.0 meters respectively. Preliminary results of free-fall experiments suggest that the probability of fragmentation varies as a power function of kinetic energy on impact. The resulting size distributions of rock fragments can be collapsed by normalizing by initial particle mass, and can be fit with a generalized Pareto distribution. We apply the free-fall results to understand the evolution of granodiorite particle-size distributions in granular flow experiments using rotating drums ranging in diameter from 0.2 to 4.0 meters. In the drums, we find that the rates of silt production by abrasion and gravel production by fragmentation scale with drum size. To compare these rates with free-fall results we estimate the particle impact frequency and velocity. We then use population balance equations to model the evolution of particle size distributions due to the combined effects of abrasion and fragmentation. Finally, we use the free-fall and drum experimental results to model particle size evolution in Inyo Creek, a steep, debris-flow dominated catchment, and compare model results to field measurements.
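
    The Sternberg relation mentioned above describes exponential mass loss with transport distance, m(x) = m0 exp(-αx), so its coefficient can be fitted from downstream mass measurements as sketched below; the distances and masses are illustrative placeholders, not the Inyo Creek data.

```python
import numpy as np

# Downstream distance (km) and mean particle mass (g) -- illustrative values.
distance = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
mass = np.array([950.0, 760.0, 600.0, 380.0, 160.0])

# Sternberg: m(x) = m0 * exp(-alpha * x)  ->  ln(m) is linear in x.
slope, ln_m0 = np.polyfit(distance, np.log(mass), 1)
alpha = -slope
print(f"abrasion coefficient alpha ~ {alpha:.3f} per km, m0 ~ {np.exp(ln_m0):.0f} g")
```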

  9. Absolute fragmentation cross sections in atom-molecule collisions: Scaling laws for non-statistical fragmentation of polycyclic aromatic hydrocarbon molecules

    Energy Technology Data Exchange (ETDEWEB)

    Chen, T.; Gatchell, M.; Stockett, M. H.; Alexander, J. D.; Schmidt, H. T.; Cederquist, H.; Zettergren, H., E-mail: henning@fysik.su.se [Department of Physics, Stockholm University, S-106 91 Stockholm (Sweden); Zhang, Y. [Department of Mathematics, Faculty of Physics, M. V. Lomonosov Moscow State University, Leninskie Gory, 119991 Moscow (Russian Federation); Rousseau, P.; Maclot, S.; Delaunay, R.; Adoui, L. [CIMAP, UMR 6252, CEA/CNRS/ENSICAEN/Université de Caen Basse-Normandie, bd Henri Becquerel, BP 5133, F-14070 Caen Cedex 05 (France); Université de Caen Basse-Normandie, Esplanade de la Paix, F-14032 Caen (France); Domaracka, A.; Huber, B. A. [CIMAP, UMR 6252, CEA/CNRS/ENSICAEN/Université de Caen Basse-Normandie, bd Henri Becquerel, BP 5133, F-14070 Caen Cedex 05 (France); Schlathölter, T. [Zernike Institute for Advanced Materials, University of Groningen, Nijenborgh 4, 9747AG Groningen (Netherlands)

    2014-06-14

We present scaling laws for absolute cross sections for non-statistical fragmentation in collisions between Polycyclic Aromatic Hydrocarbons (PAH/PAH^+) and hydrogen or helium atoms with kinetic energies ranging from 50 eV to 10 keV. Further, we calculate the total fragmentation cross sections (including statistical fragmentation) for 110 eV PAH/PAH^+ + He collisions, and show that they compare well with experimental results. We demonstrate that non-statistical fragmentation becomes dominant for large PAHs and that it yields highly reactive fragments forming strong covalent bonds with atoms (H and N) and molecules (C_6H_5). Thus non-statistical fragmentation may be an effective initial step in the formation of, e.g., Polycyclic Aromatic Nitrogen Heterocycles (PANHs). This relates to recent discussions on the evolution of PANHs in space and the reactivities of defect graphene structures.

  10. Effects of Mixtures on Liquid and Solid Fragment Size Distributions

    Science.gov (United States)

    2016-05-01

  11. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Science.gov (United States)

    Heidel, R. Eric

    2016-01-01

    Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power. PMID:27073717

  12. Causality in Statistical Power: Isomorphic Properties of Measurement, Research Design, Effect Size, and Sample Size

    Directory of Open Access Journals (Sweden)

    R. Eric Heidel

    2016-01-01

    Full Text Available Statistical power is the ability to detect a significant effect, given that the effect actually exists in a population. Like most statistical concepts, statistical power tends to induce cognitive dissonance in hepatology researchers. However, planning for statistical power by an a priori sample size calculation is of paramount importance when designing a research study. There are five specific empirical components that make up an a priori sample size calculation: the scale of measurement of the outcome, the research design, the magnitude of the effect size, the variance of the effect size, and the sample size. A framework grounded in the phenomenon of isomorphism, or interdependencies amongst different constructs with similar forms, will be presented to understand the isomorphic effects of decisions made on each of the five aforementioned components of statistical power.

  13. Species-genetic diversity correlations in habitat fragmentation can be biased by small sample sizes.

    Science.gov (United States)

    Nazareno, Alison G; Jump, Alistair S

    2012-06-01

    Predicted parallel impacts of habitat fragmentation on genes and species lie at the core of conservation biology, yet tests of this rule are rare. In a recent article in Ecology Letters, Struebig et al. (2011) report that declining genetic diversity accompanies declining species diversity in tropical forest fragments. However, this study estimates diversity in many populations through extrapolation from very small sample sizes. Using the data of this recent work, we show that results estimated from the smallest sample sizes drive the species-genetic diversity correlation (SGDC), owing to a false-positive association between habitat fragmentation and loss of genetic diversity. Small sample sizes are a persistent problem in habitat fragmentation studies, the results of which often do not fit simple theoretical models. It is essential, therefore, that data assessing the proposed SGDC are sufficient in order that conclusions be robust.

  14. Effect size, confidence intervals and statistical power in psychological research.

    Directory of Open Access Journals (Sweden)

    Téllez A.

    2015-07-01

    Full Text Available Quantitative psychological research is focused on detecting the occurrence of certain population phenomena by analyzing data from a sample, and statistics is a particularly helpful mathematical tool that is used by researchers to evaluate hypotheses and make decisions to accept or reject such hypotheses. In this paper, the various statistical tools in psychological research are reviewed. The limitations of null hypothesis significance testing (NHST and the advantages of using effect size and its respective confidence intervals are explained, as the latter two measurements can provide important information about the results of a study. These measurements also can facilitate data interpretation and easily detect trivial effects, enabling researchers to make decisions in a more clinically relevant fashion. Moreover, it is recommended to establish an appropriate sample size by calculating the optimum statistical power at the moment that the research is designed. Psychological journal editors are encouraged to follow APA recommendations strictly and ask authors of original research studies to report the effect size, its confidence intervals, statistical power and, when required, any measure of clinical significance. Additionally, we must account for the teaching of statistics at the graduate level. At that level, students do not receive sufficient information concerning the importance of using different types of effect sizes and their confidence intervals according to the different types of research designs; instead, most of the information is focused on the various tools of NHST.

  15. Finite-size anisotropy in statistically uniform porous media

    CERN Document Server

    Koza, Zbigniew; Khalili, Arzhang

    2009-01-01

Anisotropy of the permeability tensor in statistically uniform porous media of sizes used in typical computer simulations is studied. Although such systems are assumed to be isotropic by default, we show that de facto their anisotropic permeability can give rise to significant changes of transport parameters such as permeability and tortuosity. The main parameter controlling the anisotropy is $a/L$, the ratio of the obstacle size to the system size. The distribution of the angle $\alpha$ between the external force and the volumetric fluid stream is found to be approximately normal, and the standard deviation of $\alpha$ is found to decay with the system size as $(a/L)^{d/2}$, where $d$ is the space dimensionality. These properties can be used to estimate both anisotropy-related statistical errors in large-scale simulations and the size of the representative elementary volume.

  16. Absolute fragmentation cross sections in atom-molecule collisions : Scaling laws for non-statistical fragmentation of polycyclic aromatic hydrocarbon molecules

    NARCIS (Netherlands)

    Chen, T.; Gatchell, M.; Stockett, M. H.; Alexander, J. D.; Zhang, Y.; Rousseau, P.; Domaracka, A.; Maclot, S.; Delaunay, R.; Adoui, L.; Huber, B. A.; Schlathölter, T.; Schmidt, H. T.; Cederquist, H.; Zettergren, H.

    2014-01-01

    We present scaling laws for absolute cross sections for non-statistical fragmentation in collisions between Polycyclic Aromatic Hydrocarbons (PAH/PAH+) and hydrogen or helium atoms with kinetic energies ranging from 50 eV to 10 keV. Further, we calculate the total fragmentation cross sections

  17. The Role of Surface Entropy in Statistical Emission of Massive Fragments from Equilibrated Nuclear Systems

    CERN Document Server

Lü, J; Tõke, Jan; Lu, Jun

    2003-01-01

Statistical fragment emission from excited nuclear systems is studied within the framework of a schematic Fermi-gas model combined with Weisskopf's detailed balance approach. The formalism considers thermal expansion of finite nuclear systems and pays special attention to the role of the diffuse surface region in the decay of hot equilibrated systems. It is found that with increasing excitation energy, effects of surface entropy lead to a systematic and significant reduction of effective emission barriers for fragments and, eventually, to the vanishing of these barriers. The formalism provides a natural explanation for the occurrence of negative nuclear heat capacities reported in the literature. It also accounts for the observed linearity of pseudo-Arrhenius plots of the logarithm of the fragment emission probability versus the inverse square-root of the excitation energy, but does not predict true Arrhenius behavior of these emission probabilities.

  18. [Effect sizes, statistical power and sample sizes in "the Japanese Journal of Psychology"].

    Science.gov (United States)

    Suzukawa, Yumi; Toyoda, Hideki

    2012-04-01

    This study analyzed the statistical power of research studies published in the "Japanese Journal of Psychology" in 2008 and 2009. Sample effect sizes and sample statistical powers were calculated for each statistical test and analyzed with respect to the analytical methods and the fields of the studies. The results show that in the fields like perception, cognition or learning, the effect sizes were relatively large, although the sample sizes were small. At the same time, because of the small sample sizes, some meaningful effects could not be detected. In the other fields, because of the large sample sizes, meaningless effects could be detected. This implies that researchers who could not get large enough effect sizes would use larger samples to obtain significant results.

  19. The grain-size distribution of pyroclasts: Primary fragmentation, conduit sorting or abrasion?

    Science.gov (United States)

    Kueppers, U.; Schauroth, J.; Taddeucci, J.

    2013-12-01

Explosive volcanic eruptions expel a mixture of pyroclasts and lithics. Pyroclasts, fragments of the juvenile magma, record the state of the magma at fragmentation in terms of porosity and crystallinity. The grain size distribution of pyroclasts is generally considered to be a direct consequence of the conditions at magma fragmentation, which is mainly driven by gas overpressure in bubbles, high shear rates, contact with external water or a combination of these factors. Stress exerted by any of these processes will lead to brittle fragmentation by overcoming the magma's relaxation timescale. As a consequence, most pyroclasts exhibit angular shapes. Upon magma fragmentation, the gas-pyroclast mixture is accelerated upwards and eventually ejected from the vent. The total grain size distribution deposited is a function of fragmentation conditions and transport-related sorting. Porous pyroclasts are very susceptible to abrasion by particle-particle or particle-conduit wall interaction. Accordingly, pyroclastic fall deposits with angular clasts should prove low particle abrasion upon contact with other surfaces. In an attempt to constrain the degree of particle interaction during conduit flow, monomodal batches of washed pyroclasts have been accelerated upwards by rapid decompression and subsequently investigated for their grain size distribution. In our set-up, we used a vertical cylindrical tube without surface roughness as conduit. We varied grain size (0.125-0.25; 0.5-1; 1-2 mm), porosity (0; 10; 30 %), gas-particle ratio (10 and 40%), conduit length (10 and 28 cm) and conduit diameter (2.5 and 6 cm). All ejected particles were collected after settling at the base of a 3.3 m high tank and sieved at one sieve size below starting size (half-Φ). Grain size reduction showed a positive correlation with starting grain size, porosity and overpressure at the vent. Although milling in a volcanic conduit may take place, porous pyroclasts are very likely to be a primary product

  20. Statistical determination of the step size of molecular motors

    Energy Technology Data Exchange (ETDEWEB)

    Neuman, K C; Saleh, O A; Lionnet, T; Lia, G; Allemand, J-F; Bensimon, D; Croquette, V [Laboratoire de Physique Statistique, Ecole Normale Superieure, 24 rue Lhomond, Paris 75005 (France)

    2005-11-30

    Molecular motors are enzymatic proteins that couple the consumption of chemical energy to mechanical displacement. In order to elucidate the translocation mechanisms of these enzymes, it is of fundamental importance to measure the physical step size. The step size can, in certain instances, be directly measured with single-molecule techniques; however, in the majority of cases individual steps are masked by noise. The step size can nevertheless be obtained from noisy single-molecule records through statistical methods. This analysis is analogous to determining the charge of the electron from current shot noise. We review methods for obtaining the step size based on analysing, in both the time and frequency domains, the variance in position from noisy single-molecule records of motor displacement. Additionally, we demonstrate how similar methods may be applied to measure the step size in bulk kinetic experiments.
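
    The variance-based estimate described above is analogous to shot-noise analysis: for a motor taking uncorrelated steps of fixed size at Poisson-distributed times, the ratio of the positional variance to the mean displacement recovers the step size. The sketch below illustrates this on simulated traces; the rate, step size and noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(11)

d_true, rate, dt, n_traces, n_steps = 8.0, 5.0, 0.01, 2000, 4000   # nm, 1/s, s

# Simulate Poisson steppers plus Gaussian measurement noise.
steps = rng.poisson(rate * dt, size=(n_traces, n_steps))
x = np.cumsum(d_true * steps, axis=1) + rng.normal(0, 2.0, size=(n_traces, n_steps))

mean_x = x[:, -1].mean()    # mean displacement at the final time point
var_x = x[:, -1].var()      # variance across traces at the same time point

# For a Poisson stepper, Var(x)/<x> -> step size (measurement noise adds a small constant offset).
print("estimated step size:", var_x / mean_x, "nm  (true:", d_true, "nm)")
```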

  1. Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models.

    Directory of Open Access Journals (Sweden)

    John K Hillier

Full Text Available Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A 'stochastic instability' (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models.

  2. Study of system-size effects in multifragmentation using the Quantum Molecular Dynamics model

    CERN Document Server

    Singh, J; Aichelin, Jörg; Singh, Jaivir; Puri, Rajeev K.

    2001-01-01

    We report, for the first time, the dependence of the multiplicity of different fragments on the system size employing a quantum molecular dynamics model. This dependence is extracted from the simulations of symmetric collisions of Ca+Ca, Ni+Ni, Nb+Nb, Xe+Xe, Er+Er, Au+Au and U+U at incident energies between 50 A MeV and 1 A GeV. We find that the multiplicity of different fragments scales with the size of the system which can be parameterized by a simple power law.
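
A hedged sketch of how such a power-law scaling with system size can be checked numerically: fit log(multiplicity) against log(system size) with a straight line. The system sizes below are approximate total nucleon numbers for the listed symmetric systems, and the multiplicities are invented placeholders, not the QMD results.

```python
import numpy as np

# Approximate total nucleon numbers (Ca+Ca ... Au+Au) and invented IMF multiplicities.
system_size = np.array([80, 116, 186, 258, 336, 394])
multiplicity = np.array([2.1, 3.4, 6.0, 8.7, 12.0, 14.5])

# A power law N = c * A^tau becomes a straight line in log-log space.
slope, intercept = np.polyfit(np.log(system_size), np.log(multiplicity), 1)
print(f"power-law exponent tau ~ {slope:.2f}, prefactor c ~ {np.exp(intercept):.3f}")
```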

  3. Gene flow and effective population sizes of the butterfly Maculinea alcon in a highly fragmented, anthropogenic landscape

    DEFF Research Database (Denmark)

    Vanden Broeck, An; Maes, Dirk; Kelager, Andreas

    2017-01-01

    fragmentation as they occupy narrow niches or restricted habitat ranges. Here, we assess contemporary interpopulation connectedness of the threatened, myrmecophilous butterfly, Maculinea alcon, in a highly fragmented landscape. We inferred dispersal, effective population sizes, genetic diversity and structure...

  4. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    Science.gov (United States)

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.

  5. Particle size distributions and the sequential fragmentation/transport theory applied to volcanic ash

    Science.gov (United States)

    Wohletz, K. H.; Sheridan, M. F.; Brown, W. K.

    1989-11-01

    The assumption that distributions of mass versus size interval for fragmented materials fit the log normal distribution is empirically based and has historical roots in the late 19th century. Other often used distributions (e.g., Rosin-Rammler, Weibull) are also empirical and have the general form for mass per size interval: n(l) = k l^α exp(−l^β), where n(l) represents the number of particles of diameter l, l is the normalized particle diameter, and k, α, and β are constants. We describe and extend the sequential fragmentation distribution to include transport effects upon observed volcanic ash size distributions. The sequential fragmentation/transport (SFT) distribution is also of the above mathematical form, but it has a physical basis rather than empirical. The SFT model applies to a particle-mass distribution formed by a sequence of fragmentation (comminution) and transport (size sorting) events acting upon an initial mass m': n(x, m) = C ∫∫ n(x', m') p(ξ) dx' dm', where x' denotes spatial location along a linear axis, C is a constant, and integration is performed over distance from an origin to the sample location and mass limits from 0 to m. We show that the probability function that models the production of particles of different size from an initial mass and sorts that distribution, p(ξ), is related to m^g, where g (noted as γ for fragmentation processes) is a free parameter that determines the location, breadth, and skewness of the distribution; g(γ) must be greater than -1, and it increases from that value as the distribution matures with greater number of sequential steps in the fragmentation or transport process; γ is expected to be near -1 for "sudden" fragmentation mechanisms such as single-event explosions and transport mechanisms that are functionally dependent upon particle mass. This free parameter will be more positive for evolved fragmentation mechanisms such as ball milling and complex transport processes such as saltation. The SFT

  6. Particle size distributions and the sequential fragmentation/transport theory applied to volcanic ash

    Energy Technology Data Exchange (ETDEWEB)

    Wohletz, K.H. (Earth and Space Science Division Los Alamos National Laboratory, New Mexico (USA)); Sheridan, M.F. (Department of Geology, Arizona State University, Tempe (USA)); Brown, W.K. (Math/Science Division, Lassen College, Susanville, California (USA))

    1989-11-10

    The assumption that distributions of mass versus size interval for fragmented materials fit the log normal distribution is empirically based and has historical roots in the late 19th century. Other often used distributions (e.g., Rosin-Rammler, Weibull) are also empirical and have the general form for mass per size interval: n(l) = k l^α exp(−l^β), where n(l) represents the number of particles of diameter l, l is the normalized particle diameter, and k, α, and β are constants. We describe and extend the sequential fragmentation distribution to include transport effects upon observed volcanic ash size distributions. The sequential fragmentation/transport (SFT) distribution is also of the above mathematical form, but it has a physical basis rather than empirical. The SFT model applies to a particle-mass distribution formed by a sequence of fragmentation (comminution) and transport (size sorting) events acting upon an initial mass m': n(x, m) = C ∫∫ n(x', m') p(ξ) dx' dm', where x' denotes spatial location along a linear axis, C is a constant, and integration is performed over distance from an origin to the sample location and mass limits from 0 to m.
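
A small sketch, under our own assumptions rather than the authors' analysis, of fitting the general mass-per-size-interval form n(l) = k l^α exp(−l^β) quoted above to binned sieve data with scipy; the diameters and weight fractions are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def n_of_l(l, k, alpha, beta):
    """General empirical form for mass (or number) per normalized size interval."""
    return k * l**alpha * np.exp(-l**beta)

# Hypothetical normalized sieve diameters and measured weight fractions.
l = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.2, 1.6, 2.0])
w = np.array([0.02, 0.08, 0.20, 0.30, 0.22, 0.11, 0.05, 0.02])

popt, _ = curve_fit(n_of_l, l, w, p0=(1.0, 1.0, 1.0), maxfev=10000)
k_fit, alpha_fit, beta_fit = popt
print(f"k={k_fit:.3f}, alpha={alpha_fit:.3f}, beta={beta_fit:.3f}")
```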

  7. Influence of Magnetic Field Amplitude on Quantity and Sizes of Disintegration Fragments of Magnetic Particles Cluster

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The disintegration of a mass of magnetic particles under pulsed switching of a magnetic field is investigated. The influence of the field value on the quantity, sizes and distribution of the disintegration fragments is explored. The presence of two critical fields, defining the process of disintegration, is revealed. The results can be used in the manufacture of packings for magnetic filters.

  8. A theoretical explanation of grain size distributions in explosive rock fragmentation

    Science.gov (United States)

    Fowler, A. C.; Scheu, Bettina

    2016-06-01

    We have measured grain size distributions of the results of laboratory decompression explosions of volcanic rock. The resulting distributions can be approximately represented by gamma distributions of weight per cent as a function of ϕ = −log₂ d, where d is the grain size in millimetres measured by sieving, with a superimposed long tail associated with the production of fines. We provide a description of the observations based on sequential fragmentation theory, which we develop for the particular case of 'self-similar' fragmentation kernels, and we show that the corresponding evolution equation for the distribution can be explicitly solved, yielding the long-time lognormal distribution associated with Kolmogorov's fragmentation theory. Particular features of the experimental data, notably time evolution, advection, truncation and fines production, are described and predicted within the constraints of a generalized, 'reductive' fragmentation model, and it is shown that the gamma distribution of coarse particles is a natural consequence of an assumed uniform fragmentation kernel. We further show that an explicit model for fines production during fracturing can lead to a second gamma distribution, and that the sum of the two provides a good fit to the observed data.
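
To make the two-component description above concrete, the illustrative sketch below (our construction, not the authors' code) builds a weight-per-cent distribution as a gamma distribution in ϕ = −log₂(d) plus a second, fines-related gamma component; all parameter values are invented.

```python
import numpy as np
from scipy import stats

# Grain size d in mm from sieving; phi = -log2(d) as defined above.
phi = np.linspace(-2.0, 8.0, 200)

# Coarse population: gamma distribution of weight per cent in phi (illustrative parameters).
coarse = stats.gamma.pdf(phi, a=3.0, loc=-2.0, scale=1.0)

# Fines: a second, broader gamma component centred at small sizes (large phi),
# mimicking the superimposed long tail mentioned in the abstract.
fines = 0.15 * stats.gamma.pdf(phi, a=6.0, loc=0.0, scale=1.0)

total = coarse + fines
weight_percent = 100.0 * total / total.sum()   # per-bin weight per cent, summing to 100
print("modal phi of the combined distribution:", phi[np.argmax(weight_percent)])
```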

  9. Patch size and isolation predict plant species density in a naturally fragmented forest.

    Science.gov (United States)

    Munguía-Rosas, Miguel A; Montiel, Salvador

    2014-01-01

    Studies of the effects of patch size and isolation on plant species density have yielded contrasting results. However, much of the available evidence comes from relatively recent anthropogenic forest fragments which have not reached equilibrium between extinction and immigration. This is a critical issue because the theory clearly states that only when equilibrium has been reached can the number of species be accurately predicted by habitat size and isolation. Therefore, species density could be better predicted by patch size and isolation in an ecosystem that has been fragmented for a very long time. We tested whether patch area, isolation and other spatial variables explain variation among forest patches in plant species density in an ecosystem where the forest has been naturally fragmented for long periods of time on a geological scale. Our main predictions were that plant species density will be positively correlated with patch size, and negatively correlated with isolation (distance to the nearest patch, connectivity, and distance to the continuous forest). We surveyed the vascular flora (except lianas and epiphytes) of 19 forest patches using five belt transects (50×4 m each) per patch (area sampled per patch = 0.1 ha). As predicted, plant species density was positively associated (logarithmically) with patch size and negatively associated (linearly) with patch isolation (distance to the nearest patch). Other spatial variables such as patch elevation and perimeter, did not explain among-patch variability in plant species density. The power of patch area and isolation as predictors of plant species density was moderate (together they explain 43% of the variation), however, a larger sample size may improve the explanatory power of these variables. Patch size and isolation may be suitable predictors of long-term plant species density in terrestrial ecosystems that are naturally and anthropogenically fragmented.

  10. Patch size and isolation predict plant species density in a naturally fragmented forest.

    Directory of Open Access Journals (Sweden)

    Miguel A Munguía-Rosas

    Full Text Available Studies of the effects of patch size and isolation on plant species density have yielded contrasting results. However, much of the available evidence comes from relatively recent anthropogenic forest fragments which have not reached equilibrium between extinction and immigration. This is a critical issue because the theory clearly states that only when equilibrium has been reached can the number of species be accurately predicted by habitat size and isolation. Therefore, species density could be better predicted by patch size and isolation in an ecosystem that has been fragmented for a very long time. We tested whether patch area, isolation and other spatial variables explain variation among forest patches in plant species density in an ecosystem where the forest has been naturally fragmented for long periods of time on a geological scale. Our main predictions were that plant species density will be positively correlated with patch size, and negatively correlated with isolation (distance to the nearest patch, connectivity, and distance to the continuous forest). We surveyed the vascular flora (except lianas and epiphytes) of 19 forest patches using five belt transects (50×4 m each) per patch (area sampled per patch = 0.1 ha). As predicted, plant species density was positively associated (logarithmically) with patch size and negatively associated (linearly) with patch isolation (distance to the nearest patch). Other spatial variables, such as patch elevation and perimeter, did not explain among-patch variability in plant species density. The power of patch area and isolation as predictors of plant species density was moderate (together they explain 43% of the variation); however, a larger sample size may improve the explanatory power of these variables. Patch size and isolation may be suitable predictors of long-term plant species density in terrestrial ecosystems that are naturally and anthropogenically fragmented.

  11. Sizing of rock fragmentation modeling due to bench blasting using adaptive neuro-fuzzy inference system and radial basis function

    Institute of Scientific and Technical Information of China (English)

    Karami Alireza; Afiuni-Zadeh Somaieh

    2012-01-01

    One of the most important characteristics of blasting, a basic step of surface mining, is rock fragmentation. It directly affects the costs of drilling and the economics of the subsequent operations of loading, hauling and crushing in mines. Adaptive neuro-fuzzy inference system (ANFIS) and radial basis function (RBF) methods show potential for modeling the behavior of complex nonlinear processes such as those involved in fragmentation due to blasting of rocks. In this paper we developed ANFIS and RBF models for the sizing of rock fragmentation due to bench blasting by estimating the 80% passing size (K80) at the Golgohar iron ore mine of Sirjan, Iran. Comparing the results of the ANFIS and RBF models shows that although the statistical parameters of the RBF model are acceptable, the proposed ANFIS model is superior and also simpler, because the ANFIS model is constructed using only two input parameters while seven input parameters are used for construction of the RBF model.
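
As a generic illustration of the RBF side of such a comparison, and not the authors' implementation (scikit-learn offers no ANFIS), one could regress K80 on a few blast-design parameters with an RBF-kernel support vector regressor; the input features and values below are invented.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical blast-design inputs (e.g. burden, spacing, charge per delay)
# and measured 80%-passing fragment size K80 in cm; all values are invented.
X = np.array([[3.0, 4.0, 120.0],
              [3.5, 4.5, 150.0],
              [4.0, 5.0, 180.0],
              [4.5, 5.5, 210.0],
              [5.0, 6.0, 240.0]])
k80 = np.array([18.0, 22.0, 27.0, 31.0, 36.0])

# Scale the inputs, then fit an RBF-kernel support vector regressor.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X, k80)
print("predicted K80:", model.predict([[4.2, 5.2, 190.0]]))
```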

  12. Statistical traffic modeling of MPEG frame size: Experiments and Analysis

    Directory of Open Access Journals (Sweden)

    Haniph A. Latchman

    2009-12-01

    Full Text Available For guaranteed quality of service (QoS) and sufficient bandwidth in a communication network which provides an integrated multimedia service, it is important to obtain an analytical and tractable model of the compressed MPEG data. This paper presents a statistical approach to a group-of-pictures (GOP) MPEG frame-size model to increase network traffic performance in a communication network. We extract MPEG frame data from commercial DVD movies and make probability histograms to analyze the statistical characteristics of MPEG frame data. Six candidate probability distributions are considered here and their parameters are obtained from the empirical data using maximum likelihood estimation (MLE). This paper shows that the lognormal distribution is the best-fitting model of MPEG-2 total frame data.
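
A compact sketch of the MLE fitting step described above, using scipy's built-in fitters on placeholder frame-size data and ranking a few candidate distributions by log-likelihood; the candidate list and data are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Placeholder MPEG GOP frame sizes in bytes (real data would come from demuxed streams).
frame_sizes = rng.lognormal(mean=10.5, sigma=0.4, size=5000)

candidates = {
    "lognorm": stats.lognorm,
    "gamma": stats.gamma,
    "weibull_min": stats.weibull_min,
}

for name, dist in candidates.items():
    params = dist.fit(frame_sizes)                      # maximum likelihood estimates
    loglik = np.sum(dist.logpdf(frame_sizes, *params))  # higher is better
    print(f"{name:12s} log-likelihood = {loglik:.1f}")
```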

  13. Size-dependent enrichment of waste slag aggregate fragments abraded from asphalt concrete.

    Science.gov (United States)

    Takahashi, Fumitake; Shimaoka, Takayuki; Gardner, Kevin; Kida, Akiko

    2011-10-30

    The authors consider the environmental prospects of using melted waste slag as the aggregate for asphalt pavement. In particular, they address the enrichment of slag-derived fragments in fine abrasion dust particles originating from slag asphalt concrete and its size dependency. A series of surface abrasion tests was performed on asphalt concrete specimens containing only natural aggregates (as reference) or 30 wt% of substituted slag aggregates. Although two of the three slag-asphalt concretes generated 1.5-3.0 times more abrasion dust than the reference asphalt concrete, this could not be explained by the abrasion resistance of the slag alone. The enrichment of slag-derived fragments in the abrasion dust, estimated on the basis of the peak intensity of quartz and heavy metal concentrations, was size dependent for all slag-asphalt concretes. Slag-derived fragments were enriched in abrasion dust particles with diameters of 150-1000 μm. Enrichment factors were 1.4-2.1. In contrast, there was no enrichment in abrasion dust particles with diameters less than 75 μm. This suggests that prior airborne-size fragmentation of substituted slag aggregates does not need to be considered for the tested slag aggregates when the environmental risks of abrasion dust from slag-asphalt pavement are assessed.

  14. A statistical study of meteoroid fragmentation and differential ablation using the Resolute Bay Incoherent Scatter Radar

    Science.gov (United States)

    Malhotra, Akshay; Mathews, John D.

    2011-04-01

    There has been much interest in the meteor physics community recently regarding the detailed processes by which the meteoroid mass flux arrives in the upper atmosphere. Of particular interest are the relative roles of simple ablation, differential ablation, and fragmentation in the interpretation of the meteor events observed by high-power large-aperture (HPLA) radars. An understanding of the relative roles of these mechanisms is necessary to determine whether the considerable meteor mass flux arriving in the upper atmosphere arrives mostly as nanometer dust/smoke (via fragmentation) or in atomic form (via ablation), which in turn has important consequences for understanding not only the aeronomy of the region but also the formation and evolution of various upper atmospheric phenomena such as Polar Mesospheric Summer Echoes. Using meteor observations from the newly operational Resolute Bay Incoherent Scatter Radar (RISR), we present the first statistical study showing the relative contribution of these mechanisms. We find that RISR head echoes exhibited ~48% fragmentation, ~32% simple ablation, and ~20% differential ablation. We also report the existence of compound meteor events exhibiting signatures of more than one mass loss mechanism. These results emphasize that the processes by which the meteoroid mass is deposited into the upper atmosphere are complex and involve all three mechanisms described here. This conclusion is unlike the previously reported results that stress the importance of one or the other of these mechanisms. These results will also contribute to improving current meteoroid disintegration/ablation models.

  15. Representative volume size: A comparison of statistical continuum mechanics and statistical physics

    Energy Technology Data Exchange (ETDEWEB)

    AIDUN,JOHN B.; TRUCANO,TIMOTHY G.; LO,CHI S.; FYE,RICHARD M.

    1999-05-01

    In this combination background and position paper, the authors argue that careful work is needed to develop accurate methods for relating the results of fine-scale numerical simulations of material processes to meaningful values of macroscopic properties for use in constitutive models suitable for finite element solid mechanics simulations. To provide a definite context for this discussion, the problem is couched in terms of the lack of general objective criteria for identifying the size of the representative volume (RV) of a material. The objective of this report is to lay out at least the beginnings of an approach for applying results and methods from statistical physics to develop concepts and tools necessary for determining the RV size, as well as alternatives to RV volume-averaging for situations in which the RV is unmanageably large. The background necessary to understand the pertinent issues and statistical physics concepts is presented.

  16. Subcritical, Critical and Supercritical Size Distributions in Random Coagulation-Fragmentation Processes

    Institute of Scientific and Technical Information of China (English)

    Dong HAN; Xin Sheng ZHANG; Wei An ZHENG

    2008-01-01

    We consider the asymptotic probability distribution of the size of a reversible random coagulation-fragmentation process in the thermodynamic limit. We prove that the distributions of small, medium and the largest clusters converge to Gaussian, Poisson and 0-1 distributions in the supercritical stage (post-gelation), respectively. We show also that the mutually dependent distributions of clusters will become independent after the occurrence of a gelation transition. Furthermore, it is proved that all the number distributions of clusters are mutually independent at the critical stage (gelation), but the distributions of medium and the largest clusters are mutually dependent with positive correlation coefficient in the supercritical stage. When the fragmentation strength goes to zero, there will exist only two types of clusters in the process: one type consists of the smallest clusters, the other is the largest one, which has a size nearly equal to the volume (total number of units).

  17. Statistical properties of the normalized ice particle size distribution

    Science.gov (United States)

    Delanoë, Julien; Protat, Alain; Testud, Jacques; Bouniol, Dominique; Heymsfield, A. J.; Bansemer, A.; Brown, P. R. A.; Forbes, R. M.

    2005-05-01

    Testud et al. (2001) have recently developed a formalism, known as the "normalized particle size distribution (PSD)", which consists in scaling the diameter and concentration axes in such a way that the normalized PSDs are independent of water content and mean volume-weighted diameter. In this paper we investigate the statistical properties of the normalized PSD for the particular case of ice clouds, which are known to play a crucial role in the Earth's radiation balance. To do so, an extensive database of airborne in situ microphysical measurements has been constructed. A remarkable stability in shape of the normalized PSD is obtained. The impact of using a single analytical shape to represent all PSDs in the database is estimated through an error analysis on the instrumental (radar reflectivity and attenuation) and cloud (ice water content, effective radius, terminal fall velocity of ice crystals, visible extinction) properties. This resulted in a roughly unbiased estimate of the instrumental and cloud parameters, with small standard deviations ranging from 5 to 12%. This error is found to be roughly independent of the temperature range. This stability in shape and its single analytical approximation implies that two parameters are now sufficient to describe any normalized PSD in ice clouds: the intercept parameter N0* and the mean volume-weighted diameter Dm. Statistical relationships (parameterizations) between N0* and Dm have then been evaluated in order to reduce again the number of unknowns. It has been shown that a parameterization of N0* and Dm by temperature could not be envisaged to retrieve the cloud parameters. Nevertheless, Dm-T and mean maximum dimension diameter-T parameterizations have been derived and compared to the parameterization of Kristjánsson et al. (2000) currently used to characterize particle size in climate models. The new parameterization generally produces larger particle sizes at any temperature than the Kristjánsson et al. (2000
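
For context, one common analytical form used with the N0*-Dm normalization is the normalized gamma shape; the sketch below is our rendering of that commonly quoted form (following the Testud et al. (2001) normalization) and is not taken from this paper. Parameter values are illustrative.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

def normalized_gamma_psd(D, N0_star, Dm, mu):
    """Normalized gamma PSD using the mass-weighted mean diameter Dm,
    in the normalization commonly attributed to Testud et al. (2001)."""
    f_mu = (6.0 / 4.0**4) * (4.0 + mu) ** (4.0 + mu) / gamma_fn(4.0 + mu)
    x = D / Dm
    return N0_star * f_mu * x**mu * np.exp(-(4.0 + mu) * x)

# Illustrative values: intercept N0*, mean diameter Dm (mm), shape parameter mu.
D = np.linspace(0.1, 6.0, 60)
psd = normalized_gamma_psd(D, N0_star=8000.0, Dm=1.5, mu=3.0)
print("N(D) at D = Dm:", normalized_gamma_psd(1.5, 8000.0, 1.5, 3.0))
```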

  18. A linear relationship between crystal size and fragment binding time observed crystallographically: implications for fragment library screening using acoustic droplet ejection.

    Directory of Open Access Journals (Sweden)

    Krystal Cole

    Full Text Available High throughput screening technologies such as acoustic droplet ejection (ADE) greatly increase the rate at which X-ray diffraction data can be acquired from crystals. One promising high throughput screening application of ADE is to rapidly combine protein crystals with fragment libraries. In this approach, each fragment soaks into a protein crystal either directly on data collection media or on a moving conveyor belt which then delivers the crystals to the X-ray beam. By simultaneously handling multiple crystals combined with fragment specimens, these techniques relax the automounter duty-cycle bottleneck that currently prevents optimal exploitation of third generation synchrotrons. Two factors limit the speed and scope of projects that are suitable for fragment screening using techniques such as ADE. Firstly, in applications where the high throughput screening apparatus is located inside the X-ray station (such as the conveyor belt system described above), the speed of data acquisition is limited by the time required for each fragment to soak into its protein crystal. Secondly, in applications where crystals are combined with fragments directly on data acquisition media (including both of the ADE methods described above), the maximum time that fragments have to soak into crystals is limited by evaporative dehydration of the protein crystals during the fragment soak. Here we demonstrate that both of these problems can be minimized by using small crystals, because the soak time required for a fragment hit to attain high occupancy depends approximately linearly on crystal size.

  19. A linear relationship between crystal size and fragment binding time observed crystallographically: implications for fragment library screening using acoustic droplet ejection.

    Science.gov (United States)

    Cole, Krystal; Roessler, Christian G; Mulé, Elizabeth A; Benson-Xu, Emma J; Mullen, Jeffrey D; Le, Benjamin A; Tieman, Alanna M; Birone, Claire; Brown, Maria; Hernandez, Jesus; Neff, Sherry; Williams, Daniel; Allaire, Marc; Orville, Allen M; Sweet, Robert M; Soares, Alexei S

    2014-01-01

    High throughput screening technologies such as acoustic droplet ejection (ADE) greatly increase the rate at which X-ray diffraction data can be acquired from crystals. One promising high throughput screening application of ADE is to rapidly combine protein crystals with fragment libraries. In this approach, each fragment soaks into a protein crystal either directly on data collection media or on a moving conveyor belt which then delivers the crystals to the X-ray beam. By simultaneously handling multiple crystals combined with fragment specimens, these techniques relax the automounter duty-cycle bottleneck that currently prevents optimal exploitation of third generation synchrotrons. Two factors limit the speed and scope of projects that are suitable for fragment screening using techniques such as ADE. Firstly, in applications where the high throughput screening apparatus is located inside the X-ray station (such as the conveyor belt system described above), the speed of data acquisition is limited by the time required for each fragment to soak into its protein crystal. Secondly, in applications where crystals are combined with fragments directly on data acquisition media (including both of the ADE methods described above), the maximum time that fragments have to soak into crystals is limited by evaporative dehydration of the protein crystals during the fragment soak. Here we demonstrate that both of these problems can be minimized by using small crystals, because the soak time required for a fragment hit to attain high occupancy depends approximately linearly on crystal size.

  20. Evaluation of excitation energy and spin in fission fragments using the statistical model, and the FIPPS project

    Directory of Open Access Journals (Sweden)

    Sage C.

    2013-03-01

    Full Text Available We review the statistical model and its application for the process of nuclear fission. The expressions for excitation energy and spin distributions for the individual fission fragments are given. We will finally emphasize the importance of measuring prompt gamma decay to further test the statistical model in nuclear fission with the FIPPS project.

  1. Large Time Asymptotics for a Continuous Coagulation-Fragmentation Model with Degenerate Size-Dependent Diffusion

    KAUST Repository

    Desvillettes, Laurent

    2010-01-01

    We study a continuous coagulation-fragmentation model with constant kernels for reacting polymers (see [M. Aizenman and T. Bak, Comm. Math. Phys., 65 (1979), pp. 203-230]). The polymers are set to diffuse within a smooth bounded one-dimensional domain with no-flux boundary conditions. In particular, we consider size-dependent diffusion coefficients, which may degenerate for small and large cluster-sizes. We prove that the entropy-entropy dissipation method applies directly in this inhomogeneous setting. We first show the necessary basic a priori estimates in dimension one, and second we show faster-than-polynomial convergence toward global equilibria for diffusion coefficients which vanish not faster than linearly for large sizes. This extends the previous results of [J.A. Carrillo, L. Desvillettes, and K. Fellner, Comm. Math. Phys., 278 (2008), pp. 433-451], which assumes that the diffusion coefficients are bounded below. © 2009 Society for Industrial and Applied Mathematics.

  2. Fragmentation and reliable size distributions of large ammonia and water clusters

    Science.gov (United States)

    Bobbert, C.; Schütte, S.; Steinbach, C.; Buck, U.

    2002-05-01

    The interaction of large ammonia and water clusters in the size range from ⟨n⟩ = 10 to 3400 with electrons is investigated in a reflectron time-of-flight mass spectrometer. The clusters are generated in adiabatic expansions through conical nozzles and are detected nearly fragmentation-free by single-photon ionization after they have been doped with one sodium atom. For ammonia, (1+1) resonance-enhanced two-photon ionization through the Ã state with v = 6 operates similarly. In this way reliable size distributions of the neutral clusters are obtained, which are analyzed in terms of a modified scaling law of the Hagena type [Surf. Sci. 106, 101 (1981)]. In contrast, using electron impact ionization, the clusters are strongly fragmented when the electron energy is varied between 150 and 1500 eV. The number of evaporated molecules depends on the cluster size, and the energy dependence follows that of the stopping power of the solid material. Therefore we attribute the operating mechanism to that which is also responsible for the electronic sputtering of solid matter. The yields, however, are orders of magnitude larger for clusters than for the solid. This result is a consequence of the finite dimensions of the clusters, which cannot accommodate the released energy.

  3. Raindrop size distribution variability estimated using ensemble statistics

    Directory of Open Access Journals (Sweden)

    C. R. Williams

    2009-02-01

    Full Text Available Before radar estimates of the raindrop size distribution (DSD) can be assimilated into numerical weather prediction models, the DSD estimate must also include an uncertainty estimate. Ensemble statistics are based on using the same observations as inputs into several different models, with the spread in the outputs providing an uncertainty estimate. In this study, Doppler velocity spectra from collocated vertically pointing profiling radars operating at 50 and 920 MHz were the input data for 42 different DSD retrieval models. The DSD retrieval models were perturbations of seven different DSD models (including exponential and gamma functions), two different inverse modeling methodologies (convolution or deconvolution), and three different cost functions (two spectral and one moment cost functions).

    Two rain events near Darwin, Australia, were analyzed in this study, producing 26 725 independent ensembles of mass-weighted mean raindrop diameter Dm and rain rate R. The mean and the standard deviation (indicated by the symbols <x> and σx) of Dm and R were estimated for each ensemble. For small ranges of <Dm> or <R>, histograms of σDm and σR were found to be asymmetric, which prevented Gaussian statistics from being used to describe the uncertainties. Therefore, the 10th, 50th, and 90th percentiles of σDm and σR were used to describe the uncertainties for small intervals of <Dm> or <R>. The smallest Dm uncertainty occurred for <Dm> between 0.8 and 1.8 mm, with the 90th and 50th percentiles being less than 0.15 and 0.11 mm, which correspond to relative errors of less than 20% and 15%, respectively. The uncertainty increased for smaller and larger <Dm> values. The uncertainty of R increased with <R>. While the 90th percentile
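
A minimal sketch of the ensemble bookkeeping described above, with synthetic numbers standing in for the 42 retrievals: for each ensemble, compute the mean and standard deviation, then summarize the asymmetric spread of the standard deviations with percentiles rather than Gaussian statistics.

```python
import numpy as np

rng = np.random.default_rng(7)

n_times, n_models = 500, 42
# Placeholder ensemble of Dm retrievals (mm): one row per observation time,
# one column per retrieval model; real values would come from the 42 DSD retrievals.
Dm_ensemble = (1.2
               + 0.15 * rng.standard_normal((n_times, n_models))
               + 0.05 * rng.standard_normal((n_times, 1)))  # shared per-time variability

Dm_mean = Dm_ensemble.mean(axis=1)            # <Dm> for each ensemble
Dm_std = Dm_ensemble.std(axis=1, ddof=1)      # sigma_Dm for each ensemble

# Because histograms of sigma_Dm are asymmetric, summarize them with percentiles.
p10, p50, p90 = np.percentile(Dm_std, [10, 50, 90])
print(f"sigma_Dm percentiles: 10%={p10:.3f}  50%={p50:.3f}  90%={p90:.3f} mm")
```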

  4. Are fragment-based quantum chemistry methods applicable to medium-sized water clusters?

    Science.gov (United States)

    Yuan, Dandan; Shen, Xiaoling; Li, Wei; Li, Shuhua

    2016-06-28

    Fragment-based quantum chemistry methods are either based on the many-body expansion or the inclusion-exclusion principle. To compare the applicability of these two categories of methods, we have systematically evaluated the performance of the generalized energy based fragmentation (GEBF) method (J. Phys. Chem. A, 2007, 111, 2193) and the electrostatically embedded many-body (EE-MB) method (J. Chem. Theory Comput., 2007, 3, 46) for medium-sized water clusters (H2O)n (n = 10, 20, 30). Our calculations demonstrate that the GEBF method provides uniformly accurate ground-state energies for 10 low-energy isomers of three water clusters under study at a series of theory levels, while the EE-MB method (with one water molecule as a fragment and without using the cutoff distance) shows a poor convergence for (H2O)20 and (H2O)30 when the basis set contains diffuse functions. Our analysis shows that the neglect of the basis set superposition error for each subsystem has little effect on the accuracy of the GEBF method, but leads to much less accurate results for the EE-MB method. The accuracy of the EE-MB method can be dramatically improved by using an appropriate cutoff distance and using two water molecules as a fragment. For (H2O)30, the average deviation of the EE-MB method truncated up to the three-body level calculated using this strategy (relative to the conventional energies) is about 0.003 hartree at the M06-2X/6-311++G** level, while the deviation of the GEBF method with a similar computational cost is less than 0.001 hartree. The GEBF method is demonstrated to be applicable for electronic structure calculations of water clusters at any basis set.

  5. System size dependence of intermediate mass fragments in heavy-ion collisions

    CERN Document Server

    Kaur, Sukhjit

    2011-01-01

    We simulate the central reactions of $^{20}$Ne+$^{20}$Ne, $^{40}$Ar+$^{45}$Sc, $^{58}$Ni+$^{58}$Ni, $^{86}$Kr+$^{93}$Nb, $^{129}$Xe+$^{118}$Sn, $^{86}$Kr+$^{197}$Au and $^{197}$Au+$^{197}$Au at different incident energies for different equations of state (EOS), binary cross sections and different widths of Gaussians. A rise and fall behaviour of the multiplicity of intermediate mass fragments (IMFs) is observed. The system size dependence of the peak center-of-mass energy E$_{c.m.}^{max}$ and the peak IMF multiplicity $\langle N_{IMF}\rangle^{max}$ is also studied, where it is observed that E$_{c.m.}^{max}$ follows a linear behaviour and $\langle N_{IMF}\rangle^{max}$ shows a power law dependence. A comparison between two clusterization methods, the minimum spanning tree and the minimum spanning tree method with binding energy check (MSTB), is also made. We find that the MSTB method reduces $\langle N_{IMF}\rangle^{max}$, especially in heavy systems. The power law dependence is also observed for fragments of different sizes at E$_{c.m.}^{max}$ and the power law parameter $\tau$ is foun...

  6. Statistical and off-equilibrium production of fragments in heavy ion collisions at intermediate energies; Production statistique et hors-equilibre de fragments dans les collisions d'ions lourdes aux energies intermediaires

    Energy Technology Data Exchange (ETDEWEB)

    Bocage, Frederic [Lab. de Physique Corpusculaire, Caen Univ., 14 - Caen (France)]

    1998-12-15

    The study of reaction products, fragments and light charged particles, emitted during heavy-ion collisions at intermediate energies has shown the dominant binary dissipative character of the reaction, which persists for almost all impact parameters. However, in comparison with this purely binary process, an excess of nuclear matter is observed in between the quasi-projectile and the quasi-target. To understand the mechanisms producing such an excess, this work studies more precisely the breakup into two fragments of the quasi-projectile formed in Xe+Sn, from 25 to 50 MeV/u, and Gd+C and Gd+U at 36 MeV/u. The data were obtained during the first INDRA experiment at GANIL. The angular distributions of the two fragments show the competition between statistical fission and non-equilibrated breakup of the quasi-projectile. In the second case, the two fragments are aligned along the separation axis of the two primary partners. The comparison of the fission directions and probabilities with statistical models allows us to measure the fission time, as well as the angular momentum, temperature and size of the fissioning residue. The relative velocities are compatible with Coulomb and thermal effects in the case of statistical fission and are found to be much higher for the breakup of a non-equilibrated quasi-projectile, which indicates that the projectile was deformed during the interaction with the target. Such deformations should be compared with dynamical calculations in order to constrain the viscosity of nuclear matter and the parameters of the nucleon-nucleon interaction. (author) 148 refs., 77 figs., 11 tabs.

  7. Effects of fragmentation on the seed predation and dispersal by rodents differ among species with different seed size.

    Science.gov (United States)

    Chen, Qiong; Tomlinson, Kyle W; Cao, Lin; Wang, Bo

    2017-07-07

    Fragmentation influences the population dynamics and community composition of vertebrate animals. Fragmentation effects on rodent species in forests may in turn affect seed predation and dispersal of many plant species. Previous studies have usually addressed this question by monitoring a single species, and their results are contradictory. Very few studies have discussed the fragmentation effect on rodent-seed interactions among tree species with different seed sizes, which can significantly influence rodent foraging preference and seed fate. Given that the fruiting periods of many coexisting plant species overlap, the changing foraging preference of rodents may substantially alter plant communities. In this study, we monitored the dispersal and predation by rodents of 9600 seeds, belonging to four Fagaceae species with great variation in seed size, in both the edge and interior areas of 12 tropical forest fragments ranging in area from 6.3 ha to 13872.9 ha in southwest China. The results showed that forest fragmentation altered the seed fates of all the species, but the intensity and even the direction of the fragmentation effect differed between species with large versus small seeds. For the seeds harvested, fragment size showed negative effects in forest interiors but positive effects at edges for the two large-seeded species, and little effect for the two small-seeded species. For the seeds removed, negative effects of fragment size existed only among the small-seeded species. These species-specific fragmentation effects on seed dispersal and predation may in turn translate into differences in the composition of regeneration across the fragmented forest. This article is protected by copyright. All rights reserved.

  8. Impact of forest fragment size on the population structure of three palm species (Arecaceae) in the Brazilian Atlantic rainforest.

    Science.gov (United States)

    Portela, Rita de Cássia Quitete; dos Santos, Flavio Antonio Maes

    2014-06-01

    The main threats to natural populations in terrestrial ecosystems have been widely recognized to be habitat fragmentation and the exploitation of forest products. In this study, we compared the population density and structure of three tropical palm species, Astrocaryum aculeatissimum, Euterpe edulis and Geonoma schottiana. For this, we selected five forest fragments of different sizes (3 500 ha, 2 400 ha, 57 ha, 21 ha and 19 ha) where palms were censused in nine 30 × 30 m plots. We tracked the palms' survival from 2005 to 2007, and recorded all new individuals encountered. Each individual was assigned to one of five ontogenetic stages: seedling, infant, juvenile, immature and reproductive. The demographic structure of each palm species was analyzed and compared by a generalized linear model (GLM). The analysis was performed per palm species. The forest fragment area and the year of observation were explanatory variables, and the proportion of individuals in each ontogenetic class and palm density were response variables. The total number of individuals (from seedlings to reproductives, of all species) monitored was 6 450 in 2005, 7 268 in 2006, and 8 664 in 2007. The densities of two palm species were not influenced by the size of the fragment, but the population density of A. aculeatissimum was dependent on the size of the fragment: there were more individuals in the larger than in the smaller forest fragments. The population structure of A. aculeatissimum, E. edulis, and G. schottiana was not altered in the smaller fragments, except for the infants of G. schottiana. The main point to be drawn from this study is that the responses of density and population structure seem not to be dependent on fragment size, except for one species, which was more abundant in the larger fragments.

  9. Yield scaling, size hierarchy and fluctuations of observables in fragmentation of excited heavy nuclei

    CERN Document Server

    Neindre, N Le; Wieleczko, J P; Borderie, B; Gulminelli, F; Rivet, M F; Bougault, R; Chbihi, A; Dayras, R; Frankland, J D; Galíchet, E; Guinet, D; Lautesse, P; López, O; Lukasik, J; Mercier, D; Moisan, J; Pârlog, M; Rosato, E; Roy, R; Schwarz, C; Sfienti, C; Tamain, B; Trautmann, W; Trzcinski, A; Turzó, K; Vient, E; Vigilante, M; Zwieglinski, B

    2007-01-01

    Multifragmentation properties measured with INDRA are studied for single sources produced in Xe+Sn reactions in the incident energy range 32-50 A MeV and quasiprojectiles from Au+Au collisions at 80 A MeV. A comparison for both types of sources is presented concerning Fisher scaling, Zipf law, fragment size and fluctuation observables. A Fisher scaling is observed for all the data. The pseudo-critical energies extracted from the Fisher scaling are consistent between Xe+Sn central collisions and Au quasi-projectiles. In the latter case it also corresponds to the energy region at which fluctuations are maximal. The critical energies deduced from the Zipf analysis are higher than those from the Fisher analysis.

  10. Coagulation and Fragmentation in molecular clouds. II. The opacity of the dust aggregate size distribution

    CERN Document Server

    Ormel, C W; Tielens, A G G M; Dominik, C; Paszun, D

    2011-01-01

    The dust size distribution in molecular clouds can be strongly affected by ice-mantle formation and (subsequent) grain coagulation. Following previous work where the dust size distribution has been calculated from a state-of-the-art collision model for dust aggregates that involves both coagulation and fragmentation (Paper I), the corresponding opacities are presented in this study. The opacities are calculated by applying the effective medium theory, assuming that the dust aggregates are a mix of 0.1 μm silicate and graphite grains and vacuum. In particular, we explore how the coagulation affects the near-IR opacities and the opacity in the 9.7 μm silicate feature. We find that as dust aggregates grow to μm sizes, both the near-IR color excess and the opacity in the 9.7 μm feature increase. Despite their coagulation, porous aggregates help to prolong the presence of the 9.7 μm feature. We find that the ratio between the opacity in the silicate feature and the near-IR color excess becomes lowe...

  11. Multi-Scale Particle Size Distributions of Mars, Moon and Itokawa based on a time-maturation dependent fragmentation model

    Science.gov (United States)

    Charalambous, C. A.; Pike, W. T.

    2013-12-01

    We present the development of a soil evolution framework and multiscale modelling of the surfaces of Mars, the Moon and Itokawa, thus providing an atlas of extra-terrestrial Particle Size Distributions (PSD). These PSDs are based on a tailoring method which interconnects several datasets from different sites captured by the various missions. The final integrated product is then justified through a soil evolution analysis model mathematically constructed from fundamental physical principles (Charalambous, 2013). The construction of the PSD takes into account the macroscale fresh primary impacts and their products, the mesoscale distributions obtained from the in-situ data of surface missions (Golombek et al., 1997, 2012), and finally the microscopic-scale distributions provided by Curiosity and the Phoenix Lander (Pike, 2011). The distribution naturally extends to the scales at which data do not currently exist, owing to the lack of scientific instruments capturing the particle populations at those scales. The extension is based on the model distribution (Charalambous, 2013), which takes as parameters known values of material-specific probabilities of fragmentation and grinding limits. Additionally, the establishment of a closed-form statistical distribution provides a quantitative description of the soil's structure. Consequently, reverse engineering of the model distribution allows the synthesis of soil that faithfully represents the particle population at the studied sites (Charalambous, 2011). Such a representation essentially delivers a virtual soil environment to work with for numerous applications. A specific application demonstrated here is the information that can be directly extracted on the probability of successful drilling as a function of distance, in an effort to aid the HP3 instrument of the 2016 InSight mission to Mars. Pike, W. T., et al. "Quantification of the dry history of the Martian soil inferred from in situ microscopy

  12. Behavioral response of the coachwhip (Masticophis flagellum) to habitat fragment size and isolation in an urban landscape

    Science.gov (United States)

    Mitrovich, Milan J.; Diffendorfer, Jay E.; Fisher, Robert N.

    2009-01-01

    Habitat fragmentation is a significant threat to biodiversity worldwide. Habitat loss and the isolation of habitat fragments disrupt biological communities, accelerate the extinction of populations, and often lead to the alteration of behavioral patterns typical of individuals in large, contiguous natural areas. We used radio-telemetry to study the space-use behavior of the Coachwhip, a larger-bodied, wide-ranging snake species threatened by habitat fragmentation, in fragmented and contiguous areas of coastal southern California. We tracked 24 individuals at three sites over two years. Movement patterns of Coachwhips changed in habitat fragments. As area available to the snakes was reduced, individuals faced increased crowding, had smaller home-range sizes, tolerated greater home-range overlap, and showed more concentrated movement activity and convoluted movement pathways. The behavioral response shown by Coachwhips suggests, on a regional level, area-effects alone cannot explain observed extinctions on habitat fragments but, instead, suggests changes in habitat configuration are more likely to explain the decline of this species. Ultimately, if "edge-exposure" is a common cause of decline, then isolated fragments, appropriately buffered to reduce emigration and edge effects, may support viable populations of fragmentation-sensitive species.

  13. Summary statistics for size over space and time.

    Science.gov (United States)

    Gorea, Andrei; Belkoura, Seddik; Solomon, Joshua A

    2014-08-25

    A number of studies have investigated how the visual system extracts the average feature-value of an ensemble of simultaneously or sequentially delivered stimuli. In this study we model these two processes within the unitary framework of linear systems theory. The specific feature value used in this investigation is size, which we define as the logarithm of a circle's diameter. Within each ensemble, sizes were drawn from a normal distribution. Average size discrimination was measured using ensembles of one and eight circles. These circles were presented simultaneously (display times: 13-427 ms), one at a time, or eight at a time (temporal-frequencies: 1.2-38 Hz). Thresholds for eight-item ensembles were lower than thresholds for one-item ensembles. Thresholds decreased by a factor of 1.3 for a 3,200% increase in display time, and decreased by the same factor for a 3,200% decrease in temporal frequency. Modeling and simulations show that the data are consistent with one readout of three to four items every 210 ms.

  14. Analysis of proton-induced fragment production cross sections by the Quantum Molecular Dynamics plus Statistical Decay Model

    Energy Technology Data Exchange (ETDEWEB)

    Chiba, Satoshi; Iwamoto, Osamu; Fukahori, Tokio; Niita, Koji; Maruyama, Toshiki; Maruyama, Tomoyuki; Iwamoto, Akira [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]

    1997-03-01

    The production cross sections of various fragments from proton-induced reactions on ⁵⁶Fe and ²⁷Al have been analyzed by the Quantum Molecular Dynamics (QMD) plus Statistical Decay Model (SDM). It was found that the mass and charge distributions calculated with and without the statistical decay have very different shapes. These results also depend strongly on the impact parameter, showing the importance of the dynamical treatment as realized by the QMD approach. The calculated results were compared with experimental data in the energy region from 50 MeV to 5 GeV. The QMD+SDM calculation could reproduce the production cross sections of the light clusters and intermediate-mass to heavy fragments with good accuracy. The production cross section of ⁷Be was, however, underpredicted by approximately 2 orders of magnitude, showing the necessity of another reaction mechanism not taken into account in the present model. (author)

  15. M3C: A Computational Approach To Describe Statistical Fragmentation of Excited Molecules and Clusters.

    Science.gov (United States)

    Aguirre, Néstor F; Díaz-Tendero, Sergio; Hervieux, Paul-Antoine; Alcamí, Manuel; Martín, Fernando

    2017-02-07

    The Microcanonical Metropolis Monte Carlo method, based on a random sampling of the density of states, is revisited for the study of molecular fragmentation in the gas phase (isolated molecules, atomic and molecular clusters, complex biomolecules, etc.). A random walk or uniform random sampling in the configurational space (atomic positions) and a uniform random sampling of the relative orientation, vibrational energy, and chemical composition of the fragments is used to estimate the density of states of the system, which is continuously updated as the random sampling populates individual states. The validity and usefulness of the method is demonstrated by applying it to evaluate the caloric curve of a weakly bound rare gas cluster (Ar13), to interpret the fragmentation of highly excited small neutral and singly positively charged carbon clusters (Cn, n = 5,7,9 and Cn(+), n = 4,5) and to simulate the mass spectrum of the acetylene molecule (C2H2).

  16. The evolutionary consequences of habitat fragmentation: Body morphology and coloration differentiation among brook trout populations of varying size.

    Science.gov (United States)

    Zastavniouk, Carol; Weir, Laura K; Fraser, Dylan J

    2017-09-01

    A reduction in population size due to habitat fragmentation can alter the relative roles of different evolutionary mechanisms in phenotypic trait differentiation. While deterministic (selection) and stochastic (genetic drift) mechanisms are expected to affect trait evolution, genetic drift may be more important than selection in small populations. We examined relationships between mature adult traits and ecological (abiotic and biotic) variables among 14 populations of brook trout. These naturally fragmented populations have shared ancestry but currently exhibit considerable variability in habitat characteristics and population size (49 habitat variation or operational sex ratio than to population size, suggesting that selection may overcome genetic drift at small population size. Phenotype-environment associations were also stronger in females than males, suggesting that natural selection due to abiotic conditions may act more strongly on females than males. Our results suggest that natural and sexual-selective pressures on phenotypic traits change during the process of habitat fragmentation, and that these changes are largely contingent upon existing habitat conditions within isolated fragments. Our study provides an improved understanding of the ecological and evolutionary consequences of habitat fragmentation and lends insight into the ability of some small populations to respond to selection and environmental change.

  17. A generalized statistical model for the size distribution of wealth

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and conclude an excellent agreement with the data that is superior to any other model already known in the literature.

  18. Fragmentation of Millimeter-Size Hypervelocity Projectiles on Combined Mesh-Plate Bumpers

    Directory of Open Access Journals (Sweden)

    Aleksandr Cherniaev

    2017-01-01

    Full Text Available This numerical study evaluates the concept of a combined mesh-plate bumper as a shielding system protecting unmanned spacecraft from small (1 mm) orbital debris impacts. Two-component bumpers consisting of an external layer of woven mesh (aluminum or steel) directly applied to the surface of the aluminum plate are considered. Results of numerical modeling with a projectile velocity of 7 km/s indicate that, in comparison to the steel mesh-combined bumper, the combination of aluminum mesh and aluminum plate provides better fragmentation of small hypervelocity projectiles. At the same time, none of the combined mesh/plate bumpers provide a significant increase of ballistic properties as compared to an aluminum plate bumper. This indicates that the positive results reported in the literature for bumpers with metallic meshes and large projectiles are not scalable down to millimeter-sized particles. Based on this investigation's results, a possible modification of the combined mesh/plate bumper is proposed for future study.

  19. Characteristics of fragment size distribution of ductile materials fragmentized under high strain-rate tension

    Institute of Scientific and Technical Information of China (English)

    郑宇轩; 陈磊; 胡时胜; 周风华

    2013-01-01

    The finite element method has been used to simulate the fragmentation of ductile metallic rings undergoing high-velocity expansion, yielding sample sets of fragments at different initial expansion velocities. Statistical analysis of the fragment sizes shows that: (1) regardless of the initial expansion velocity, the normalized fragment size distributions are similar and can collectively be described by a Weibull distribution with an initial threshold; approximately, this distribution can be further simplified to a Rayleigh distribution, which is the special case in which the Weibull shape parameter equals 2; (2) the cumulative distribution of the fragment sizes exhibits a step-like nature, i.e. the fragment sizes appear to be "quantized". On the basis of these findings, a Monte-Carlo model is established to describe the origin of this quantization: fracture points originate from necking points, the spacing between neckings follows a continuous Weibull distribution, and the size of a fragment is the sum of a random number of necking spacings. Probabilistic simulations show that, unless the early necking-spacing distribution is very wide, the discreteness of the selection necessarily causes the fragment size distribution to exhibit a degree of quantization. Explosive expansion fragmentation experiments were carried out on L04 commercially pure aluminium and oxygen-free copper specimens; the size distributions of the recovered fragments are in general agreement with the theoretical analysis.
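
The Monte-Carlo construction described in the abstract can be sketched as follows (parameter values invented): necking spacings are drawn from a narrow Weibull distribution, a random subset of necking sites fractures, and each fragment length is the sum of the spacings between consecutive fracture sites. With a narrow spacing distribution, the resulting cumulative distribution is step-like, i.e. "quantized".

```python
import numpy as np

rng = np.random.default_rng(3)

n_neck = 100_000
shape, scale = 8.0, 1.0              # narrow Weibull distribution of necking spacings
spacing = scale * rng.weibull(shape, n_neck)

# Each necking site fractures with some probability; a fragment spans the
# spacings between two consecutive fracture sites.
p_fracture = 0.35
breaks = rng.random(n_neck) < p_fracture

fragment_sizes = []
current = 0.0
for s, b in zip(spacing, breaks):
    current += s
    if b:                            # fracture here: close the current fragment
        fragment_sizes.append(current)
        current = 0.0

fragment_sizes = np.array(fragment_sizes)
# With a narrow spacing distribution, fragment sizes cluster near integer
# multiples of the mean spacing, producing a step-like cumulative distribution.
print("mean fragment size:", fragment_sizes.mean(),
      " mean spacing:", spacing.mean())
```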

  20. On the size distribution of collision fragments of NLC dust particles and their relevance to meteoric smoke particles

    Science.gov (United States)

    Havnes, O.; Gumbel, J.; Antonsen, T.; Hedin, J.; La Hoz, C.

    2014-10-01

    We present results from a new dust probe, MUDD, on the PHOCUS payload, which was launched in July 2011. In the interior of MUDD, all incoming NLC/PMSE icy dust particles collide, at an impact angle of ~70° to the surface normal, with a grid constructed so that no dust particles can directly hit the bottom plate of the probe. Only collision fragments continue down towards the bottom plate. We determine an energy distribution of the charged fragments by applying a variable electric field between the impact grid and the bottom plate of MUDD. We find that ~30% of the charged fragments have kinetic energies below 10 eV, ~20% have energies between 10 and 20 eV, and ~50% have energies above 20 eV. Converting these kinetic-energy limits to radii, for ice or meteoric smoke particles (MSP), depends on many assumptions, the most crucial being the fragment velocity. We find, however, that the charged fragments most probably have sizes in the range of 1 to 2 nm if they are MSP, and slightly larger if they are ice particles. The observed high charging fraction and the dominance of fragment sizes below a few nm make it very unlikely that the fragments consist mainly of ice; they must be predominantly MSP, as predicted by Havnes and Næsheim (2007) and recently observed by Hervig et al. (2012). The MUDD results indicate that MSP are embedded in NLC/PMSE ice particles with a minimum volume filling factor of ~0.05% in the unlikely case that all embedded MSP are released and charged. A volume filling factor of a few percent (Hervig et al., 2012) can easily be reached if only ~10% of the MSP are released and their charging probability is ~0.1.

  1. The impact of homogeniser speed, dispersing aggregate size and centrifugation on particle size analyses of pork as a measure of myofibrillar fragmentation.

    Science.gov (United States)

    Ngapo, T M; Vachon, L

    2017-11-01

    Particle size analysis has been proposed as a measure of myofibrillar fragmentation resulting from post-mortem proteolysis in meat. The aim of this study was to examine the effect of homogenisation speed, dispersing aggregate size and centrifugation on particle size characteristics of pork loin. Particle size characteristics were significantly (P≤0.023) greater for samples aged 2 d than 8 d for all but the 80 and 90% quantiles. Differentiation with ageing was only achieved when homogenised at 11,000 rpm using the smaller dispersing aggregate (9 vs. 13 mm rotor diameter). Centrifugation had no effect on particle size characteristics. Significant correlations with MFI were observed (r=-0.40 to -0.81), but the results make particle size analysis unlikely to succeed as a method of tenderness classification. Rather, its value lies in the detailed profiles of particle size distributions with meat ageing. Copyright © 2017. Published by Elsevier Ltd.

  2. Effect Size as the Essential Statistic in Developing Methods for mTBI Diagnosis.

    Science.gov (United States)

    Gibson, Douglas Brandt

    2015-01-01

    The descriptive statistic known as "effect size" measures the distinguishability of two sets of data. Distinguishability is at the core of diagnosis. This article points out the importance of effect size in the development of effective diagnostics for mild traumatic brain injury (mTBI) and the applicability of the effect size statistic in comparing diagnostic efficiency across the main proposed TBI diagnostic methods: psychological, physiological, biochemical, and radiologic. Comparing diagnostic approaches is difficult because different researchers in different fields have different approaches to measuring efficacy. Converting diverse measures to effect sizes, as is done in meta-analysis, is a relatively easy way to make studies comparable.
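    As a concrete illustration of the "distinguishability" idea, the sketch below computes a standard effect size (Cohen's d with a pooled standard deviation) for two hypothetical sets of diagnostic measurements. The numbers are invented; the point is only that scores from very different diagnostic modalities can be put on the same scale.

    ```python
    import numpy as np

    def cohens_d(group_a, group_b):
        """Cohen's d with a pooled standard deviation (classical definition)."""
        a, b = np.asarray(group_a, float), np.asarray(group_b, float)
        na, nb = a.size, b.size
        pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    # Hypothetical scores for injured vs. control groups (illustrative only)
    rng = np.random.default_rng(1)
    injured = rng.normal(95, 10, 40)     # e.g. a reaction-time based measure
    controls = rng.normal(100, 10, 40)
    print(f"Cohen's d = {cohens_d(injured, controls):.2f}")
    ```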

  3. Effects of logging, hunting, and forest fragment size on physiological stress levels of two sympatric ateline primates in Colombia

    Science.gov (United States)

    Rimbach, Rebecca; Link, Andrés; Heistermann, Michael; Gómez-Posada, Carolina; Galvis, Nelson; Heymann, Eckhard W.

    2013-01-01

    Habitat fragmentation and anthropogenic disturbances are of major concern to the conservation of endangered species because of their potentially negative impact on animal populations. Both processes can impose physiological stress (i.e. increased glucocorticoid output) on animals, and chronically elevated stress levels can have detrimental effects on the long-term viability of animal populations. Here, we investigated the effect of fragment size and human impact (logging and hunting pressure) on glucocorticoid levels of two sympatric Neotropical primates, the red howler monkey (Alouatta seniculus) and the critically endangered brown spider monkey (Ateles hybridus). These two species have been reported to contrast strongly in their ability to cope with anthropogenic disturbances. We collected faecal samples from eight spider monkey groups and 31 howler monkey groups, living in seven and 10 different forest fragments in Colombia, respectively. We measured faecal glucocorticoid metabolite (FGCM) levels in both species using previously validated methods. Surprisingly, fragment size did not influence FGCM levels in either species. Spider monkeys showed elevated FGCMs in fragments with the highest level of human impact, whereas we did not find this effect in howler monkeys. This suggests that the two species differ in their physiological responsiveness to anthropogenic changes, further emphasizing why brown spider monkeys are at higher extinction risk than red howler monkeys. If these anthropogenic disturbances persist in the long term, elevated FGCM levels can potentially lead to a state of chronic stress, which might limit the future viability of populations. We propose that FGCM measurements should be used as a tool to monitor populations living in disturbed areas and to assess the success of conservation strategies, such as corridors connecting forest fragments. PMID:27293615

  4. The Surprising Power of Statistical Learning: When Fragment Knowledge Leads to False Memories of Unheard Words

    Science.gov (United States)

    Endress, Ansgar D.; Mehler, Jacques

    2009-01-01

    Word-segmentation, that is, the extraction of words from fluent speech, is one of the first problems language learners have to master. It is generally believed that statistical processes, in particular those tracking "transitional probabilities" (TPs), are important to word-segmentation. However, there is evidence that word forms are stored in…

  5. Distinguishing between the bone fragments of medium-sized mammals and children. A histological identification method for archaeology.

    Science.gov (United States)

    Cuijpers, Saddha A G F M

    2009-06-01

    In archaeology, it is not always possible to identify bone fragments. A novel approach was chosen to assess the potential of histology as an identification tool. Instead of studying a few bones of different categories from many species, this study concentrated on the diaphyses of long bones in four species of comparable size that are relevant to archaeology (young humans, pigs, sheep and goats), to broaden the insight into variations in diaphyseal bone structure within and between these species. A general difference in the primary bone structure was found between children older than one year and the three medium-sized mammals, namely lamellar vs. fibro-lamellar primary bone. Although the diaphyseal bone structure of children below the age of one year also showed (developing) fibro-lamellar bone, its composition was distinct from that of the medium-sized mammals. A difference in the secondary bone structure was also observed. Connecting (Volkmann's) canals, giving the secondary bone a reticular aspect, were seen in the medium-sized mammals but not in the young human long bones. To confirm the validity and applicability of these observed histological differences, a blind test was conducted on 14 diaphyseal fragments of identified long bones from archaeological sites. The results were very promising: all the bone fragments were correctly attributed using the difference in primary bone structure, even when the bone was severely degraded.

  6. Design of chimeric proteins by combination of subdomain-sized fragments.

    Science.gov (United States)

    Rico, José Arcadio Farías; Höcker, Birte

    2013-01-01

    Hybrid proteins or chimeras are generated by recombination of protein fragments. In the course of evolution, this mechanism has led to major diversification of protein folds and their functionalities. Similarly, protein engineers have taken advantage of this attractive strategy to build new proteins. Methods that use homologous recombination have been developed to (semi) randomly create chimeras from which the best can be selected. We wanted to recombine very divergent or even unrelated fragments, which is not possible with these methods. Consequently, based on the observation that nature evolves new proteins also through illegitimate recombination, we developed a strategy to design chimeras using protein fragments from different folds. For this approach, we employ detailed structure comparisons, and based on structural similarities, we choose the fragments used for recombination. Model building and minimization can be used to assess the design, and further optimization can be performed using established computational design methodologies. Here, we outline a general approach to rational protein chimera design based on our experience, and provide considerations for the selection of the fragments, the evaluation, and possible redesign of the constructs.

  7. Generalized eta and omega squared statistics: measures of effect size for some common research designs.

    Science.gov (United States)

    Olejnik, Stephen; Algina, James

    2003-12-01

    The editorial policies of several prominent educational and psychological journals require that researchers report some measure of effect size along with tests for statistical significance. In analysis of variance contexts, this requirement might be met by using eta squared or omega squared statistics. Current procedures for computing these measures of effect often do not consider the effect that design features of the study have on the size of these statistics. Because research-design features can have a large effect on the estimated proportion of explained variance, the use of partial eta or omega squared can be misleading. The present article provides formulas for computing generalized eta and omega squared statistics, which provide estimates of effect size that are comparable across a variety of research designs.
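    For the simplest design, a one-way between-subjects ANOVA, eta squared and omega squared can be computed directly from the sums of squares, as in the sketch below. This is only the baseline case; the generalized statistics proposed by Olejnik and Algina adjust these formulas for other research designs and are not reproduced here.

    ```python
    import numpy as np

    def eta_omega_squared(*groups):
        """Eta-squared and omega-squared for a one-way, between-subjects ANOVA."""
        groups = [np.asarray(g, float) for g in groups]
        all_obs = np.concatenate(groups)
        grand_mean = all_obs.mean()
        ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
        ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
        ss_total = ss_between + ss_within
        df_between = len(groups) - 1
        df_within = all_obs.size - len(groups)
        ms_within = ss_within / df_within
        eta2 = ss_between / ss_total
        omega2 = (ss_between - df_between * ms_within) / (ss_total + ms_within)
        return eta2, omega2

    rng = np.random.default_rng(2)
    g1, g2, g3 = rng.normal(0, 1, 30), rng.normal(0.4, 1, 30), rng.normal(0.8, 1, 30)
    print("eta^2 = %.3f, omega^2 = %.3f" % eta_omega_squared(g1, g2, g3))
    ```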

  8. Cluster Size Statistic and Cluster Mass Statistic: Two Novel Methods for Identifying Changes in Functional Connectivity Between Groups or Conditions

    Science.gov (United States)

    Ing, Alex; Schwarzbauer, Christian

    2014-01-01

    Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster based methods – the cluster size statistic (CSS) and cluster mass statistic (CMS) – are introduced to control the family wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster based methods used in conventional task based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operator characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73%N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family wise error correction carried out at the individual connection level exhibited no significant changes in connectivity. PMID:24906136
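    The permutation logic behind the cluster size and cluster mass statistics can be illustrated with a much-simplified one-dimensional analogue: threshold the per-connection test statistics, find runs of supra-threshold connections, record the largest run size and mass, and build the null distribution by permuting condition labels. The sketch below uses toy data and a 1-D "ordering" of connections, which is an assumption made purely for brevity; the real method clusters within the connectome.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    def max_cluster_stats(tvals, thresh):
        """Largest cluster size and mass among supra-threshold runs (1-D analogue)."""
        best_size = best_mass = 0.0
        size = mass = 0.0
        for t in tvals:
            if t > thresh:
                size += 1
                mass += t
                best_size, best_mass = max(best_size, size), max(best_mass, mass)
            else:
                size = mass = 0.0
        return best_size, best_mass

    # Toy data: 200 "connections", 12 subjects per condition, signal in 20 of them
    n_conn, n_sub = 200, 12
    cond_a = rng.normal(0, 1, (n_sub, n_conn))
    cond_b = rng.normal(0, 1, (n_sub, n_conn))
    cond_b[:, 80:100] += 1.0

    tvals = stats.ttest_ind(cond_b, cond_a, axis=0).statistic
    obs_size, obs_mass = max_cluster_stats(tvals, thresh=2.0)

    # Permutation null: shuffle condition labels, keep the maximum per permutation
    data = np.vstack([cond_a, cond_b])
    null_size = []
    for _ in range(500):
        perm = rng.permutation(2 * n_sub)
        t_perm = stats.ttest_ind(data[perm[n_sub:]], data[perm[:n_sub]], axis=0).statistic
        null_size.append(max_cluster_stats(t_perm, 2.0)[0])

    p_size = (np.sum(np.array(null_size) >= obs_size) + 1) / (len(null_size) + 1)
    print(f"observed max cluster size = {obs_size:.0f}, permutation p = {p_size:.3f}")
    ```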

  10. Improving the Process-Variation Tolerance of Digital Circuits Using Gate Sizing and Statistical Techniques

    CERN Document Server

    Neiroukh, Osama

    2011-01-01

    A new approach for enhancing the process-variation tolerance of digital circuits is described. We extend recent advances in statistical timing analysis into an optimization framework. Our objective is to reduce the performance variance of a technology-mapped circuit where delays across elements are represented by random variables which capture the manufacturing variations. We introduce the notion of statistical critical paths, which account for both means and variances of performance variation. An optimization engine is used to size gates with a goal of reducing the timing variance along the statistical critical paths. We apply a pair of nested statistical analysis methods deploying a slower more accurate approach for tracking statistical critical paths and a fast engine for evaluation of gate size assignments. We derive a new approximation for the max operation on random variables which is deployed for the faster inner engine. Circuit optimization is carried out using a gain-based algorithm that terminates w...

  11. Reversible phospholipid nanogels for deoxyribonucleic acid fragment size determinations up to 1500 base pairs and integrated sample stacking.

    Science.gov (United States)

    Durney, Brandon C; Bachert, Beth A; Sloane, Hillary S; Lukomski, Slawomir; Landers, James P; Holland, Lisa A

    2015-06-23

    Phospholipid additives are a cost-effective medium to separate deoxyribonucleic acid (DNA) fragments and possess a thermally-responsive viscosity. This provides a mechanism to easily create and replace a highly viscous nanogel in a narrow bore capillary with only a 10°C change in temperature. Preparations composed of dimyristoyl-sn-glycero-3-phosphocholine (DMPC) and 1,2-dihexanoyl-sn-glycero-3-phosphocholine (DHPC) self-assemble, forming structures such as nanodisks and wormlike micelles. Factors that influence the morphology of a particular DMPC-DHPC preparation include the concentration of lipid in solution, the temperature, and the ratio of DMPC and DHPC. It has previously been established that an aqueous solution containing 10% phospholipid with a ratio of [DMPC]/[DHPC]=2.5 separates DNA fragments with nearly single base resolution for DNA fragments up to 500 base pairs in length, but beyond this size the resolution decreases dramatically. A new DMPC-DHPC medium is developed to effectively separate and size DNA fragments up to 1500 base pairs by decreasing the total lipid concentration to 2.5%. A 2.5% phospholipid nanogel generates a resolution of 1% of the DNA fragment size up to 1500 base pairs. This increase in the upper size limit is accomplished using commercially available phospholipids at an even lower material cost than is achieved with the 10% preparation. The separation additive is used to evaluate size markers ranging between 200 and 1500 base pairs in order to distinguish invasive strains of Streptococcus pyogenes and Aspergillus species by harnessing differences in gene sequences of collagen-like proteins in these organisms. For the first time, a reversible stacking gel is integrated in a capillary sieving separation by utilizing the thermally-responsive viscosity of these self-assembled phospholipid preparations. A discontinuous matrix is created that is composed of a cartridge of highly viscous phospholipid assimilated into a separation matrix

  12. Linear induction of DNA double-strand breakage with X-ray dose, as determined from DNA fragment size distribution

    Energy Technology Data Exchange (ETDEWEB)

    Erixon, K.; Cedervall, B. [Karolinksa Institutet, Stockholm (Sweden)

    1995-05-01

    Pulsed-field gel electrophoresis has been applied to separate DNA from mouse L1210 cells exposed to X-ray doses of 1 to 50 Gy. Simultaneous separation of marker chromosomes in the range 0.1 to 12.6 Mbp allowed calculation of the size distribution of the radiation-induced fragments. The distribution was consistent with a random induction of double-strand breaks (DSBs). A theoretical relationship between the size distribution of such fragments and the average number of induced breaks was used to calculate the yield and dose response. The DNA distribution was determined by both radiolabeling and fluorescence staining. Two independent methods were used to evaluate the radiation-induced yield of DSBs, both assuming that all DNA is broken at random. In the first method we compared the theoretical and experimental fraction of DNA that is below a given size limit. By this method we estimated the yield to be 0.006-0.007 DSB/Gy per million base pairs using the radiolabel and 0.004-0.008 DSB/Gy per million base pairs by fluorescence staining. The dose response was linear in both cases. In the second method we looked only at the size distribution in the resolving part of the gel and compared it to the theoretical distribution. By this method a value of approximately 0.012 DSB/Gy/Mbp was found, using fluorescence as a measure of the DNA distribution. In a normal diploid mammalian genome of about 6000 Mbp, this is equivalent to a yield of 25-50 DSBs/Gy or 70 DSBs/Gy, respectively. The second approach, which looks only at the smaller fragments, may overestimate the yield, while the first approach suffers from uncertainties about the fraction of DNA irreversibly trapped in the well. The assay has the capacity to detect a dose of less than 1 Gy. 58 refs., 10 figs.
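    The random-breakage assumption used above is simple to reproduce by Monte-Carlo simulation: place breaks uniformly along each chromosome at a fixed rate per Gy per Mbp and compute the fraction of DNA in fragments below a chosen size cutoff. The sketch below is not the authors' calculation; the genome layout, the yield of 0.006 DSB/Gy/Mbp and the 5.7 Mbp cutoff are assumed values of the same order as those quoted in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def fraction_below(chrom_sizes_mbp, breaks_per_mbp, cutoff_mbp):
        """Monte-Carlo random-breakage model: fraction of DNA in fragments < cutoff."""
        small = total = 0.0
        for size in chrom_sizes_mbp:
            n_breaks = rng.poisson(breaks_per_mbp * size)
            cuts = np.sort(rng.uniform(0.0, size, n_breaks))
            fragments = np.diff(np.concatenate(([0.0], cuts, [size])))
            small += fragments[fragments < cutoff_mbp].sum()
            total += size
        return small / total

    # Hypothetical genome of 40 chromosomes x 150 Mbp, assumed yield 0.006 DSB/Gy/Mbp
    chroms = [150.0] * 40
    for dose in (1, 5, 10, 25, 50):
        frac = fraction_below(chroms, 0.006 * dose, cutoff_mbp=5.7)
        print(f"{dose:3d} Gy: fraction of DNA below 5.7 Mbp = {frac:.4f}")
    ```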

  13. Fragments analysis of Marajoara pubic covers using a portable system of X-ray fluorescence and multivariate statistics

    Energy Technology Data Exchange (ETDEWEB)

    Freitas, Renato [Instituto Federal de Educacao, Ciencia e Tecnologia do Rio de Janeiro (CPAR/IFRJ), RJ (Brazil). Curso de Licenciatura em Matematica; Calza, Cristiane Ferreira; Lopes, Ricardo Tadeu [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), RJ (Brazil); Rabello, Angela; Lima, Tania [Museu Nacional (MN/UFRJ), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    Full text: In this work, the elemental composition of 102 fragments of Marajoara pubic covers, belonging to the National Museum collection, was characterized using EDXRF and multivariate statistical analysis. The objective was to identify possible groups of samples with similar characteristics; this information will be useful in developing a systematic classification of these artifacts. Provenance studies of ancient ceramics are based on the assumption that pottery produced from a specific clay will present a similar chemical composition, distinguishing it from pottery produced from a different clay. In this way, the pottery is assigned to particular production groups, which are then correlated with their respective origins. EDXRF measurements were carried out with a portable system, developed in the Nuclear Instrumentation Laboratory, consisting of an X-ray tube Oxford TF3005 with tungsten (W) anode, operating at 25 kV and 100 {mu}A, and a Si-PIN XR-100CR detector from Amptek. In each of the 102 fragments, six points were analyzed (three on the front and three on the reverse) with an acquisition time of 600 s and a beam collimation of 2 mm. The spectra were processed and analyzed using the software QXAS-AXIL from IAEA. PCA was applied to the XRF results, revealing a clear cluster separation of the samples. (author)

  14. Sample-Size Planning for More Accurate Statistical Power: A Method Adjusting Sample Effect Sizes for Publication Bias and Uncertainty.

    Science.gov (United States)

    Anderson, Samantha F; Kelley, Ken; Maxwell, Scott E

    2017-09-01

    The sample size necessary to obtain a desired level of statistical power depends in part on the population value of the effect size, which is, by definition, unknown. A common approach to sample-size planning uses the sample effect size from a prior study as an estimate of the population value of the effect to be detected in the future study. Although this strategy is intuitively appealing, effect-size estimates, taken at face value, are typically not accurate estimates of the population effect size because of publication bias and uncertainty. We show that the use of this approach often results in underpowered studies, sometimes to an alarming degree. We present an alternative approach that adjusts sample effect sizes for bias and uncertainty, and we demonstrate its effectiveness for several experimental designs. Furthermore, we discuss an open-source R package, BUCSS, and user-friendly Web applications that we have made available to researchers so that they can easily implement our suggested methods.
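    The "intuitively appealing" but biased strategy the abstract criticizes, plugging a pilot-study effect size straight into a power calculation, looks like the sketch below. The BUCSS adjustments for publication bias and uncertainty are not reproduced here; the pilot and "true" effect sizes are invented for illustration.

    ```python
    from statsmodels.stats.power import TTestIndPower

    pilot_d = 0.45          # sample effect size from a prior study (illustrative)
    analysis = TTestIndPower()
    n_per_group = analysis.solve_power(effect_size=pilot_d, alpha=0.05, power=0.80)
    print(f"naive plan: {n_per_group:.0f} subjects per group")

    # If publication bias and sampling error inflated the pilot estimate, the true
    # effect may be smaller and the planned study ends up underpowered:
    true_d = 0.30
    achieved = analysis.power(effect_size=true_d, nobs1=round(n_per_group),
                              ratio=1.0, alpha=0.05)
    print(f"power actually achieved if d = {true_d}: {achieved:.2f}")
    ```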

  15. Effects of habitat fragmentation, population size and demographic history on genetic diversity: the Cross River gorilla in a comparative context.

    Science.gov (United States)

    Bergl, Richard A; Bradley, Brenda J; Nsubuga, Anthony; Vigilant, Linda

    2008-09-01

    In small and fragmented populations, genetic diversity may be reduced owing to increased levels of drift and inbreeding. This reduced diversity is often associated with decreased fitness and a higher threat of extinction. However, it is difficult to determine when a population has low diversity except in a comparative context. We assessed genetic variability in the critically endangered Cross River gorilla (Gorilla gorilla diehli), a small and fragmented population, using 11 autosomal microsatellite loci. We show that levels of diversity in the Cross River population are not evenly distributed across the three genetically identified subpopulations, and that one centrally located subpopulation has higher levels of variability than the others. All measures of genetic variability in the Cross River population were comparable to those of the similarly small mountain gorilla (G. beringei beringei) populations (Bwindi and Virunga). However, for some measures both the Cross River and mountain gorilla populations show lower levels of diversity than a sample from a large, continuous western gorilla population (Mondika, G. gorilla gorilla). Finally, we tested for the genetic signature of a bottleneck in each of the four populations. Only Cross River showed strong evidence of a reduction in population size, suggesting that the reduction in size of this population was more recent or abrupt than in the two mountain gorilla populations. These results emphasize the need for maintaining connectivity in fragmented populations and highlight the importance of allowing small populations to expand.

  16. Secondary Craters and the Size-Velocity Distribution of Ejected Fragments around Lunar Craters Measured Using LROC Images

    Science.gov (United States)

    Singer, K. N.; Jolliff, B. L.; McKinnon, W. B.

    2013-12-01

    We report results from analyzing the size-velocity distribution (SVD) of secondary-crater-forming fragments from the 93 km diameter Copernicus impact. We measured the diameters of secondary craters and their distances from Copernicus using LROC Wide Angle Camera (WAC) and Narrow Angle Camera (NAC) image data. We then estimated the velocity and size of the ejecta fragment that formed each secondary crater from the range equation for a ballistic trajectory on a sphere and Schmidt-Holsapple scaling relations. Size scaling was carried out in the gravity regime for both non-porous and porous target material properties. We focus on the largest ejecta fragments (dfmax) at a given ejection velocity (υej) and fit the upper envelope of the SVD using quantile regression to an equation of the form dfmax = A*υej^-β. The velocity exponent, β, describes how quickly fragment sizes fall off with increasing ejection velocity during crater excavation. For Copernicus, we measured 5800 secondary craters, at distances of up to 700 km (15 crater radii), corresponding to an ejecta fragment velocity of approximately 950 m/s. This mapping only includes secondary craters that are part of a radial chain or cluster. The two largest craters in chains near Copernicus that are likely to be secondaries are 6.4 and 5.2 km in diameter. We obtained a velocity exponent, β, of 2.2 ± 0.1 for a non-porous surface. This result is similar to Vickery's [1987, GRL 14] determination of β = 1.9 ± 0.2 for Copernicus using Lunar Orbiter IV data. The availability of WAC 100 m/pix global mosaics with illumination geometry optimized for morphology allows us to update and extend the work of Vickery
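    The upper-envelope fit described above can be sketched with an off-the-shelf quantile regression in log-log space, where dfmax = A*υej^-β becomes a straight line with slope -β. The snippet below uses synthetic data with an assumed β of 2.2 standing in for the measured secondaries; it illustrates the fitting strategy only, not the authors' dataset or exact procedure.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)

    # Synthetic stand-in for measured fragments: ejection velocity (m/s) and size (m),
    # with an upper envelope d_max ~ A * v^-beta (beta = 2.2 assumed here)
    v = rng.uniform(200, 950, 2000)
    d_envelope = 5e8 * v ** -2.2
    d = d_envelope * rng.uniform(0.05, 1.0, v.size)

    # Fit the upper envelope with quantile regression in log-log space
    X = sm.add_constant(np.log(v))
    fit = sm.QuantReg(np.log(d), X).fit(q=0.98)
    log_A, neg_beta = fit.params
    print(f"estimated beta = {-neg_beta:.2f} (input value 2.2)")
    ```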

  17. A polymer, random walk model for the size-distribution of large DNA fragments after high linear energy transfer radiation

    Science.gov (United States)

    Ponomarev, A. L.; Brenner, D.; Hlatky, L. R.; Sachs, R. K.

    2000-01-01

    DNA double-strand breaks (DSBs) produced by densely ionizing radiation are not located randomly in the genome: recent data indicate DSB clustering along chromosomes. Stochastic DSB clustering at large scales, from >100 Mbp downward, is modelled, and fragment-size distributions after high linear energy transfer (LET) radiation are obtained. The resulting dose responses are found to be non-linear when the dose becomes so large that there is a significant probability of overlapping, or close juxtaposition along one chromosome, of different DSB clusters from different tracks. The non-linearity is more evident for large fragments than for small ones. The DNA-break results furnish an example of the RLC (randomly located clusters) analytic formalism, which generalizes the broken-stick fragment-size distribution of the random-breakage model that is often applied to low-LET data.

  18. Computing Effect Size Measures with ViSta-The Visual Statistics System

    Directory of Open Access Journals (Sweden)

    Nuria Cortada de Kohan

    2009-03-01

    Full Text Available Effect size measures are recognized as a necessary complement to statistical hypothesis testing because they provide important information that such tests alone cannot offer. In this paper we: (a) briefly review the importance of effect size measures, (b) describe some calculation algorithms for the case of the difference between two means, and (c) provide a new and easy-to-use computer program to perform these calculations within ViSta, "The Visual Statistics System". A worked example is also provided to illustrate some practical issues concerning the interpretation and limits of effect size computation. The audience for this paper includes novice researchers as well as ViSta users interested in applying effect size measures.

  19. Phase analysis in single-chain variable fragment production by recombinant Pichia pastoris based on proteomics combined with multivariate statistics.

    Science.gov (United States)

    Fujiki, Yuya; Kumada, Yoichi; Kishimoto, Michimasa

    2015-08-01

    The proteomics technique, which consists of two-dimensional gel electrophoresis (2-DE), peptide mass fingerprinting (PMF), gel image analysis, and multivariate statistics, was applied to the phase analysis of a fed-batch culture for the production of a single-chain variable fragment (scFv) of an anti-C-reactive protein (CRP) antibody by Pichia pastoris. The time courses of the fed-batch culture were separated into three distinct phases: the growth phase of the batch process, the growth phase of the fed-batch process, and the production phase of the fed-batch process. Multivariate statistical analysis using 2-DE gel image analysis data clearly showed the change in the culture phase and provided information concerning the protein expression, which suggested a metabolic change related to cell growth and production during the fed-batch culture. Furthermore, specific proteins, such as alcohol oxidase, which is strongly related to scFv expression, and proteinase A, which could biodegrade scFv in the latter phases of production, were identified via the PMF method. The proteomics technique provided valuable information about the effect of the methanol concentration on scFv production.

  20. The effect of cluster size variability on statistical power in cluster-randomized trials.

    Directory of Open Access Journals (Sweden)

    Stephen A Lauer

    Full Text Available The frequency of cluster-randomized trials (CRTs) in the peer-reviewed literature has increased exponentially over the past two decades. CRTs are a valuable tool for studying interventions that cannot be effectively implemented or randomized at the individual level. However, some aspects of the design and analysis of data from CRTs are more complex than those for individually randomized controlled trials. One of the key components of designing a successful CRT is calculating the proper sample size (i.e. the number of clusters) needed to attain an acceptable level of statistical power. In order to do this, a researcher must make assumptions about the value of several variables, including a fixed mean cluster size. In practice, cluster size can often vary dramatically. Few studies account for the effect of cluster size variation when assessing the statistical power for a given trial. We conducted a simulation study to investigate how the statistical power of CRTs changes with variable cluster sizes. In general, we observed that increases in cluster size variability lead to a decrease in power.
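    A simulation of this kind can be sketched in a few lines: draw cluster sizes from a distribution with a chosen coefficient of variation, generate cluster-correlated outcomes, analyse with a cluster-level t-test, and count rejections. The sketch below is a simplified illustration, not the authors' simulation; the gamma model for cluster sizes, the ICC, the effect size and the cluster-level analysis are all assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)

    def simulate_power(n_clusters, mean_size, size_cv, icc=0.05, effect=0.3,
                       n_sims=1000, alpha=0.05):
        """Empirical power of a cluster-level t-test on cluster means.

        Cluster sizes are drawn from a gamma distribution with the requested
        coefficient of variation (an assumed model of size variability).
        """
        hits = 0
        sigma_b, sigma_w = np.sqrt(icc), np.sqrt(1 - icc)
        for _ in range(n_sims):
            means = {0: [], 1: []}
            for arm in (0, 1):
                for _ in range(n_clusters // 2):
                    if size_cv == 0:
                        m = mean_size
                    else:
                        shape = 1 / size_cv ** 2
                        m = max(2, int(rng.gamma(shape, mean_size / shape)))
                    y = arm * effect + rng.normal(0, sigma_b) + rng.normal(0, sigma_w, m)
                    means[arm].append(y.mean())
            hits += stats.ttest_ind(means[1], means[0]).pvalue < alpha
        return hits / n_sims

    for cv in (0.0, 0.4, 0.8):
        print(f"cluster-size CV = {cv:.1f}: power ~ {simulate_power(20, 30, cv):.2f}")
    ```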

  1. Submicron-sized boron carbide particles encapsulated in turbostratic graphite prepared by laser fragmentation in liquid medium.

    Science.gov (United States)

    Ishikawa, Yoshie; Sasaki, Takeshi; Koshizaki, Naoto

    2010-08-01

    Submicron-sized B4C spherical particles were obtained by laser fragmentation of large B4C particles dispersed in ethyl acetate. The irradiated surface of the large B4C raw particles was heated and melted by laser energy absorption; the resulting B4C droplets then cooled down into spherical particles. Moreover, each B4C particle obtained was encapsulated in a graphitic layer, which is useful for medical functionalization of the particles. The B4C particles encapsulated in a graphitic layer obtained in this way may have potential uses in boron neutron capture therapy.

  2. Cell-free reconstitution of vacuole membrane fragmentation reveals regulation of vacuole size and number by TORC1

    Science.gov (United States)

    Michaillat, Lydie; Baars, Tonie Luise; Mayer, Andreas

    2012-01-01

    Size and copy number of organelles are influenced by an equilibrium of membrane fusion and fission. We studied this equilibrium on vacuoles—the lysosomes of yeast. Vacuole fusion can readily be reconstituted and quantified in vitro, but it had not been possible to study fission of the organelle in a similar way. Here we present a cell-free system that reconstitutes fragmentation of purified yeast vacuoles (lysosomes) into smaller vesicles. Fragmentation in vitro reproduces physiological aspects. It requires the dynamin-like GTPase Vps1p, V-ATPase pump activity, cytosolic proteins, and ATP and GTP hydrolysis. We used the in vitro system to show that the vacuole-associated TOR complex 1 (TORC1) stimulates vacuole fragmentation but not the opposing reaction of vacuole fusion. Under nutrient restriction, TORC1 is inactivated, and the continuing fusion activity then dominates the fusion/fission equilibrium, decreasing the copy number and increasing the volume of the vacuolar compartment. This result can explain why nutrient restriction not only induces autophagy and a massive buildup of vacuolar/lysosomal hydrolases, but also leads to a concomitant increase in volume of the vacuolar compartment by coalescence of the organelles into a single large compartment. PMID:22238359

  3. An Introduction to Confidence Intervals for Both Statistical Estimates and Effect Sizes.

    Science.gov (United States)

    Capraro, Mary Margaret

    This paper summarizes methods of estimating confidence intervals, including classical intervals and intervals for effect sizes. The recent American Psychological Association (APA) Task Force on Statistical Inference report suggested that confidence intervals should always be reported, and the fifth edition of the APA "Publication Manual"…

  4. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.

  5. Diversity of medium and large sized mammals in a Cerrado fragment of central Brazil

    Directory of Open Access Journals (Sweden)

    F.S. Campos

    2013-11-01

    Full Text Available Studies related to community ecology of medium and large mammals represent a priority in developing strategies for conservation of their habitats. Due to the significant ecological importance of these species, a concern in relation to anthropogenic pressures arises, since their populations are vulnerable to hunting and fragmentation. In this study, we aimed to analyze the diversity of medium and large mammals in a representative area of the Cerrado biome, located in the National Forest of Silvânia, central Brazil, providing insights for future studies on the biodiversity and conservation of Cerrado mammals. Sampling was carried out by linear transects, search for traces, footprint traps and camera traps. We recorded 23 species, among which three are listed in threat categories (e.g., Myrmecophaga tridactyla, Chrysocyon brachyurus and Leopardus tigrinus). We registered 160 records in the study area, where the most frequently recorded species were Didelphis albiventris (30 records) and Cerdocyon thous (28 records). Our results indicated that a small protected area of Cerrado can include a large and important percentage of the diversity of mammals in this biome, providing information about richness, abundance, spatial distribution and insights for future studies on the biodiversity and conservation of these biological communities.

  6. Statistical model of rough surface contact accounting for size-dependent plasticity and asperity interaction

    Science.gov (United States)

    Song, H.; Vakis, A. I.; Liu, X.; Van der Giessen, E.

    2017-09-01

    The work by Greenwood and Williamson (GW) has initiated a simple but effective method of contact mechanics: statistical modeling based on the mechanical response of a single asperity. Two main assumptions of the original GW model are that the asperity response is purely elastic and that there is no interaction between asperities. However, as asperities lie on a continuous substrate, the deformation of one asperity will change the height of all other asperities through deformation of the substrate and will thus influence subsequent contact evolution. Moreover, a high asperity contact pressure will result in plasticity, which below tens of microns is size dependent, with smaller being harder. In this paper, the asperity interaction effect is taken into account through substrate deformation, while a size-dependent plasticity model is adopted for individual asperities. The intrinsic length in the strain gradient plasticity (SGP) theory is obtained by fitting to two-dimensional discrete dislocation plasticity simulations of the flattening of a single asperity. By utilizing the single asperity response in three dimensions and taking asperity interaction into account, a statistical calculation of rough surface contact is performed. The effectiveness of the statistical model is addressed by comparison with full-detail finite element simulations of rough surface contact using SGP. Throughout the paper, our focus is on the difference of contact predictions based on size-dependent plasticity as compared to conventional size-independent plasticity.
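    The classical, purely elastic and non-interacting Greenwood-Williamson baseline that this paper extends can be written as a single integral of the Hertzian single-asperity response over a Gaussian height distribution. The sketch below implements only that baseline (the size-dependent plasticity and asperity-interaction effects of the paper are deliberately omitted), and the material and roughness parameters are arbitrary illustrative values.

    ```python
    import numpy as np

    def gw_elastic(separation, eta=1e10, R=1e-6, sigma=1e-7, E_star=1e9, n=4000):
        """Greenwood-Williamson load and real contact area per unit nominal area
        for elastic, non-interacting asperities with Gaussian summit heights."""
        z = np.linspace(separation, separation + 6 * sigma, n)
        phi = np.exp(-z**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
        w = z - separation                               # asperity interference
        p = (4.0 / 3.0) * E_star * np.sqrt(R) * w**1.5   # Hertzian load per asperity
        a = np.pi * R * w                                # contact area per asperity
        dz = z[1] - z[0]
        load = eta * np.sum(p * phi) * dz
        area = eta * np.sum(a * phi) * dz
        return load, area

    for d in (2e-7, 1e-7, 0.0):
        load, area = gw_elastic(d)
        print(f"separation {d:.1e} m: pressure {load:.3e} Pa, area fraction {area:.3e}")
    ```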

  7. Statistical Estimation of Orbital Debris Populations with a Spectrum of Object Size

    Science.gov (United States)

    Xu, Y. -l; Horstman, M.; Krisko, P. H.; Liou, J. -C; Matney, M.; Stansbery, E. G.; Stokely, C. L.; Whitlock, D.

    2008-01-01

    Orbital debris is a real concern for the safe operations of satellites. In general, the hazard of debris impact is a function of the size and spatial distributions of the debris populations. To describe and characterize the debris environment as reliably as possible, the current NASA Orbital Debris Engineering Model (ORDEM2000) is being upgraded to a new version based on new and better quality data. The data-driven ORDEM model covers a wide range of object sizes from 10 microns to greater than 1 meter. This paper reviews the statistical process for the estimation of the debris populations in the new ORDEM upgrade, and discusses the representation of large-size (greater than or equal to 1 m and greater than or equal to 10 cm) populations by SSN catalog objects and the validation of the statistical approach. Also, it presents results for the populations with sizes of greater than or equal to 3.3 cm, greater than or equal to 1 cm, greater than or equal to 100 micrometers, and greater than or equal to 10 micrometers. The orbital debris populations used in the new version of ORDEM are inferred from data based upon appropriate reference (or benchmark) populations instead of the binning of the multi-dimensional orbital-element space. This paper describes all of the major steps used in the population-inference procedure for each size-range. Detailed discussions on data analysis, parameter definition, the correlation between parameters and data, and uncertainty assessment are included.

  8. Analysis of fragment size distributions in collisions of monocharged ions with the C{sub 60} molecule

    Energy Technology Data Exchange (ETDEWEB)

    Rentenier, A; Moretto-Capelle, P; Bordenave-Montesquieu, D; Bordenave-Montesquieu, A [LCAR-IRSAMC, UMR 5589 Universite Paul Sabatier-CNRS, 118 rte de Narbonne, 31062 Toulouse Cedex (France)

    2005-04-14

    Fragmentation of the C{sub 60} molecule is investigated using a multicorrelation technique. We first focus on the transition from asymmetrical dissociation (AD) to multifragmentation (MF). These processes are studied in collisions between H{sup +}{sub x}(x = 1-3) hydrogenic projectiles and C{sub 60} fullerene in the gas phase, in the 2-130 keV collisional energy range. A rather sharp transition from pure AD to predominant MF is observed when plotting the AD/(AD + MF) ratio against the average deposited energy E{sub dep}; it occurs in the 80-240 eV E{sub dep} range; this ratio is also found to be independent of the projectile species (scaling law). The evolution of the size distribution shape is also discussed and compared with other data available in the literature. A pure power law is never reached in the present experimental conditions. Finally, an event-by-event analysis of the fragmentation data is developed for the first time in the study of the C{sub 60} molecule fragmentation and discussed in terms of the predictions of the percolation model near a critical behaviour. Moments of order 2, 3 and 5 are determined for each correlation event. Moments of order 3 and 5 follow a linear behaviour when plotted against the moment of order 2, as predicted, and the exponent {tau} that is extracted takes a value near 2. The Campi scatter plot is also determined and discussed for total and multiplicity-selected events. Both slopes of the two branches in the Campi plots and {tau} value are near those that are expected in the percolation of a 2D lattice.

  9. Effect of woodland patch size on rodent seed predation in a fragmented landscape

    Directory of Open Access Journals (Sweden)

    J. Loman

    2007-05-01

    Full Text Available Predation on large woody plant seeds (chestnuts, acorns and sloe kernels) was studied in deciduous forests of two size classes, small woodlots (<1 ha) and large woods (at least 25 ha), in southern Sweden. Seeds used for the study were artificially distributed on the forest ground and seed predation measured as seed removal. Predation rate was similar in both types of woods. However, rodent density was higher in small woodlots, and a correction for differences in rodent density showed that predation rate per individual rodent was higher in the large woods. This suggests that the small woodlots (including the border zone) and their adjacent fields have more rodent food per area unit. A small woodlot cannot be considered a representative sample of a large continuous forest, even if the habitats appear similar. There was a strong effect of rodent density on seed predation rate. This suggests that rodents are major seed predators in this habitat.

  10. SOME IMPORTANT STATISTICAL PROPERTIES, INFORMATION MEASURES AND ESTIMATIONS OF SIZE BIASED GENERALIZED GAMMA DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    J. A. Reshi

    2014-12-01

    Full Text Available In this paper, a new class of size-biased generalized gamma (SBGG) distributions is defined: a particular case of the weighted generalized gamma distribution in which the weights are taken to be the variate values. Important statistical properties of the new model, including the hazard function, reverse hazard function, mode, moment generating function, characteristic function, Shannon entropy, generalized entropy and Fisher information matrix, are derived and studied. We also study entropy estimation for the SBGG model and the Akaike and Bayesian information criteria. A likelihood ratio test for size-biasedness is conducted. Parameters are estimated by classical methods, in particular the method of moments and maximum likelihood.
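    The size-biasing operation itself is easy to verify numerically: weighting a density f(x) by x and renormalising by the mean gives f*(x) = x f(x)/E[X], and for the generalized gamma (with shape parameters a and c, in SciPy's parameterisation) this is again a generalized gamma with shape a + 1/c. The snippet below checks that identity; the parameter values are arbitrary illustrative choices, and the paper's own parameterisation may differ.

    ```python
    import numpy as np
    from scipy import stats
    from scipy.special import gamma as gamma_fn

    a, c = 2.0, 1.5                      # generalized gamma shape parameters (illustrative)
    x = np.linspace(0.01, 6, 500)

    base = stats.gengamma(a, c)
    mean = gamma_fn(a + 1 / c) / gamma_fn(a)     # E[X] for this parameterisation
    size_biased = x * base.pdf(x) / mean         # weighted density with weight w(x) = x

    # The size-biased version is itself a generalized gamma with shape a + 1/c
    check = stats.gengamma(a + 1 / c, c).pdf(x)
    print("max |difference| =", np.abs(size_biased - check).max())
    ```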

  11. Effects of fuel particle size and fission-fragment-enhanced irradiation creep on the in-pile behavior in CERCER composite pellets

    Science.gov (United States)

    Zhao, Yunmei; Ding, Shurong; Zhang, Xunchao; Wang, Canglong; Yang, Lei

    2016-12-01

    The micro-scale finite element models for CERCER pellets with different-sized fuel particles are developed. With consideration of a grain-scale mechanistic irradiation swelling model in the fuel particles and the irradiation creep in the matrix, numerical simulations are performed to explore the effects of the particle size and the fission-fragment-enhanced irradiation creep on the thermo-mechanical behavior of CERCER pellets. The enhanced irradiation creep effect is applied in the 10 μm-thick fission fragment damage matrix layer surrounding the fuel particles. The obtained results indicate that (1) lower maximum temperature occurs in the cases with smaller-sized particles, and the effects of particle size on the mechanical behavior in pellets are intricate; (2) the first principal stress and radial axial stress remain compressive in the fission fragment damage layer at higher burnup, thus the mechanism of radial cracking found in the experiment can be better explained.

  12. Constrained statistical inference: sample-size tables for ANOVA and regression.

    Science.gov (United States)

    Vanbrabant, Leonard; Van De Schoot, Rens; Rosseel, Yves

    2014-01-01

    Researchers in the social and behavioral sciences often have clear expectations about the order/direction of the parameters in their statistical model. For example, a researcher might expect that regression coefficient β1 is larger than β2 and β3. The corresponding hypothesis is H: β1 > {β2, β3} and this is known as an (order) constrained hypothesis. A major advantage of testing such a hypothesis is that power can be gained and, inherently, a smaller sample size is needed. This article discusses this gain in sample-size reduction when an increasing number of constraints is included in the hypothesis. The main goal is to present sample-size tables for constrained hypotheses. A sample-size table contains the necessary sample size at a pre-specified power (say, 0.80) for an increasing number of constraints. To obtain the sample-size tables, two Monte Carlo simulations were performed, one for ANOVA and one for multiple regression. Three results are salient. First, in an ANOVA the needed sample size decreases by 30-50% when complete ordering of the parameters is taken into account. Second, small deviations from the imposed order have only a minor impact on the power. Third, at the maximum number of constraints, the linear regression results are comparable with the ANOVA results. However, in the case of fewer constraints, ordering the parameters (e.g., β1 > β2) results in a higher power than assigning a positive or a negative sign to the parameters (e.g., β1 > 0).

  13. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    Science.gov (United States)

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.

  14. Determination of reference limits: statistical concepts and tools for sample size calculation.

    Science.gov (United States)

    Wellek, Stefan; Lackner, Karl J; Jennen-Steinmetz, Christine; Reinhard, Iris; Hoffmann, Isabell; Blettner, Maria

    2014-12-01

    Reference limits are estimators for 'extreme' percentiles of the distribution of a quantitative diagnostic marker in the healthy population. In most cases, interest will be in the 90% or 95% reference intervals. The standard parametric method of determining reference limits consists of computing quantities of the form X̅±c·S. The proportion of covered values in the underlying population coincides with the specificity obtained when a measurement value falling outside the corresponding reference region is classified as diagnostically suspect. Nonparametrically, reference limits are estimated by means of so-called order statistics. In both approaches, the precision of the estimate depends on the sample size. We present computational procedures for calculating minimally required numbers of subjects to be enrolled in a reference study. The much more sophisticated concept of reference bands replacing statistical reference intervals in case of age-dependent diagnostic markers is also discussed.
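    The two estimation routes mentioned in the abstract are easy to illustrate side by side: the parametric limits of the form mean ± c·SD (here with c = 1.96 for a 95% interval under normality) and the nonparametric limits from order statistics (the 2.5th and 97.5th sample percentiles). The data below are simulated stand-ins for healthy-reference measurements; the paper's actual contribution, the sample-size machinery controlling the precision of these limits, is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    marker = rng.normal(5.0, 1.2, 240)   # hypothetical healthy-reference measurements

    # Parametric 95% reference limits of the form mean +/- c * SD (c = 1.96 here)
    m, s = marker.mean(), marker.std(ddof=1)
    lower_p, upper_p = m - 1.96 * s, m + 1.96 * s

    # Nonparametric limits from order statistics (2.5th and 97.5th percentiles)
    lower_np, upper_np = np.quantile(marker, [0.025, 0.975])

    print(f"parametric    : [{lower_p:.2f}, {upper_p:.2f}]")
    print(f"nonparametric : [{lower_np:.2f}, {upper_np:.2f}]")
    ```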

  15. Renormalization-group theory for finite-size scaling in extreme statistics.

    Science.gov (United States)

    Györgyi, G; Moloney, N R; Ozogány, K; Rácz, Z; Droz, M

    2010-04-01

    We present a renormalization-group (RG) approach to explain universal features of extreme statistics applied here to independent identically distributed variables. The outlines of the theory have been described in a previous paper, the main result being that finite-size shape corrections to the limit distribution can be obtained from a linearization of the RG transformation near a fixed point, leading to the computation of stable perturbations as eigenfunctions. Here we show details of the RG theory which exhibit remarkable similarities to the RG known in statistical physics. Besides the fixed points explaining universality, and the least stable eigendirections accounting for convergence rates and shape corrections, the similarities include marginally stable perturbations which turn out to be generic for the Fisher-Tippett-Gumbel class. Distribution functions containing unstable perturbations are also considered. We find that, after a transitory divergence, they return to the universal fixed line at the same or at a different point depending on the type of perturbation.

  16. Towards standardisation of cell-free DNA measurement in plasma: controls for extraction efficiency, fragment size bias and quantification.

    Science.gov (United States)

    Devonshire, Alison S; Whale, Alexandra S; Gutteridge, Alice; Jones, Gerwyn; Cowen, Simon; Foy, Carole A; Huggett, Jim F

    2014-10-01

    Circulating cell-free DNA (cfDNA) is becoming an important clinical analyte for prenatal testing, cancer diagnosis and cancer monitoring. The extraction stage is critical in ensuring clinical sensitivity of analytical methods measuring minority nucleic acid fractions, such as foetal-derived sequences in predominantly maternal cfDNA. Consequently, quality controls are required for measurement of extraction efficiency, fragment size bias and yield for validation of cfDNA methods. We evaluated the utility of an external DNA spike for monitoring these parameters in a study comparing three specific cfDNA extraction methods [QIAamp circulating nucleic acid (CNA) kit, NucleoSpin Plasma XS (NS) kit and FitAmp plasma/serum DNA isolation (FA) kit] with the commonly used QIAamp DNA blood mini (DBM) kit. We found that the extraction efficiencies of the kits ranked in the order CNA kit > DBM kit > NS kit > FA kit, and the CNA and NS kits gave a better representation of smaller DNA fragments in the extract than the DBM kit. We investigated means of improved reporting of cfDNA yield by comparing quantitative PCR measurements of seven different reference gene assays in plasma samples and validating these with digital PCR. We noted that the cfDNA quantities based on measurement of some target genes (e.g. TERT) were, on average, more than twofold higher than those of other assays (e.g. ERV3). We conclude that analysis and averaging of multiple reference genes using a GeNorm approach gives a more reliable estimate of total cfDNA quantity.

  17. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    Science.gov (United States)

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January, 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements.
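    The "right-sizing" idea rests on the sigma-metric of a test, commonly computed as (allowable total error - |bias|) / CV with all terms in percent; higher sigma allows simpler QC with fewer control measurements. The sketch below shows that calculation with invented numbers; the rule-selection function is only a rough paraphrase and does not reproduce the actual Westgard Sigma Rules logic.

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma-metric of a test: (allowable total error - |bias|) / CV, all in %."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    def suggest_qc(sigma):
        """Very rough paraphrase of sigma-based rule selection (illustrative only;
        consult the Westgard Sigma Rules for the real scheme)."""
        if sigma >= 6:
            return "single rule, few control measurements per run"
        if sigma >= 4:
            return "add rules / more control measurements"
        return "multirule QC, maximum controls, consider method improvement"

    for name, tea, bias, cv in [("analyte A", 10.0, 1.5, 2.0), ("analyte B", 8.0, 2.0, 3.0)]:
        s = sigma_metric(tea, bias, cv)
        print(f"{name}: sigma = {s:.1f} -> {suggest_qc(s)}")
    ```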

  18. Finite-size scaling of two-point statistics and the turbulent energy cascade generators.

    Science.gov (United States)

    Cleve, Jochen; Dziekan, Thomas; Schmiegel, Jürgen; Barndorff-Nielsen, Ole E; Pearson, Bruce R; Sreenivasan, Katepalli R; Greiner, Martin

    2005-02-01

    Within the framework of random multiplicative energy cascade models of fully developed turbulence, finite-size-scaling expressions for two-point correlators and cumulants are derived, taking into account the observationally unavoidable conversion from an ultrametric to an Euclidean two-point distance. The comparison with two-point statistics of the surrogate energy dissipation, extracted from various wind tunnel and atmospheric boundary layer records, allows an accurate deduction of multiscaling exponents and cumulants, even at moderate Reynolds numbers for which simple power-law fits are not feasible. The extracted exponents serve as input for parametric estimates of the probabilistic cascade generator. Various cascade generators are evaluated.

  19. Statistical power calculation and sample size determination for environmental studies with data below detection limits

    Science.gov (United States)

    Shao, Quanxi; Wang, You-Gan

    2009-09-01

    Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing the mean values may become statistically inappropriate and even invalid when substantial proportions of the response values are below the detection limits or censored because strong distributional assumptions have to be made on the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need of imputing the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is a part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that by the traditional t-test, illustrating the merit of our method.

  20. A statistical analysis of North East Atlantic (submicron) aerosol size distributions

    Science.gov (United States)

    Dall'Osto, M.; Monahan, C.; Greaney, R.; Beddows, D. C. S.; Harrison, R. M.; Ceburnis, D.; O'Dowd, C. D.

    2011-12-01

    The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-Means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories with similar characteristics, namely: coastal nucleation category (occurring 21.3 % of the time), open ocean nucleation category (occurring 32.6% of the time), background clean marine category (occurring 26.1% of the time) and anthropogenic category (occurring 20% of the time) aerosol size distributions. The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. However, peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally-influenced size distributions are generally more monomodal (accumulation), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with the relatively high percentage frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano aerosol particles in NE Atlantic Air.
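
    A minimal sketch of the clustering step described above, assuming hourly size-distribution spectra are stored row-wise in an array. Note that scikit-learn's KMeans uses Lloyd's algorithm rather than the Hartigan-Wong variant named in the abstract, and all data below are synthetic:

```python
# Sketch of k-Means clustering of aerosol number-size-distribution spectra.
# The array `spectra` stands in for hourly dN/dlogDp measurements; k = 12 follows
# the abstract, everything else (bin count, normalisation choice) is illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
spectra = rng.lognormal(mean=1.0, sigma=0.5, size=(1000, 50))  # stand-in data

# Normalising each spectrum so clusters reflect shape rather than absolute
# concentration is a common choice; the paper may have proceeded differently.
shapes = spectra / spectra.sum(axis=1, keepdims=True)

km = KMeans(n_clusters=12, n_init=10, random_state=0)   # Hartigan-Wong is R's default;
labels = km.fit_predict(shapes)                          # scikit-learn uses Lloyd's algorithm

occurrence = np.bincount(labels, minlength=12) / len(labels)
for k, f in enumerate(occurrence):
    print(f"cluster {k:2d}: {100 * f:.1f}% of hours")
```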

  1. A statistical analysis of North East Atlantic (submicron) aerosol size distributions

    Directory of Open Access Journals (Sweden)

    M. Dall'Osto

    2011-12-01

    Full Text Available The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75% throughout the year. By applying the Hartigan-Wong k-Means method, 12 clusters were identified as systematically occurring. These 12 clusters could be further combined into 4 categories with similar characteristics, namely: coastal nucleation category (occurring 21.3 % of the time), open ocean nucleation category (occurring 32.6% of the time), background clean marine category (occurring 26.1% of the time) and anthropogenic category (occurring 20% of the time) aerosol size distributions. The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine aerosol exhibited a clear bimodality in the sub-micron size distribution, although it should be noted that either the Aitken mode or the accumulation mode may dominate the number concentration. However, peculiar background clean marine size distributions with coarser accumulation modes are also observed during winter months. By contrast, the continentally-influenced size distributions are generally more monomodal (accumulation), albeit with traces of bimodality. The open ocean category occurs more often during May, June and July, corresponding with the North East (NE) Atlantic high biological period. Combined with the relatively high percentage frequency of occurrence (32.6%), this suggests that the marine biota is an important source of new nano aerosol particles in NE Atlantic Air.

  2. A visitor's guide to effect sizes: statistical significance versus practical (clinical) importance of research findings.

    Science.gov (United States)

    Hojat, Mohammadreza; Xu, Gang

    2004-01-01

    Effect Sizes (ES) are an increasingly important index used to quantify the degree of practical significance of study results. This paper gives an introduction to the computation and interpretation of effect sizes from the perspective of the consumer of the research literature. The key points made are: 1. ES is a useful indicator of the practical (clinical) importance of research results that can be operationally defined from being "negligible" to "moderate", to "important". 2. The ES has two advantages over statistical significance testing: (a) it is independent of the size of the sample; (b) it is a scale-free index. Therefore, ES can be uniformly interpreted in different studies regardless of the sample size and the original scales of the variables. 3. Calculations of the ES are illustrated by using examples of comparisons between two means, correlation coefficients, chi-square tests and two proportions, along with appropriate formulas. 4. Operational definitions for the ESs are given, along with numerical examples for the purpose of illustration.
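
    As an illustration of the first of those comparisons (two means), a standardized mean difference (Cohen's d) can be computed as follows; the data are invented:

```python
# Illustrative computation of a standardized mean-difference effect size (Cohen's d),
# one of the two-mean comparisons mentioned above. The group data are invented.
import numpy as np

def cohens_d(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

group_a = [78, 82, 85, 90, 88, 76, 84]
group_b = [71, 75, 80, 78, 74, 77, 72]
d = cohens_d(group_a, group_b)
print(f"Cohen's d = {d:.2f}")   # unlike a p-value, d does not grow with sample size
```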

  3. Solar granulation and statistical crystallography: A modeling approach using size-shape relations

    Science.gov (United States)

    Noever, D. A.

    1994-01-01

    The irregular polygonal pattern of solar granulation is analyzed for size-shape relations using statistical crystallography. In contrast to previous work which has assumed perfectly hexagonal patterns for granulation, more realistic accounting of cell (granule) shapes reveals a broader basis for quantitative analysis. Several features emerge as noteworthy: (1) a linear correlation between number of cell-sides and neighboring shapes (called Aboav-Weaire's law); (2) a linear correlation between both average cell area and perimeter and the number of cell-sides (called Lewis's law and a perimeter law, respectively) and (3) a linear correlation between cell area and squared perimeter (called convolution index). This statistical picture of granulation is consistent with a finding of no correlation in cell shapes beyond nearest neighbors. A comparative calculation between existing model predictions taken from luminosity data and the present analysis shows substantial agreements for cell-size distributions. A model for understanding grain lifetimes is proposed which links convective times to cell shape using crystallographic results.

  4. A statistical analysis of North East Atlantic (submicron) aerosol size distributions

    Directory of Open Access Journals (Sweden)

    M. Dall'Osto

    2011-08-01

    Full Text Available The Global Atmospheric Watch research station at Mace Head (Ireland) offers the possibility to sample some of the cleanest air masses being imported into Europe as well as some of the most polluted being exported out of Europe. We present a statistical Cluster analysis of the physical characteristics of aerosol size distributions in air ranging from the cleanest to the most polluted for the year 2008. Data coverage achieved was 75 % throughout the year. By applying the Hartigan-Wong k-Means method, 12 Clusters were identified as systematically occurring and these 12 Clusters could be further combined into 4 categories with similar characteristics, namely: coastal nucleation category (occurring 21.3 % of the time), open ocean nucleation category (occurring 32.6 % of the time), background clean marine category (occurring 26.1 % of the time) and anthropogenic category (occurring 20 % of the time) aerosol size distributions. The coastal nucleation category is characterised by a clear and dominant nucleation mode at sizes less than 10 nm while the open ocean nucleation category is characterised by a dominant Aitken mode between 15 nm and 50 nm. The background clean marine characteristic is a clear bimodality in the size distribution, although it should be noted that either the Aitken mode or the Accumulation mode may dominate the number concentration. By contrast, the continentally-influenced size distributions are generally more mono-modal, albeit with traces of bi-modality. The open ocean category occurs more often during May, June and July, corresponding with the N. E. Atlantic high biological period. Combined with the relatively high percentage frequency of occurrence (32.6 %), this suggests that the marine biota is an important source of new aerosol particles in N. E. Atlantic Air.

  5. Development of capillary size exclusion chromatography for the analysis of monoclonal antibody fragments extracted from human vitreous humor.

    Science.gov (United States)

    Rea, Jennifer C; Lou, Yun; Cuzzi, Joel; Hu, Yuhua; de Jong, Isabella; Wang, Yajun Jennifer; Farnan, Dell

    2012-12-28

    Recombinant antigen-binding fragments (Fabs) are currently on the market and in development for the treatment of ophthalmologic indications. Recently, Quality by Design (QbD) initiatives have been implemented that emphasize understanding the relationship between quality attributes of the product and their impact on safety and efficacy. In particular, changes in product quality once the protein is administered to the patient are of particular interest. Knowledge of protein aggregation in vivo is of importance due to the possibility of antibody aggregates eliciting an immunogenic response in the patient. Presently, there are few analytical methods with adequate sensitivity to analyze Fab aggregates in human vitreous humor (HVH) because the Fab amount available for analysis is often quite low. Here, we report the development of a highly sensitive capillary size exclusion chromatography (SEC) methodology for Fab aggregate analysis in HVH. We demonstrate a process to perform capillary SEC to analyze Fabs with picogram sensitivity and an RSD of less than 8% for the relative peak area of high molecular weight species (HMWS). In addition, we have developed a Protein G affinity chromatography method to capture Fabs from HVH for capillary SEC analysis. Recovery efficiencies ranging from 86 to 99% were achieved using this recovery method with 300 μL HVH samples containing Fab1. Finally, we demonstrate the applicability of the methodology by quantifying Fab aggregates in HVH, which can potentially be used for aggregate analysis of clinically relevant samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Computer program for the calculation of grain size statistics by the method of moments

    Science.gov (United States)

    Sawyer, Michael B.

    1977-01-01

    A computer program is presented for a Hewlett-Packard Model 9830A desk-top calculator (1) which calculates statistics using weight or point count data from a grain-size analysis. The program uses the method of moments in contrast to the more commonly used but less inclusive graphic method of Folk and Ward (1957). The merits of the program are: (1) it is rapid; (2) it can accept data in either grouped or ungrouped format; (3) it allows direct comparison with grain-size data in the literature that have been calculated by the method of moments; (4) it utilizes all of the original data rather than percentiles from the cumulative curve as in the approximation technique used by the graphic method; (5) it is written in the computer language BASIC, which is easily modified and adapted to a wide variety of computers; and (6) when used in the HP-9830A, it does not require punching of data cards. The method of moments should be used only if the entire sample has been measured and the worker defines the measured grain-size range. (1) Use of brand names in this paper does not imply endorsement of these products by the U.S. Geological Survey.
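
    A minimal Python re-implementation of the moment calculations described above (the original program was written in BASIC for the HP-9830A); the phi-class midpoints and weight percentages below are invented:

```python
# Method-of-moments grain-size statistics from grouped weight-percent data in
# whole-phi classes. Class midpoints and weights are illustrative placeholders;
# the formulas are the standard first four moments of a grouped distribution.
import numpy as np

phi_midpoints = np.array([-1.5, -0.5, 0.5, 1.5, 2.5, 3.5])     # class midpoints (phi)
weight_pct    = np.array([ 2.0, 10.0, 28.0, 35.0, 20.0, 5.0])  # should total ~100

w = weight_pct / weight_pct.sum()
mean = np.sum(w * phi_midpoints)                              # 1st moment (mean)
sd   = np.sqrt(np.sum(w * (phi_midpoints - mean) ** 2))       # 2nd moment (sorting)
skew = np.sum(w * (phi_midpoints - mean) ** 3) / sd ** 3      # 3rd moment (skewness)
kurt = np.sum(w * (phi_midpoints - mean) ** 4) / sd ** 4      # 4th moment (kurtosis)

print(f"mean={mean:.2f} phi, sorting={sd:.2f}, skewness={skew:.2f}, kurtosis={kurt:.2f}")
```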

  7. Genotyping of Madurella mycetomatis by selective amplification of restriction fragments (amplified fragment length polymorphism) and subtype correlation with geographical origin and lesion size.

    Science.gov (United States)

    van de Sande, Wendy W J; Gorkink, Roy; Simons, Guus; Ott, Alewijn; Ahmed, Abdalla O A; Verbrugh, Henri; van Belkum, Alex

    2005-09-01

    One of the causative organisms of mycetoma is the fungus Madurella mycetomatis. Previously, extensive molecular typing studies identified Sudanese isolates of this fungus as clonal, but polymorphic genetic markers have not yet been identified. Here, we report on the selective amplification of restriction fragment (AFLP) analysis of 37 Sudanese clinical isolates of M. mycetomatis. Of 93 AFLP fragments generated, 25 were polymorphic, and 12 of these 25 polymorphic fragments were found in a large fraction of the strains. Comparative analysis resulted in a tree composed of two main clusters (clusters I and II) and one minor cluster (cluster III). Seventy-five percent of the strains found in cluster I originated from central Sudan, while the origin of the strains in cluster II was more heterogeneous. Furthermore, the strains found in cluster I were generally obtained from lesions larger than those from which the strains found in cluster II were obtained (chi-square test for trend, P = 0.03). Among the 12 more commonly found polymorphisms, 4 showed sequence homology with known genes. Marker A7 was homologous to an endo-1,4-beta-glucanase from Aspergillus oryzae, 97% identical markers A12 and B3 matched a hypothetical protein from Gibberella zeae, and marker B4 was homologous to casein kinase I from Danio rerio. The last marker seemed to be associated with strains originating from central Sudan (P = 0.001). This is the first report on a genotypic study where genetic markers which may be used to study pathogenicity in M. mycetomatis were obtained.

  8. Genotyping of Madurella mycetomatis by Selective Amplification of Restriction Fragments (Amplified Fragment Length Polymorphism) and Subtype Correlation with Geographical Origin and Lesion Size

    Science.gov (United States)

    van de Sande, Wendy W. J.; Gorkink, Roy; Simons, Guus; Ott, Alewijn; Ahmed, Abdalla O. A.; Verbrugh, Henri; van Belkum, Alex

    2005-01-01

    One of the causative organisms of mycetoma is the fungus Madurella mycetomatis. Previously, extensive molecular typing studies identified Sudanese isolates of this fungus as clonal, but polymorphic genetic markers have not yet been identified. Here, we report on the selective amplification of restriction fragment (AFLP) analysis of 37 Sudanese clinical isolates of M. mycetomatis. Of 93 AFLP fragments generated, 25 were polymorphic, and 12 of these 25 polymorphic fragments were found in a large fraction of the strains. Comparative analysis resulted in a tree composed of two main clusters (clusters I and II) and one minor cluster (cluster III). Seventy-five percent of the strains found in cluster I originated from central Sudan, while the origin of the strains in cluster II was more heterogeneous. Furthermore, the strains found in cluster I were generally obtained from lesions larger than those from which the strains found in cluster II were obtained (chi-square test for trend, P = 0.03). Among the 12 more commonly found polymorphisms, 4 showed sequence homology with known genes. Marker A7 was homologous to an endo-1,4-beta-glucanase from Aspergillus oryzae, 97% identical markers A12 and B3 matched a hypothetical protein from Gibberella zeae, and marker B4 was homologous to casein kinase I from Danio rerio. The last marker seemed to be associated with strains originating from central Sudan (P = 0.001). This is the first report on a genotypic study where genetic markers which may be used to study pathogenicity in M. mycetomatis were obtained. PMID:16145076

  9. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  10. Living in forest fragments reduces group cohesion in diademed sifakas (Propithecus diadema) in eastern Madagascar by reducing food patch size.

    Science.gov (United States)

    Irwin, Mitchell T

    2007-04-01

    Forest fragmentation is thought to threaten primate populations, yet the mechanisms by which this occurs remain largely unknown. However, fragmentation is known to cause dietary shifts in several primate species, and links between food resource distribution and within-group spatial dynamics are well documented. Thus, fragmentation has the potential to indirectly affect spatial dynamics, and these changes may present additional stresses to fragmented populations. I present the results from a 12-month study of Propithecus diadema at Tsinjoarivo, eastern Madagascar, including two groups in fragments and two in continuous forest. Instantaneous data on activity and spatial position were collected during all-day focal animal follows. Fragment groups had much lower cohesion, being more likely to have no neighbor within 5 and 10 m. For continuous forest groups, cohesion was highest in the rainy season (when food patches are large) and lowest in winter (when the animals rely on small-crowned mistletoes), and the chance of having no neighbor within 5 m was positively correlated with mistletoe consumption. Thus their decreased cohesion in fragment groups is inferred to result from their increased reliance on mistletoes and other small resources, which causes them to spread out among multiple patches. This scenario is consistent with the reduced body mass of subordinate individuals (males and immatures) in fragments, and suggests the occurrence of steeper within-group fitness gradients. Further research is necessary to determine whether these patterns apply to other primates; however, since fragmentation tends to cause the loss of the largest trees, many primates in fragments may lose their largest food resources and undergo similar behavioral shifts.

  11. Statistical Correlations Between Near-Infrared Luminosities and Ring Sizes in Field Ringed Galaxies

    Science.gov (United States)

    Wu, Wentao

    2008-01-01

    Statistically complete samples of inner-pseudo-, inner-, and outer-ringed galaxies can be extracted from the Catalog of Southern Ringed Galaxies. Redshifts and near-infrared (NIR) photometric data are available for the samples, allowing the derivation of the statistical correlations between the total NIR luminosities (L_NIR) and the projected ring major axes in the physical scale (D) for these galaxies. For any of the three types of rings, the correlations are approximately L_NIR ∝ D^1.2 among the early-type ringed galaxies (the most commonly observed ringed galaxies). The correlations among late-type ringed galaxies appear significantly different. The results contradict the previous suggestion by Kormendy (1979, ApJ, 227, 714), who gave L_B ∝ D^2 (L_B: B-band galaxy luminosity). The relations can be used in future to test theoretical simulations of dynamical structures of ringed galaxies as well as those of ring formation under the framework of cosmological models. Currently the results indicate at most small differences in the relative contributions of disk components to total galaxy masses and in the initial disk velocity dispersions between commonly observed ringed galaxies of similar type. The correlations also suggest a new approach to effectively use ring sizes as tertiary cosmological distance indicators, to help enhance the reliability of the measurement of the Hubble Constant.

  12. Statistical and Physical Descriptions of Raindrop Size Distributions in Equatorial Malaysia from Disdrometer Observations

    Directory of Open Access Journals (Sweden)

    Hong Yin Lam

    2015-01-01

    Full Text Available This work investigates the physical characteristics of raindrop size distribution (DSD) in an equatorial heavy rain region based on three years of disdrometer observations carried out at Universiti Teknologi Malaysia’s (UTM’s) campus in Kuala Lumpur, Malaysia. The natural characteristics of DSD are deduced, and the statistical results are found to be in accordance with the findings obtained from other disdrometer measurements. Moreover, the parameters of the Gamma distribution and the normalized Gamma model are also derived by means of the method of moments (MoM) and maximum likelihood estimation (MLE). Their performances are subsequently validated using the rain rate estimation accuracy: the normalized Gamma model with the MLE-generated shape parameter µ was found to provide better accuracy in terms of long-term rainfall rate statistics, which reflects the peculiarities of the local climatology in this heavy rain region. These results not only offer a better understanding of the microphysical nature of precipitation in this heavy rain region but also provide essential information that may be useful for the scientific community regarding remote sensing and radio propagation.
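
    A rough sketch contrasting a method-of-moments fit with MLE; note that it fits a plain two-parameter gamma density to synthetic drop diameters rather than the normalized gamma DSD form used in the paper:

```python
# Sketch of a method-of-moments versus maximum-likelihood fit of a gamma model.
# The paper works with the DSD form N(D) = N0 * D**mu * exp(-Lambda*D); here only
# a two-parameter gamma pdf is fitted to synthetic diameters to show the idea.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
diameters = rng.gamma(shape=3.0, scale=0.5, size=5000)    # stand-in disdrometer data (mm)

m, v = diameters.mean(), diameters.var(ddof=1)
shape_mom, scale_mom = m * m / v, v / m                   # method of moments

shape_mle, _, scale_mle = stats.gamma.fit(diameters, floc=0)   # maximum likelihood

print(f"MoM: shape={shape_mom:.2f}, scale={scale_mom:.2f}")
print(f"MLE: shape={shape_mle:.2f}, scale={scale_mle:.2f}")
```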

  13. A statistical approach to estimate the 3D size distribution of spheres from 2D size distributions

    Science.gov (United States)

    Kong, M.; Bhattacharya, R.N.; James, C.; Basu, A.

    2005-01-01

    Size distribution of rigidly embedded spheres in a groundmass is usually determined from measurements of the radii of the two-dimensional (2D) circular cross sections of the spheres in random flat planes of a sample, such as in thin sections or polished slabs. Several methods have been devised to find a simple factor to convert the mean of such 2D size distributions to the actual 3D mean size of the spheres without a consensus. We derive an entirely theoretical solution based on well-established probability laws and not constrained by limitations of absolute size, which indicates that the ratio of the means of measured 2D and estimated 3D grain size distribution should be π/4 (= 0.785). Actual 2D size distribution of the radii of submicron sized, pure Fe⁰ globules in lunar agglutinitic glass, determined from backscattered electron images, is tested to fit the gamma size distribution model better than the log-normal model. Numerical analysis of 2D size distributions of Fe⁰ globules in 9 lunar soils shows that the average mean of 2D/3D ratio is 0.84, which is very close to the theoretical value. These results converge with the ratio 0.8 that Hughes (1978) determined for millimeter-sized chondrules from empirical measurements. We recommend that a factor of 1.273 (reciprocal of 0.785) be used to convert the determined 2D mean size (radius or diameter) of a population of spheres to estimate their actual 3D size. © 2005 Geological Society of America.

  14. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  15. A Visual Basic Program to Generate Sediment Grain-Size Statistics and Extrapolate Particle Distributions

    Science.gov (United States)

    Poppe, L. J.; Eliason, A. E.; Hastings, M. E.

    2004-05-01

    Methods that describe and summarize grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Therefore, to facilitate reduction of sedimentologic data, we have written a computer program (GSSTAT) to generate grain-size statistics and extrapolate particle distributions. Our program is written in Microsoft Visual Basic 6.0, runs on Windows 95/98/ME/NT/2000/XP computers, provides a window to facilitate execution, and allows users to select options with mouse-click events or through interactive dialogue boxes. The program permits users to select output in either inclusive graphics or moment statistics, to extrapolate distributions to the colloidal-clay boundary by three methods, and to convert between frequency and cumulative frequency percentages. Detailed documentation is available within the program. Input files to the program must be comma-delimited ASCII text and have 20 fields that include: sample identifier, latitude, longitude, and the frequency or cumulative frequency percentages of the whole-phi fractions from 11 phi through -5 phi. Individual fields may be left blank, but the sum of the phi fractions must total 100% (+/- 0.2%). The program expects the first line of the input file to be a header showing attribute names; no embedded commas are allowed in any of the fields. Error messages warn the user of potential problems. The program generates an output file in the requested destination directory and allows the user to view results in a display window to determine the occurrence of errors. The output file has a header for its first line, but now has 34 fields; the original descriptor fields plus percentages of gravel, sand, silt and clay, statistics, classification, verbal descriptions, frequency or cumulative frequency percentages of the whole- phi fractions from 13 phi through -5 phi, and a field for error messages. If the user has selected extrapolation, the two additional phi
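
    A hypothetical validator for an input file laid out as described above (comma-delimited, a header line, 20 fields, phi fractions summing to 100 +/- 0.2%); the file name and error messages are assumptions, not part of GSSTAT:

```python
# Hypothetical checker for an input file with the layout the abstract describes:
# header line, then records of 20 comma-separated fields (id, lat, lon, followed
# by the 17 whole-phi fractions from 11 phi through -5 phi), fractions summing
# to 100 +/- 0.2. The file name is an assumption for illustration.
import csv

def check_gsstat_input(path: str) -> None:
    with open(path, newline="") as fh:
        reader = csv.reader(fh)
        next(reader)                                   # skip the header of attribute names
        for lineno, row in enumerate(reader, start=2):
            if len(row) != 20:
                print(f"line {lineno}: expected 20 fields, got {len(row)}")
                continue
            fractions = [float(x) if x.strip() else 0.0 for x in row[3:]]  # blanks allowed
            total = sum(fractions)
            if abs(total - 100.0) > 0.2:
                print(f"line {lineno}: phi fractions sum to {total:.2f}, not 100 +/- 0.2")

if __name__ == "__main__":
    check_gsstat_input("grain_sizes.csv")              # hypothetical file name
```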

  16. First comparison of a global microphysical aerosol model with size-resolved observational aerosol statistics

    Science.gov (United States)

    Spracklen, D. V.; Pringle, K. J.; Carslaw, K. S.; Mann, G. W.; Manktelow, P.; Heintzenberg, J.

    2006-09-01

    A statistical synthesis of marine aerosol measurements from experiments in four different oceans is used to evaluate a global aerosol microphysics model (GLOMAP). We compare the model against observed size resolved particle concentrations, probability distributions, and the temporal persistence of different size particles. We attempt to explain the observed size distributions in terms of sulfate and sea spray and quantify the possible contributions of anthropogenic sulfate and carbonaceous material to the number and mass distribution. The model predicts a bimodal size distribution that agrees well with observations as a grand average over all regions, but there are large regional differences. Notably, observed Aitken mode number concentrations are more than a factor 10 higher than in the model for the N Atlantic but a factor 7 lower than the model in the NW Pacific. We also find that modelled Aitken mode and accumulation mode geometric mean diameters are generally smaller in the model by 10-30%. Comparison with observed free tropospheric Aitken mode distributions suggests that the model underpredicts growth of these particles during descent to the MBL. Recent observations of a substantial organic component of free tropospheric aerosol could explain this discrepancy. We find that anthropogenic continental material makes a substantial contribution to N Atlantic marine boundary layer (MBL) aerosol, with typically 60-90% of sulfate across the particle size range coming from anthropogenic sources, even if we analyse air that has spent an average of >120 h away from land. However, anthropogenic primary black carbon and organic carbon particles do not explain the large discrepancies in Aitken mode number. Several explanations for the discrepancy are suggested. The lack of lower atmospheric particle formation in the model may explain low N Atlantic particle concentrations. However, the observed and modelled particle persistence at Cape Grim in the Southern Ocean, does not

  17. Testing University Rankings Statistically: Why this Perhaps is not such a Good Idea after All. Some Reflections on Statistical Power, Effect Size, Random Sampling and Imaginary Populations

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2012-01-01

    In this paper we discuss and question the use of statistical significance tests in relation to university rankings as recently suggested. We outline the assumptions behind and interpretations of statistical significance tests and relate this to examples from the recent SCImago Institutions Ranking....... By use of statistical power analyses and demonstration of effect sizes, we emphasize that importance of empirical findings lies in “differences that make a difference” and not statistical significance tests per se. Finally we discuss the crucial assumption of randomness and question the presumption...... that randomness is present in the university ranking data. We conclude that the application of statistical significance tests in relation to university rankings, as recently advocated, is problematic and can be misleading....

  18. In situ measurement of the particle size distribution of the fragmentation product of laser-shock-melted aluminum using in-line picosecond holography

    Directory of Open Access Journals (Sweden)

    Ying-Hua Li

    2016-02-01

    Full Text Available The dynamic fragmentation of shock-melted metal is a topic of increasing interest in shock physics. However, high-quality experimental studies of the phenomenon are limited, and data that are essential for developing predictive models of the phenomenon, such as the mass and particle size distributions, are quite sparse. In-line holography is an effective non-contact technique for measuring particle size distribution, but critical technical requirements, in particular, particle density limits, complicate its application to the subject phenomenon. These challenges have been reasonably overcome in the present study, allowing for successful in situ measurements of the size distribution of the fragmentation product from laser-shock-melted aluminum. In this letter, we report on our experiments and present the measured data.

  19. Determination of the minimum size of a statistical representative volume element from a fibre-reinforced composite based on point pattern statistics

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Brøndsted, Povl

    2013-01-01

    In a previous study, Trias et al. [1] determined the minimum size of a statistical representative volume element (SRVE) of a unidirectional fibre-reinforced composite primarily based on numerical analyses of the stress/strain field. In continuation of this, the present study determines the minimum...

  20. Combined effects of grain size, flow volume and channel width on geophysical flow mobility: three-dimensional discrete element modeling of dry and dense flows of angular rock fragments

    Science.gov (United States)

    Cagnoli, Bruno; Piersanti, Antonio

    2017-02-01

    We have carried out new three-dimensional numerical simulations by using a discrete element method (DEM) to study the mobility of dry granular flows of angular rock fragments. These simulations are relevant for geophysical flows such as rock avalanches and pyroclastic flows. The model is validated by previous laboratory experiments. We confirm that (1) the finer the grain size, the larger the mobility of the center of mass of granular flows; (2) the smaller the flow volume, the larger the mobility of the center of mass of granular flows and (3) the wider the channel, the larger the mobility of the center of mass of granular flows. The grain size effect is due to the fact that finer grain size flows dissipate intrinsically less energy. This volume effect is the opposite of that experienced by the flow fronts. The original contribution of this paper consists of providing a comparison of the mobility of granular flows in six channels with a different cross section each. This results in a new scaling parameter χ that has the product of grain size and the cubic root of flow volume as the numerator and the product of channel width and flow length as the denominator. The linear correlation between the reciprocal of mobility and parameter χ is statistically highly significant. Parameter χ confirms that the mobility of the center of mass of granular flows is an increasing function of the ratio of the number of fragments per unit of flow mass to the total number of fragments in the flow. These are two characteristic numbers of particles whose effect on mobility is scale invariant.
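
    A small sketch of the scaling parameter and the linear fit described above; only the functional form of χ follows the abstract, and all numerical values are invented:

```python
# Sketch of the scaling parameter
#   chi = (grain_size * flow_volume**(1/3)) / (channel_width * flow_length),
# followed by a linear fit of 1/mobility against chi. All numbers are invented;
# only the functional form of chi follows the abstract.
import numpy as np

grain_size    = np.array([0.5, 0.5, 1.0, 1.0, 2.0, 2.0]) * 1e-3   # m
flow_volume   = np.array([1.0, 2.0, 1.0, 2.0, 1.0, 2.0]) * 1e-3   # m^3
channel_width = np.array([0.1, 0.1, 0.2, 0.2, 0.3, 0.3])          # m
flow_length   = np.array([0.8, 1.0, 0.9, 1.1, 1.0, 1.2])          # m
mobility      = np.array([2.1, 2.4, 1.9, 2.2, 1.6, 1.9])          # e.g. runout / drop height

chi = grain_size * np.cbrt(flow_volume) / (channel_width * flow_length)
slope, intercept = np.polyfit(chi, 1.0 / mobility, deg=1)
r = np.corrcoef(chi, 1.0 / mobility)[0, 1]
print(f"1/mobility ~ {slope:.3g} * chi + {intercept:.3g}  (r = {r:.2f})")
```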

  1. Neutral dynamics with environmental noise: Age-size statistics and species lifetimes

    Science.gov (United States)

    Kessler, David; Suweis, Samir; Formentin, Marco; Shnerb, Nadav M.

    2015-08-01

    Neutral dynamics, where taxa are assumed to be demographically equivalent and their abundance is governed solely by the stochasticity of the underlying birth-death process, has proved itself as an important minimal model that accounts for many empirical datasets in genetics and ecology. However, the restriction of the model to demographic [O (√{N }) ] noise yields relatively slow dynamics that appears to be in conflict with both short-term and long-term characteristics of the observed systems. Here we analyze two of these problems—age-size relationships and species extinction time—in the framework of a neutral theory with both demographic and environmental stochasticity. It turns out that environmentally induced variations of the demographic rates control the long-term dynamics and modify dramatically the predictions of the neutral theory with demographic noise only, yielding much better agreement with empirical data. We consider two prototypes of "zero mean" environmental noise, one which is balanced with regard to the arithmetic abundance, another balanced in the logarithmic (fitness) space, study their species lifetime statistics, and discuss their relevance to realistic models of community dynamics.

  2. An analogy of the size distribution of business firms with Bose-Einstein statistics

    Science.gov (United States)

    Hernández-Pérez, R.

    2010-09-01

    We approach the size distribution of business firms by proposing an analogy of the firms’ ranking with a boson gas, identifying the annual revenue of the firms with energy. We found that Bose-Einstein statistics fits very well to the empirical cumulative distribution function for the firms’ ranking for different countries. The fitted values for the temperature-like parameter are compared between countries and with an index of economic development, and we found that our results support the hypothesis that the temperature of the economy can be associated with the level of economic development of a country. Moreover, for most of the countries the value obtained for the fugacity-like parameter is close to 1, suggesting that the analogy could correspond to a photon gas in which the number of particles is not conserved; this is indeed the case for real-world firms’ dynamics, where new firms arrive in the economy and other firms disappear, either by merging with others or through bankruptcy.

  3. Statistical Analysis and Calculation Model of Flexibility Coefficient of Low- and Medium-Sized Arch Dam

    Directory of Open Access Journals (Sweden)

    Su Huaizhi

    2012-01-01

    Full Text Available The flexibility coefficient is widely used for the macro-evaluation of shape, safety, and economy of arch dams. However, its description has never reached a broad consensus. Based on a large number of relevant instance data, the relationship between influencing factors and the flexibility coefficient is analyzed by means of partial least-squares regression. The partial least-squares regression equation of the flexibility coefficient for dam heights between 30 m and 70 m is established. Regression precision and equation stability are further investigated, and an analytical model of the statistical flexibility coefficient is provided. A flexibility coefficient criterion is determined preliminarily to evaluate the shape of low- and medium-sized arch dams. A case study is finally presented to illustrate the potential engineering application. The partial least-squares regression analysis shows that there is a strong relationship between the flexibility coefficient and the average thickness of the dam, the thickness-height ratio of the crown cantilever, the arc-height ratio, and the dam height, whereas the effect of the rise-span ratio is relatively small. The factors considered in the proposed model are more comprehensive, and its scope of application is clearer than that of traditional calculation methods, making it better suited to analogy analysis in engineering design and to safety evaluation for arch dams.
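
    A minimal sketch of a partial least-squares regression of a flexibility-type coefficient on the five descriptors named above, using synthetic data rather than the instance data of the paper:

```python
# Sketch of a partial least-squares regression of a flexibility-type coefficient on
# dam descriptors (average thickness, thickness-height ratio, arc-height ratio,
# dam height, rise-span ratio). The data below are random placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
X = rng.normal(size=(40, 5))                        # 40 dams x 5 standardized predictors
true_coefs = np.array([0.8, 0.6, 0.5, 0.4, 0.05])   # rise-span ratio deliberately weak
y = X @ true_coefs + rng.normal(scale=0.3, size=40)

pls = PLSRegression(n_components=2)
pls.fit(X, y)
print("PLS coefficients:", pls.coef_.ravel().round(2))
print("R^2 on the fitted data:", round(pls.score(X, y), 2))
```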

  4. Statistical properties of the Green function in finite size for Anderson localization models with multifractal eigenvectors

    Science.gov (United States)

    Monthus, Cécile

    2017-03-01

    For Anderson localization models with multifractal eigenvectors on disordered samples containing N sites, we analyze in a unified framework the consequences for the statistical properties of the Green function. We focus in particular on the imaginary part of the Green function at coinciding points, G^I_xx(E - iη), and study the scaling with the size N of the moments of arbitrary indices q when the broadening follows the scaling η = c/N^δ. For the standard scaling regime δ = 1, we find in the two limits c ≪ 1 and c ≫ 1 that the moments are governed by the anomalous exponents Δ(q) of individual eigenfunctions, without the assumption of strong correlations between the weights of consecutive eigenstates at the same point. For the non-standard scaling regimes with 0 < δ < 1, the imaginary part of the Green function follows some Fréchet distribution in the typical region, while rare events are important to obtain the scaling of the moments. We describe the application to the case of Gaussian multifractality and to the case of linear multifractality.

  5. Gene flow and effective population sizes of the butterfly Maculinea alcon in a highly fragmented, anthropogenic landscape

    NARCIS (Netherlands)

    Vanden Broeck, An; Maes, Dirk; Kelager, Andreas; Wynhoff, Irma; Wallis de Vries, Michiel; Nash, David R.; Oostermeijer, J.G.B.; Dyck, van Hans; Mergeay, Joachim

    2017-01-01

    Understanding connectivity among populations in fragmented landscapes is of paramount importance in species conservation because it determines their long-term viability and helps to identify and prioritize populations to conserve. Rare and sedentary species are particularly vulnerable to habitat

  6. Homeopathy: statistical significance versus the sample size in experiments with Toxoplasma gondii

    Directory of Open Access Journals (Sweden)

    Ana Lúcia Falavigna Guilherme

    2011-09-01

    , examined in its full length. This study was approved by the Ethics Committee for animal experimentation of the UEM - Protocol 036/2009. The data were compared using the tests Mann Whitney and Bootstrap [7] with the statistical software BioStat 5.0. Results and discussion: There was no significant difference when analyzed with the Mann-Whitney, even multiplying the "n" ten times (p=0.0618. The number of cysts observed in BIOT 200DH group was 4.5 ± 3.3 and 12.8 ± 9.7 in the CONTROL group. Table 1 shows the results obtained using the bootstrap analysis for each data changed from 2n until 2n+5, and their respective p-values. With the inclusion of more elements in the different groups, tested one by one, randomly, increasing gradually the samples, we observed the sample size needed to statistically confirm the results seen experimentally. Using 17 mice in group BIOT 200DH and 19 in the CONTROL group we have already observed statistical significance. This result suggests that experiments involving highly diluted substances and infection of mice with T. gondii should work with experimental groups with 17 animals at least. Despite the current and relevant ethical discussions about the number of animals used for experimental procedures the number of animals involved in each experiment must meet the characteristics of each item to be studied. In the case of experiments involving highly diluted substances, experimental animal models are still rudimentary and the biological effects observed appear to be also individualized, as described in literature for homeopathy [8]. The fact that the statistical significance was achieved by increasing the sample observed in this trial, tell us about a rare event, with a strong individual behavior, difficult to demonstrate in a result set, treated simply with a comparison of means or medians. Conclusion: Bootstrap seems to be an interesting methodology for the analysis of data obtained from experiments with highly diluted
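
    A sketch of the bootstrap comparison described above: resample the two groups, compare the difference in mean cyst counts, and watch how the p-value behaves as the groups are enlarged. The counts only mimic the reported means and standard deviations and are not the original data:

```python
# Bootstrap comparison of mean cyst counts between a treated and a control group.
# The simulated counts only mimic the reported 4.5 +/- 3.3 vs 12.8 +/- 9.7 values;
# tiling the data is a crude stand-in for the stepwise enlargement of the samples.
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_p(x, y, n_boot=10_000):
    """Two-sided bootstrap p-value for a difference in means, resampling from the pooled data."""
    observed = y.mean() - x.mean()
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_boot):
        bx = rng.choice(pooled, size=len(x), replace=True)
        by = rng.choice(pooled, size=len(y), replace=True)
        if abs(by.mean() - bx.mean()) >= abs(observed):
            count += 1
    return count / n_boot

treated = rng.normal(4.5, 3.3, size=5).clip(min=0)
control = rng.normal(12.8, 9.7, size=5).clip(min=0)
for factor in (1, 2, 4):
    p = bootstrap_p(np.tile(treated, factor), np.tile(control, factor))
    print(f"n per group = {5 * factor:2d}: bootstrap p = {p:.3f}")
```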

  7. Map misclassifications can cause large errors in landscape pattern indices: examples from habitat fragmentation.

    Science.gov (United States)

    William T. Langford; Sarah E. Gergel; Thomas G. Dietterich; Warren. Cohen

    2006-01-01

    Although habitat fragmentation is one of the greatest threats to biodiversity worldwide, virtually no attention has been paid to the quantification of error in fragmentation statistics. Landscape pattern indices (LPIs), such as mean patch size and number of patches, are routinely used to quantify fragmentation and are often calculated using remote sensing imagery that...

  8. Probing the Concept of Statistical Independence of Intermediate-Mass Fragment Production in Heavy-Ion Collisions

    CERN Document Server

    Skulski, W; Schröder, W U

    1999-01-01

    It is found that the total IMF-transverse-energy (E_t) spectra in multi-IMF events are well represented by synthetic spectra obtained by folding of the single-IMF spectrum. Using the experimental IMF multiplicity distribution, the observed trends in the IMF multiplicity distribution for fixed values of E_t are reproduced. The synthetic distributions show binomial reducibility and Arrhenius-like scaling, similar to that reported in the literature. Similar results are obtained when the above folding-type synthesis is replaced with one based on mixing events with different IMF multiplicities. For statistically independent IMF emission, the observed binomial reducibility and Arrhenius-type scaling are merely reflections of the shape of the single-IMF transverse-energy spectrum. Hence, a valid interpretation of IMF distributions in terms of a particular production scenario has to explain independently the observed shape of the single-IMF E_t spectrum.
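
    A sketch of the folding construction: the E_t spectrum for events with n IMFs is approximated by the n-fold convolution of the single-IMF spectrum, and the folds are then mixed with the multiplicity distribution. The single-IMF spectrum and multiplicities here are toy inputs, not the experimental ones:

```python
# Synthetic total transverse-energy spectrum built by folding a single-IMF spectrum.
# Both the single-IMF spectrum and the multiplicity probabilities are invented.
import numpy as np

de = 1.0                                        # MeV per bin
e = np.arange(0, 200, de)
single = np.exp(-e / 20.0)                      # toy single-IMF E_t spectrum
single /= single.sum() * de                     # normalize to unit area

p_n = {1: 0.5, 2: 0.3, 3: 0.15, 4: 0.05}        # toy IMF multiplicity distribution

folds = {1: single}
for n in range(2, max(p_n) + 1):
    folds[n] = np.convolve(folds[n - 1], single)[: len(e)] * de   # n-fold convolution

synthetic = sum(p * folds[n] for n, p in p_n.items())             # multiplicity-weighted sum
print("synthetic spectrum area ~", round(synthetic.sum() * de, 3))
```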

  9. Rapid optimization of antibotulinum toxin antibody fragment production by an integral approach utilizing RC-SELDI mass spectrometry and statistical design.

    Science.gov (United States)

    Park, Jun T; Bradbury, Lisa; Kragl, Frank J; Lukens, Dennis C; Valdes, James J

    2006-01-01

    A process for the rapid development and optimization of the fermentation process for an antibotulinum neurotoxin antibody fragment (bt-Fab) production expressed in Escherichia coli was achieved via a high-throughput process proteomics and statistical experimental design. This process, using retentate chromatography-surface enhanced laser desorption/ionization mass spectrometry (RC-SELDI MS), was employed for identifying and quantifying bt-Fab antibody in complex biological samples for the optimization of microbial fermentation conditions. Five variables (type of culture media, glycerol concentration, post-induction temperature, IPTG concentration, and incubation time after induction) were statistically combined using an experimental 2(5)(-1) fractional factorial design and tested for their effects on maximal bt-Fab antibody production. When the effects of individual variables and their interactions were assessed, type of media and post-induction temperature showed statistically significant increase in yield of the fermentation process for the maximal bt-Fab antibody production. This study establishes an integral approach as a valuable tool for the rapid development of manufacturing processes for producing various biological materials. To verify the RC-SELDI MS method, a Fab-specific immuno-affinity HPLC assay developed here was also employed for the quantification of the bt-Fab antibody in crude lysate samples obtained during the fermentation optimization process. Similar results were obtained.
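
    A sketch of how a 2^(5-1) fractional factorial design of the kind described above can be generated, with the fifth factor defined by the generator E = ABCD; the factor names are only illustrative:

```python
# 2**(5-1) fractional factorial design: a full 2**4 design in four factors, with the
# fifth level set to the product of the other four (defining relation I = ABCDE).
# Factor names follow the variables listed in the abstract but are illustrative.
from itertools import product

factors = ["media", "glycerol", "temperature", "IPTG", "induction_time"]
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d                      # generator E = ABCD
    runs.append((a, b, c, d, e))

print(" ".join(f"{f:>15}" for f in factors))
for run in runs:                           # 16 runs instead of the full 32
    print(" ".join(f"{level:>15d}" for level in run))
```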

  10. Determining sexual dimorphism in frog measurement data: integration of statistical significance, measurement error, effect size and biological significance

    Directory of Open Access Journals (Sweden)

    Hayek Lee-Ann C.

    2005-01-01

    Full Text Available Several analytic techniques have been used to determine sexual dimorphism in vertebrate morphological measurement data with no emergent consensus on which technique is superior. A further confounding problem for frog data is the existence of considerable measurement error. To determine dimorphism, we examine a single hypothesis (H0 = equal means) for two groups (females and males). We demonstrate that frog measurement data meet assumptions for clearly defined statistical hypothesis testing with statistical linear models rather than those of exploratory multivariate techniques such as principal components, correlation or correspondence analysis. In order to distinguish biological from statistical significance of hypotheses, we propose a new protocol that incorporates measurement error and effect size. Measurement error is evaluated with a novel measurement error index. Effect size, widely used in the behavioral sciences and in meta-analysis studies in biology, proves to be the most useful single metric to evaluate whether statistically significant results are biologically meaningful. Definitions for a range of small, medium, and large effect sizes specifically for frog measurement data are provided. Examples with measurement data for species of the frog genus Leptodactylus are presented. The new protocol is recommended not only to evaluate sexual dimorphism for frog data but for any animal measurement data for which the measurement error index and observed or a priori effect sizes can be calculated.

  11. A Visitor's Guide to Effect Sizes--Statistical Significance versus Practical (Clinical) Importance of Research Findings

    Science.gov (United States)

    Hojat, Mohammadreza; Xu, Gang

    2004-01-01

    Effect Sizes (ES) are an increasingly important index used to quantify the degree of practical significance of study results. This paper gives an introduction to the computation and interpretation of effect sizes from the perspective of the consumer of the research literature. The key points made are: (1) "ES" is a useful indicator of the…

  12. Simulation of natural fragmentation of rings cut from warheads

    Directory of Open Access Journals (Sweden)

    John F. Moxnes

    2015-12-01

    Full Text Available Natural fragmentation of warheads that detonate causes the casing of the warhead to split into variously sized fragments through shear or radial fractures, depending on the toughness, density, and grain size of the material. The best known formula for the prediction of the size distribution is the Mott formula, which was further examined by Grady and Kipp by investigating more carefully the statistically most random way of partitioning a given area into a number of entities. We examine the fragmentation behavior of radially expanding steel rings cut from a 25 mm warhead by using an in-house smoothed particle hydrodynamics (SPH) simulation code called REGULUS. Experimental results were compared with numerical results applying varying particle size and stochastic fracture strain. The numerically obtained number of fragments was consistent with experimental results. Increasing the expansion velocity of the rings increases the number of fragments. Statistical variation of the material parameters influences the fragment characteristics, especially for low expansion velocities. A least-squares regression fit to the cumulative number of fragments by applying a generalized Mott distribution shows that the shape parameter is around 4 for the rings, which is in contrast to the Mott distribution with a shape parameter of ½. For initially polar-distributed particles, we see signs of a bimodal cumulative fragment distribution. Adding statistical variation in the material parameters of the fracture model causes the numerical velocity solutions to become less sensitive to changes in resolution for Cartesian-distributed particles.
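
    A sketch of a least-squares fit of a generalized Mott (Weibull-type) cumulative distribution to fragment data, analogous to the regression described above; the fragment masses are synthetic and drawn with a shape parameter of 4, so the fit should recover a value near 4:

```python
# Least-squares fit of a generalized Mott form, N(>m) = N_tot * exp(-(m/mu)**beta),
# to a cumulative fragment-count curve. The fragment masses below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)
masses = 0.2 * rng.weibull(4.0, size=300)       # synthetic fragment masses (g)

m_sorted = np.sort(masses)
n_greater = np.arange(len(m_sorted), 0, -1)     # cumulative number of fragments heavier than m

def gen_mott(m, n_tot, mu, beta):
    return n_tot * np.exp(-(m / mu) ** beta)

popt, _ = curve_fit(gen_mott, m_sorted, n_greater, p0=[len(masses), 0.2, 1.0])
print("N_tot = {:.0f}, mu = {:.3f} g, shape parameter = {:.2f}".format(*popt))
```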

  13. A novel statistical method to estimate the effective SNP size in vertebrate genomes and categorized genomic regions

    Directory of Open Access Journals (Sweden)

    Zhao Zhongming

    2006-12-01

    Full Text Available Abstract Background The local environment of single nucleotide polymorphisms (SNPs contains abundant genetic information for the study of mechanisms of mutation, genome evolution, and causes of diseases. Recent studies revealed that neighboring-nucleotide biases on SNPs were strong and the genome-wide bias patterns could be represented by a small subset of the total SNPs. It remains unsolved for the estimation of the effective SNP size, the number of SNPs that are sufficient to represent the bias patterns observed from the whole SNP data. Results To estimate the effective SNP size, we developed a novel statistical method, SNPKS, which considers both the statistical and biological significances. SNPKS consists of two major steps: to obtain an initial effective size by the Kolmogorov-Smirnov test (KS test and to find an intermediate effective size by interval evaluation. The SNPKS algorithm was implemented in computer programs and applied to the real SNP data. The effective SNP size was estimated to be 38,200, 39,300, 38,000, and 38,700 in the human, chimpanzee, dog, and mouse genomes, respectively, and 39,100, 39,600, 39,200, and 42,200 in human intergenic, genic, intronic, and CpG island regions, respectively. Conclusion SNPKS is the first statistical method to estimate the effective SNP size. It runs efficiently and greatly outperforms the algorithm implemented in SNPNB. The application of SNPKS to the real SNP data revealed the similar small effective SNP size (38,000 – 42,200 in the human, chimpanzee, dog, and mouse genomes as well as in human genomic regions. The findings suggest strong influence of genetic factors across vertebrate genomes.
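
    A sketch of the Kolmogorov-Smirnov step of this idea: compare a per-SNP statistic computed on a random subset against the full set, and grow the subset until the two distributions become indistinguishable. The statistic values are simulated, and this is only a rough analogue of the KS stage, not the SNPKS algorithm itself:

```python
# Two-sample KS comparison between a random SNP subset and the full SNP set,
# as a crude stand-in for the first (KS test) stage described above.
# The per-SNP "bias" values are simulated, not real data.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(5)
full_set = rng.beta(2.0, 5.0, size=200_000)     # stand-in per-SNP bias statistic

for n in (1_000, 10_000, 40_000):
    subset = rng.choice(full_set, size=n, replace=False)
    stat, p = ks_2samp(subset, full_set)
    print(f"n = {n:6d}: KS statistic = {stat:.4f}, p = {p:.3f}")
```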

  14. Nanobubble Fragmentation and Bubble-Free-Channel Shear Localization in Helium-Irradiated Submicron-Sized Copper.

    Science.gov (United States)

    Ding, Ming-Shuai; Tian, Lin; Han, Wei-Zhong; Li, Ju; Ma, Evan; Shan, Zhi-Wei

    2016-11-18

    Helium bubbles are one of the typical radiation microstructures in metals and alloys, significantly influencing their deformation behavior. However, the dynamic evolution of helium bubbles under straining is less explored so far. Here, by using in situ micromechanical testing inside a transmission electron microscope, we discover that the helium bubble not only can coalesce with adjacent bubbles, but also can split into several nanoscale bubbles under tension. Alignment of the splittings along a slip line can create a bubble-free channel, which appears softer, promotes shear localization, and accelerates the failure in the shearing-off mode. Detailed analyses unveil that the unexpected bubble fragmentation is mediated by the combination of dislocation cutting and internal surface diffusion, which is an alternative microdamage mechanism of helium irradiated copper besides the bubble coalescence.

  15. Chemical and statistical interpretation of sized aerosol particles collected at an urban site in Thessaloniki, Greece.

    Science.gov (United States)

    Tsitouridou, Roxani; Papazova, Petia; Simeonova, Pavlina; Simeonov, Vasil

    2013-01-01

    The size distribution of aerosol particles (PM0.015-PM18) in relation to their soluble inorganic species and total water soluble organic compounds (WSOC) was investigated at an urban site of Thessaloniki, Northern Greece. The sampling period was from February to July 2007. The determined compounds were compared with mass concentrations of the PM fractions for nano (N: 0.015 pollution were identified and an attempt is made to find patterns of similarity between the different sized aerosols and the seasons of monitoring. It was proven that several major latent factors are responsible for the data structure despite the size of the aerosols - mineral (soil) dust, sea sprays, secondary emissions, combustion sources and industrial impact. The seasonal separation proved to be not very specific.

  16. Nuclear energy release from fragmentation

    CERN Document Server

    Li, Cheng; Tsang, M B; Zhang, Feng-Shou

    2015-01-01

    Nuclear energy released by splitting uranium and thorium isotopes into two, three, four, and up to eight fragments of nearly equal size is studied. We found that the energy released by splitting the $^{235,238}$U and $^{230,232}$Th nuclei into three equal fragments is the largest. The statistical multifragmentation model is employed to calculate the probability of different breakup channels for the excited nuclei. Weighting the probability distributions of fragment multiplicity at different excitation energies for the $^{238}$U nucleus, we found that an excitation energy between 1.2 and 2 MeV/u is optimal for the $^{235}$U, $^{238}$U, $^{230}$Th and $^{232}$Th nuclei to release nuclear energy of about 0.7-0.75 MeV/u.

  17. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K;

    2016-01-01

    for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately...... based on linear-regression association coefficients. We estimate the polygenicity of schizophrenia to be 0.037 and the putamen to be 0.001, while the respective sample sizes required to approach fully explaining the chip heritability are 10(6) and 10(5). The model can be extended to incorporate prior...

  18. Skew-Laplace and Cell-Size Distribution in Microbial Axenic Cultures: Statistical Assessment and Biological Interpretation

    Directory of Open Access Journals (Sweden)

    Olga Julià

    2010-01-01

    Full Text Available We report a skew-Laplace statistical analysis of both flow cytometry scatters and cell size from microbial strains primarily grown in batch cultures, others in chemostat cultures and bacterial aquatic populations. Cytometry scatters best fit the skew-Laplace distribution while cell size as assessed by an electronic particle analyzer exhibited a moderate fitting. Unlike the cultures, the aquatic bacterial communities clearly do not fit to a skew-Laplace distribution. Due to its versatile nature, the skew-Laplace distribution approach offers an easy, efficient, and powerful tool for distribution of frequency analysis in tandem with the flow cytometric cell sorting.
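
    A sketch of fitting an asymmetric (skew) Laplace model to log-transformed scatter signals, using scipy.stats.laplace_asymmetric (available in recent SciPy versions) as a stand-in for the skew-Laplace distribution of the paper; the data are simulated:

```python
# Fit of an asymmetric Laplace distribution to simulated log-scatter signals,
# followed by a quick KS goodness-of-fit check. This uses scipy's
# laplace_asymmetric as a stand-in for the skew-Laplace model of the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
log_scatter = stats.laplace_asymmetric.rvs(kappa=1.4, loc=2.0, scale=0.3,
                                           size=5000, random_state=rng)

kappa, loc, scale = stats.laplace_asymmetric.fit(log_scatter)
print(f"kappa = {kappa:.2f}, location = {loc:.2f}, scale = {scale:.2f}")

ks_stat, p = stats.kstest(log_scatter, "laplace_asymmetric", args=(kappa, loc, scale))
print(f"KS statistic = {ks_stat:.3f}, p = {p:.2f}")
```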

  19. Typical kernel size and number of sparse random matrices over GF(q) - a statistical physics approach

    OpenAIRE

    Alamino, Roberto C.; Saad, David

    2008-01-01

    Using methods of statistical physics, we study the average number and kernel size of general sparse random matrices over GF(q), with a given connectivity profile, in the thermodynamical limit of large matrices. We introduce a mapping of $GF(q)$ matrices onto spin systems using the representation of the cyclic group of order q as the q-th complex roots of unity. This representation facilitates the derivation of the average kernel size of random matrices using the replica approach, under the re...

  20. Designs and Methods for Association Studies and Population Size Inference in Statistical Genetics

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    2016-01-01

    . Population genetics In population genetics two methods concerning the inference of the population size back in time are described. Both methods are based on the site frequency spectrum (SFS), and the fact that the expected SFS only depends on the time between coalescent events back in time. The first...

  1. Critique and Improvement of CL Common Language Effect Size Statistics of McGraw and Wong.

    Science.gov (United States)

    Vargha, Andras; Delaney, Harold D.

    2000-01-01

    Proposes a new generalization of the CL index of effect size proposed by K. McGraw and S. Wong (1992) called the "A" measure of stochastic superiority. Provides exact methods for point and interval estimation and significance tests of the A=0.5 hypothesis, as well as new generalizations of CL for the multigroup and correlated samples…
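    For concreteness, a minimal sketch of the point estimate of the A measure of stochastic superiority follows; the interval-estimation and testing procedures proposed in the paper are not reproduced, and the two small samples are invented.

```python
# A = P(X > Y) + 0.5 * P(X = Y), estimated from two samples.
# A = 0.5 corresponds to stochastic equality of the two groups.
import numpy as np

def vargha_delaney_a(x, y):
    x = np.asarray(x, dtype=float)[:, None]
    y = np.asarray(y, dtype=float)[None, :]
    greater = (x > y).sum()          # pairs where x exceeds y
    ties = (x == y).sum()            # tied pairs count half
    return (greater + 0.5 * ties) / (x.size * y.size)

if __name__ == "__main__":
    treated = [6.1, 5.8, 7.2, 6.9, 6.4]
    control = [5.2, 5.9, 5.5, 6.0, 5.1]
    print("A =", vargha_delaney_a(treated, control))   # > 0.5: treated tends higher
```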

  2. Statistical tests with accurate size and power for balanced linear mixed models.

    Science.gov (United States)

    Muller, Keith E; Edwards, Lloyd J; Simpson, Sean L; Taylor, Douglas J

    2007-08-30

    The convenience of linear mixed models for Gaussian data has led to their widespread use. Unfortunately, standard mixed model tests often have greatly inflated test size in small samples. Many applications with correlated outcomes in medical imaging and other fields have simple properties which do not require the generality of a mixed model. Alternately, stating the special cases as a general linear multivariate model allows analysing them with either the univariate or multivariate approach to repeated measures (UNIREP, MULTIREP). Even in small samples, an appropriate UNIREP or MULTIREP test always controls test size and has a good power approximation, in sharp contrast to mixed model tests. Hence, mixed model tests should never be used when one of the UNIREP tests (uncorrected, Huynh-Feldt, Geisser-Greenhouse, Box conservative) or MULTIREP tests (Wilks, Hotelling-Lawley, Roy's, Pillai-Bartlett) apply. Convenient methods give exact power for the uncorrected and Box conservative tests. Simulations demonstrate that new power approximations for all four UNIREP tests eliminate most inaccuracy in existing methods. In turn, free software implements the approximations to give a better choice of sample size. Two repeated measures power analyses illustrate the methods. The examples highlight the advantages of examining the entire response surface of power as a function of sample size, mean differences, and variability.

  3. Rock Fragmentation Statistics Program Based on Image Processing Technology%基于图像处理的岩体块度分析系统

    Institute of Scientific and Technical Information of China (English)

    吕林; 尹君; 胡振襄

    2011-01-01

    A rock image segmentation model was established and, based on computer image processing technology, a reliable, simple and highly efficient rock fragmentation analysis system was developed in the MATLAB environment. The system handles acquisition of the original rock image, derivation of the image resolution, image preprocessing, separation of merged rock blocks, calculation of rock block geometry and generation of the size classification curve. Test results show that the system achieves high measurement precision, meets practical application requirements, and is of significant value for the optimization of blasting design.
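    A minimal Python analogue of this kind of image-based fragmentation measurement is sketched below using scikit-image (the system described above is MATLAB-based); the image path, the marker spacing and the pixel-to-millimetre scale are placeholders.

```python
# Distance-transform watershed segmentation of a rock-pile photograph,
# reporting equivalent fragment diameters.  File name and scale are dummies.
import numpy as np
from scipy import ndimage as ndi
from skimage import io, color, filters, feature, segmentation, measure

MM_PER_PIXEL = 0.5                                      # hypothetical resolution

img = color.rgb2gray(io.imread("rock_pile.jpg"))        # placeholder path
binary = img > filters.threshold_otsu(img)              # rock vs. background
distance = ndi.distance_transform_edt(binary)
peaks = feature.peak_local_max(distance, min_distance=15, labels=binary)
markers = np.zeros(distance.shape, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
labels = segmentation.watershed(-distance, markers, mask=binary)  # split touching rocks

areas_px = np.array([r.area for r in measure.regionprops(labels)])
diam_mm = np.sqrt(4.0 * areas_px / np.pi) * MM_PER_PIXEL   # equivalent diameters
print("fragments:", diam_mm.size)
print("median equivalent diameter [mm]:", np.median(diam_mm))
```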

  4. Statistical modelling of wildfire size and intensity: a step toward meteorological forecasting of summer extreme fire risk

    OpenAIRE

    Hernandez, C; Keribin, C.; Drobinski, P.; Turquety, S.

    2015-01-01

    In this article we investigate the use of statistical methods for wildfire risk assessment in the Mediterranean Basin using three meteorological covariates, the 2 m temperature anomaly, the 10 m wind speed and the January–June rainfall occurrence anomaly. We focus on two remotely sensed characteristic fire variables, the burnt area (BA) and the fire radiative power (FRP), which are good proxies for fire size and intensity respectively. Using the fire data we determine...

  5. Battery sizing for a stand alone passive wind system using statistical techniques

    OpenAIRE

    Belouda, Malek; Belhadj, Jamel; Sareni, Bruno; Roboam, Xavier

    2011-01-01

    In this paper, an original optimization method to jointly determine a reduced study term and an optimum battery sizing is investigated. This storage device is used to connect a passive wind turbine system with a stand alone network. A Weibull probability density function is used to generate different wind speed data. The passive wind system is composed of a wind turbine, a permanent magnet synchronous generator feeding a diode rectifier associated with a very low voltage DC battery bus. This ...
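    The sketch below illustrates the basic idea of statistical battery sizing with Weibull-generated wind data: a synthetic wind-speed series drives a generic turbine power curve, and the battery is sized from the largest cumulative energy deficit seen by a constant load. The power curve and all numerical values are hypothetical and much simpler than the passive wind-system model optimised in the paper.

```python
# Illustrative battery sizing from Weibull-distributed hourly wind speeds.
import numpy as np

rng = np.random.default_rng(0)
k, c = 2.0, 7.0                          # Weibull shape and scale [m/s] (assumed)
hours = 24 * 365
wind = c * rng.weibull(k, hours)         # synthetic wind-speed series

def turbine_power(v, rated_kw=5.0, v_cut_in=3.0, v_rated=11.0, v_cut_out=25.0):
    """Very simple generic power curve (cubic ramp between cut-in and rated)."""
    p = np.zeros_like(v)
    ramp = (v >= v_cut_in) & (v < v_rated)
    p[ramp] = rated_kw * ((v[ramp] - v_cut_in) / (v_rated - v_cut_in)) ** 3
    p[(v >= v_rated) & (v <= v_cut_out)] = rated_kw
    return p

load_kw = 1.5                                   # constant stand-alone load
net = turbine_power(wind) - load_kw             # kWh per one-hour step
state = np.cumsum(net)
# battery must bridge the deepest dip below any previous high-water mark
capacity_kwh = np.max(np.maximum.accumulate(state) - state)
print(f"required battery capacity ~ {capacity_kwh:.1f} kWh")
```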

  6. Mechanics based statistical prediction of structure size and geometry effects on safety factors for composites and other quasibrittle materials

    Directory of Open Access Journals (Sweden)

    Bažant Zdeněk P.

    2008-01-01

    Full Text Available The objective of this paper is a rational determination of safety factors of quasibrittle structures, taking into account their size and shape. To this end, it is necessary to establish the probability density distribution function (pdf) of the structural strength. For perfectly ductile and perfectly brittle materials, the proper pdf's of the nominal strength of structure are known to be Gaussian and Weibullian, respectively, and are invariable with structure size and geometry. However, for quasibrittle materials, many of which came recently to the forefront of attention, the pdf has recently been shown to depend on structure size and geometry, varying gradually from a Gaussian pdf with a remote Weibull tail at small sizes to a fully Weibull pdf at large sizes. This recent result is reviewed, and then mathematically extended in two ways: (1) to a mathematical description of structural lifetime as a function of applied (time-invariable) nominal stress, and (2) to a mathematical description of the statistical parameters of the pdf of structural strength as a function of structure size and shape. Experimental verification and calibration is relegated to a subsequent journal article.

  7. A new statistical scission-point model fed with microscopic ingredients to predict fission fragments distributions; Developpement d'un nouveau modele de point de scission base sur des ingredients microscopiques

    Energy Technology Data Exchange (ETDEWEB)

    Heinrich, S

    2006-07-01

    Nucleus fission is a very complex phenomenon and, even nowadays, no realistic models describing the overall process are available. The work presented here deals with a theoretical description of fission fragment distributions in mass, charge, energy and deformation. We have reconsidered and updated the B.D. Wilkins scission-point model. Our purpose was to test whether this statistical model, applied at the scission point and fed with results of modern microscopic calculations, allows a quantitative description of the fission fragment distributions. We calculate the energy available at the scission point as a function of the fragment deformations. This energy surface is obtained from a Hartree-Fock-Bogoliubov microscopic calculation, which guarantees a realistic description of the potential's dependence on deformation for each fragment. The statistical balance is described by the level densities of the fragments. We have tried to avoid as much as possible the input of empirical parameters in the model. Our only parameter, the distance between the fragments at the scission point, is discussed by comparison with scission configurations obtained from fully dynamical microscopic calculations. The comparison between our results and experimental data is very satisfying and allows us to discuss the successes and limitations of our approach. We finally propose ideas to improve the model, in particular by applying dynamical corrections. (author)

  8. Heavy metal speciation in various grain sizes of industrially contaminated street dust using multivariate statistical analysis.

    Science.gov (United States)

    Yıldırım, Gülşen; Tokalıoğlu, Şerife

    2016-02-01

    A total of 36 street dust samples were collected from the streets of the Organised Industrial District in Kayseri, Turkey. This region includes a total of 818 work places in various industrial areas. The modified BCR (the European Community Bureau of Reference) sequential extraction procedure was applied to evaluate the mobility and bioavailability of trace elements (Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb and Zn) in street dusts of the study area. The BCR was classified into three steps: water/acid soluble fraction, reducible and oxidisable fraction. The remaining residue was dissolved by using aqua regia. The concentrations of the metals in street dust samples were determined by flame atomic absorption spectrometry. Also the effect of the different grain sizes (Cu (48.9)>Pb (42.8)=Cr (42.1)>Ni (41.4)>Zn (40.9)>Co (36.6)=Mn (36.3)>Fe (3.1). No significant difference was observed among metal partitioning for the three particle sizes. Correlation, principal component and cluster analysis were applied to identify probable natural and anthropogenic sources in the region. The principal component analysis results showed that this industrial district was influenced by traffic, industrial activities, air-borne emissions and natural sources. The accuracy of the results was checked by analysis of both the BCR-701 certified reference material and by recovery studies in street dust samples.

  9. Designs and Methods for Association Studies and Population Size Inference in Statistical Genetics

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    diseases. In the second part, statistical methods for inferring population history are discussed. Knowledge on e.g. the common ancestor of the human species, possible bottlenecks back in time, and the expected number of rare variants in each genome, may be factors in the full picture of any disease aetiology....... Epidemiology In epidemiology the wording "odds ratio" is used for the estimator of any case-control study independent of the sampling of the controls. This phrase is ambiguous without specifications of the sampling schemes of the controls. When controls are sampled among the non-diseased individuals at the end......). The OR is interpreted as the effect of an exposure on the probability of being diseased at the end of follow-up, while the interpretation of the IRR is the effect of an exposure on the probability of becoming diseased. Through a simulation study, the OR from a classical case-control study is shown to be an inconsistent...

  10. Behavioral and physiological responses to subgroup size and number of people in howler monkeys inhabiting a forest fragment used for nature-based tourism.

    Science.gov (United States)

    Aguilar-Melo, Adriana R; Andresen, Ellen; Cristóbal-Azkarate, Jurgi; Arroyo-Rodríguez, Victor; Chavira, Roberto; Schondube, Jorge; Serio-Silva, Juan Carlos; Cuarón, Alfredo D

    2013-11-01

    Animals' responses to potentially threatening factors can provide important information for their conservation. Group size and human presence are potentially threatening factors to primates inhabiting small reserves used for recreation. We tested these hypotheses by evaluating behavioral and physiological responses in two groups of mantled howler monkeys (Alouatta palliata mexicana) at the "Centro Ecológico y Recreativo El Zapotal", a recreational forest reserve and zoo located in the Mexican state of Chiapas. Both groups presented fission-fusion dynamics, splitting into foraging subgroups which varied in size among, but not within days. Neither subgroup size nor number of people had an effect on fecal cortisol. Out of 16 behavioral response variables tested, the studied factors had effects on six: four were affected by subgroup size and two were affected by number of people. With increasing subgroup size, monkeys increased daily path lengths, rested less, increased foraging effort, and used more plant individuals for feeding. As the number of people increased, monkeys spent more time in lower-quality habitat, and less time engaged in social interactions. Although fecal cortisol levels were not affected by the factors studied, one of the monkey groups had almost twice the level of cortisol compared to the other group. The group with higher cortisol levels also spent significantly more time in the lower-quality habitat, compared to the other group. Our results suggest that particular behavioral adjustments might allow howler monkeys at El Zapotal to avoid physiological stress due to subgroup size and number of people. However, the fact that one of the monkey groups is showing increased cortisol levels may be interpreted as a warning sign, indicating that an adjustment threshold is being reached, at least for part of the howler monkey population in this forest fragment.

  11. A statistical mechanical model for drug release: Investigations on size and porosity dependence

    Science.gov (United States)

    Gomes Filho, Márcio Sampaio; Oliveira, Fernando Albuquerque; Barbosa, Marco Aurélio Alves

    2016-10-01

    A lattice gas model is proposed for investigating the release of drug molecules in capsules covered with semi-permeable membranes. Release patterns in one and two dimensional systems are obtained with Monte Carlo simulations and adjusted to the semi-empirical Weibull distribution function. An analytical solution to the diffusion equation is used to complement and guide simulations in one dimension. Size and porosity dependence analysis was made on the two semi-empirical parameters of the Weibull function, which are related to characteristic time and release mechanism, and our results indicate that a simple scaling law occurs only for systems with almost impermeable membranes, represented in our model by capsules with a single leaking site.
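    A minimal sketch of the Weibull adjustment step described above follows; the release data are synthetic stand-ins for the Monte Carlo curves, and the two fitted parameters play the roles of characteristic time and release-mechanism exponent.

```python
# Fit a cumulative-release curve to the two-parameter Weibull form
# F(t) = 1 - exp(-(t/tau)**b).  Synthetic data for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, tau, b):
    return 1.0 - np.exp(-(t / tau) ** b)

t = np.linspace(0.1, 50.0, 60)
rng = np.random.default_rng(3)
released = weibull_release(t, tau=12.0, b=0.8) + rng.normal(0, 0.01, t.size)

(tau_hat, b_hat), _ = curve_fit(weibull_release, t, released, p0=(10.0, 1.0))
print(f"tau ~ {tau_hat:.2f} (characteristic time), b ~ {b_hat:.2f} (release mechanism)")
```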

  12. Distribution of the two-sample t-test statistic following blinded sample size re-estimation.

    Science.gov (United States)

    Lu, Kaifeng

    2016-05-01

    We consider the blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for the evaluation of the probability of rejecting the null hypothesis at given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margin for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
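    The essence of the blinded re-estimation step can be sketched as follows: the one-sample variance of the pooled interim data (treatment labels hidden) is inserted into the standard normal-approximation sample-size formula for a two-sample comparison. The numbers, and the use of the normal approximation rather than the exact t-distribution treated in the paper, are simplifications.

```python
# Blinded sample-size re-estimation sketch (normal approximation).
import numpy as np
from scipy.stats import norm

def blinded_n_per_arm(interim_values, delta, alpha=0.05, power=0.90):
    s2 = np.var(interim_values, ddof=1)          # blinded one-sample variance
    # (this variance slightly overestimates sigma^2 when a true effect exists)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return int(np.ceil(2 * z ** 2 * s2 / delta ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    pilot = rng.normal(0.25, 1.1, size=60)       # internal pilot, labels hidden
    print("re-estimated n per arm:", blinded_n_per_arm(pilot, delta=0.5))
```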

  13. Experimental and Statistical Evaluation of the Size Effect on the Bending Strength of Dimension Lumber of Northeast China Larch

    Directory of Open Access Journals (Sweden)

    Yong Zhong

    2016-01-01

    Full Text Available This study investigated the size effect on the bending strength (modulus of rupture, MOR) of dimension lumber of Northeast China larch (Larix gmelinii), providing a basis for further application in light wood-frame construction. Experimental and statistical evaluations were conducted on the bending strength. A total of 2409 full-size dimension lumber samples, covering three different sizes (2 × 3, 2 × 4, and 2 × 6), were tested in static bending. Results indicate that size has a significant effect on the MOR. Both the chi-square (χ2) and Kolmogorov-Smirnov (K-S) test results show that the lognormal distribution generally fits the MOR better than the normal distribution does. Additionally, the effects of the partial safety factor (γR) and the live-to-dead load ratio (ρ) were studied by reliability analysis. Reliability analysis results indicate that the reliability index increases nonlinearly as γR decreases and ρ rises. Finally, the design value of bending strength and its size-effect adjustment factor for 2 × 3, 2 × 4, and 2 × 6 larch dimension lumber were obtained according to the reliability-index requirements of the Chinese National Standards.

  14. The use of summary statistics for sample size allocation for food composition surveys and an application to the potato group.

    Science.gov (United States)

    Tsukakoshi, Yoshiki; Yasui, Akemi

    2011-11-01

    To give a quantitative guide to sample size allocation for developing sampling designs for a food composition survey, we discuss sampling strategies that consider the importance of each food, namely its consumption or production, the variability of its composition, and the restrictions imposed by the resources available for sample collection and analysis. Two strategies, 'proportional' and 'Neyman', are considered, and we review some available statistics for allocation issues; both incorporate the consumed quantity of foods. The Neyman optimal strategy allocates a smaller sample size to starch than the proportional strategy, because the former incorporates the variability in composition. Both strategies improved the accuracy of dietary nutrient intake estimates more than equal sample size allocation. These strategies will be useful, as we often face sample size allocation problems wherein we must decide whether to sample, say, five white potatoes and five taros or nine white potatoes and one taro. Allocating sufficient sample size to important foodstuffs is essential for assuring data quality. Nevertheless, the food composition table should be as comprehensive as possible.
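    A minimal sketch of the two allocation rules follows; the weights and compositional standard deviations are invented to echo the potato/taro example, not taken from the survey data.

```python
# Proportional vs. Neyman allocation of a fixed number of samples.
# W_h: relative importance (e.g. consumption share); S_h: compositional SD.
import numpy as np

def proportional_allocation(n_total, W):
    W = np.asarray(W, dtype=float)
    return n_total * W / W.sum()

def neyman_allocation(n_total, W, S):
    W, S = np.asarray(W, dtype=float), np.asarray(S, dtype=float)
    return n_total * (W * S) / (W * S).sum()

foods = ["white potato", "taro", "sweet potato"]
W = [0.70, 0.10, 0.20]        # consumption weights (hypothetical)
S = [1.0, 3.0, 2.0]           # compositional SDs (hypothetical)
for name, n_p, n_n in zip(foods, proportional_allocation(10, W),
                          neyman_allocation(10, W, S)):
    print(f"{name:13s} proportional {n_p:4.1f}  Neyman {n_n:4.1f}")
```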

  15. A statistical representation of the cosmological constant from finite size effects at the apparent horizon

    CERN Document Server

    Viaggiu, Stefano

    2016-01-01

    In this paper we present a statistical description of the cosmological constant in terms of massless bosons (gravitons). To this purpose, we use our recent results implying a non vanishing temperature $T_{\Lambda}$ for the cosmological constant. In particular, we found that a non vanishing $T_{\Lambda}$ allows us to depict the cosmological constant $\Lambda$ as composed of elementary oscillations of massless bosons of energy $\hbar\omega$ by means of the Bose-Einstein distribution. In this context, as happens for photons in a medium, the effective phase velocity $v_g$ of these massless excitations is not given by the speed of light $c$ but it is suppressed by a factor depending on the number of quanta present in the universe at the apparent horizon. We found interesting formulas relating the cosmological constant, the number of quanta $N$ and the mean value $\overline{\lambda}$ of the wavelength of the gravitons. In this context, we study the possibility to look to the gravitons system so obtained as being ...

  16. A statistical representation of the cosmological constant from finite size effects at the apparent horizon

    Science.gov (United States)

    Viaggiu, Stefano

    2016-07-01

    In this paper we present a statistical description of the cosmological constant in terms of massless bosons (gravitons). To this purpose, we use our recent results implying a non vanishing temperature T_Λ for the cosmological constant. In particular, we found that a non vanishing T_Λ allows us to depict the cosmological constant Λ as composed of elementary oscillations of massless bosons of energy ħω by means of the Bose-Einstein distribution. In this context, as happens for photons in a medium, the effective phase velocity v_g of these massless excitations is not given by the speed of light c but is suppressed by a factor depending on the number of quanta present in the universe at the apparent horizon. We found interesting formulas relating the cosmological constant, the number of quanta N and the mean value λ̄ of the wavelength of the gravitons. In this context, we study the possibility of regarding the graviton system so obtained as being very close to a Bose-Einstein condensate. Finally, an attempt is made to write down the flat Friedmann equations in terms of N and λ̄.

  17. Statistical modelling of wildfire size and intensity: a step toward meteorological forecasting of summer extreme fire risk

    Science.gov (United States)

    Hernandez, C.; Keribin, C.; Drobinski, P.; Turquety, S.

    2015-12-01

    In this article we investigate the use of statistical methods for wildfire risk assessment in the Mediterranean Basin using three meteorological covariates, the 2 m temperature anomaly, the 10 m wind speed and the January-June rainfall occurrence anomaly. We focus on two remotely sensed characteristic fire variables, the burnt area (BA) and the fire radiative power (FRP), which are good proxies for fire size and intensity respectively. Using the fire data we determine an adequate parametric distribution function which best fits the logarithm of BA and FRP. We reconstruct the conditional density function of both variables with respect to the chosen meteorological covariates. These conditional density functions for the size and intensity of a single event give information on fire risk and can be used for the estimation of conditional probabilities of exceeding certain thresholds. By analysing these probabilities we find two fire risk regimes different from each other at the 90 % confidence level: a "background" summer fire risk regime and an "extreme" additional fire risk regime, which corresponds to a higher probability of occurrence of larger fire size or intensity associated with specific weather conditions. Such a statistical approach may be the ground for a future fire risk alert system.

  18. FES Training in Aging: interim results show statistically significant improvements in mobility and muscle fiber size

    Directory of Open Access Journals (Sweden)

    Helmut Kern

    2012-03-01

    Full Text Available Aging is a multifactorial process that is characterized by decline in muscle mass and performance. Several factors, including reduced exercise, poor nutrition and modified hormonal metabolism, are responsible for changes in the rates of protein synthesis and degradation that drive skeletal muscle mass reduction, with a consequent decline of force generation and of mobility functional performance. Seniors with a normal life style were enrolled: two groups in Vienna (n=32) and two groups in Bratislava (n=19). All subjects were healthy and declared not to have any specific physical/disease problems. The two Vienna groups of seniors exercised for 10 weeks with two different types of training (leg press at the hospital or home-based functional electrical stimulation, h-b FES). Demographic data (age, height and weight) were recorded before and after the training period, and before and after the training period the patients underwent mobility functional analyses and muscle biopsies. The mobility functional analyses were: 1. gait speed (10 m test at fastest speed, in m/s); 2. time needed to rise from a chair five times (5x Chair-Rise, in s); 3. Timed-Up-and-Go Test, in s; 4. Stair Test, in s; 5. isometric measurement of quadriceps force (torque/kg, in Nm/kg); and 6. dynamic balance, in mm. Preliminary analyses of quadriceps muscle biopsies from some of the Vienna and Bratislava patients present morphometric results consistent with their functional behavior. The statistically significant improvements in functional testing reported here demonstrate the effectiveness of h-b FES and strongly support h-b FES as a safe home-based method to improve contractility and performance of ageing muscles.

  19. Vibrational spectra and fragmentation pathways of size-selected, D2-tagged ammonium/methylammonium bisulfate clusters.

    Science.gov (United States)

    Johnson, Christopher J; Johnson, Mark A

    2013-12-19

    Particles consisting of ammonia and sulfuric acid are widely regarded as seeds for atmospheric aerosol nucleation, and incorporation of alkylamines has been suggested to substantially accelerate their growth. Despite significant efforts, little direct experimental evidence exists for the structures and chemical processes underlying multicomponent particle nucleation. Here we are concerned with the positively charged clusters of ammonia and sulfuric acid with compositions H(+)(NH3)m(H2SO4)n (2 ≤ m ≤ 5, 1 ≤ n ≤ 4), for which equilibrium geometry structures have been reported in recent computational searches. The computed harmonic vibrational spectra of such minimum energy structures can be directly compared with the experimental spectra of each cluster composition isolated in the laboratory using cryogenic ion chemistry methods. We present one-photon (i.e., linear) infrared action spectra of the isolated gas phase ions cryogenically cooled to 10 K, allowing us to resolve the characteristic vibrational signatures of these clusters. Because the available calculated spectra for different structural candidates have been obtained using different levels of theory, we reoptimized the previously reported structures with several common electronic structure methods and find excellent agreement can be achieved for the (m = 3, n = 2) cluster using CAM-B3LYP with only minor structural differences from the previously identified geometries. At the larger sizes, the experimental spectra strongly resemble that observed for 180 nm ammonium bisulfate particles. The characteristic ammonium- and bisulfate-localized bands are clearly evident at all sizes studied, indicating that the cluster structures are indeed ionic in nature. With the likely (3,2) structure in hand, we then explore the spectral and structural changes caused when methylamine is substituted for ammonia. This process is found to occur with minimal perturbation of the unsubstituted cluster. The thermal

  20. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    Science.gov (United States)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.

  1. The Influence of Matrix Size on Statistical Properties of Co-Occurrence and Limiting Similarity Null Models.

    Science.gov (United States)

    Lavender, Thomas Michael; Schamp, Brandon S; Lamb, Eric G

    2016-01-01

    Null models exploring species co-occurrence and trait-based limiting similarity are increasingly used to explore the influence of competition on community assembly; however, assessments of common models have not thoroughly explored the influence of variation in matrix size on error rates, in spite of the fact that studies have explored community matrices that vary considerably in size. To determine how smaller matrices, which are of greatest concern, perform statistically, we generated biologically realistic presence-absence matrices ranging in size from 3-50 species and sites, as well as associated trait matrices. We examined co-occurrence tests using the C-Score statistic and independent swap algorithm. For trait-based limiting similarity null models, we used the mean nearest neighbour trait distance (NN) and the standard deviation of nearest neighbour distances (SDNN) as test statistics, and considered two common randomization algorithms: abundance independent trait shuffling (AITS), and abundance weighted trait shuffling (AWTS). Matrices as small as three × three resulted in acceptable type I error rates (p ) was associated with increased type I error rates, particularly for matrices with fewer than eight species. Type I error rates increased for limiting similarity tests using the AWTS randomization scheme when community matrices contained more than 35 sites; a similar randomization used in null models of phylogenetic dispersion has previously been viewed as robust. Notwithstanding other potential deficiencies related to the use of small matrices to represent communities, the application of both classes of null model should be restricted to matrices with 10 or more species to avoid the possibility of type II errors. Additionally, researchers should restrict the use of the AWTS randomization to matrices with fewer than 35 sites to avoid type I errors when testing for trait-based limiting similarity. The AITS randomization scheme performed better in terms of

  2. The Influence of Matrix Size on Statistical Properties of Co-Occurrence and Limiting Similarity Null Models.

    Directory of Open Access Journals (Sweden)

    Thomas Michael Lavender

    Full Text Available Null models exploring species co-occurrence and trait-based limiting similarity are increasingly used to explore the influence of competition on community assembly; however, assessments of common models have not thoroughly explored the influence of variation in matrix size on error rates, in spite of the fact that studies have explored community matrices that vary considerably in size. To determine how smaller matrices, which are of greatest concern, perform statistically, we generated biologically realistic presence-absence matrices ranging in size from 3-50 species and sites, as well as associated trait matrices. We examined co-occurrence tests using the C-Score statistic and independent swap algorithm. For trait-based limiting similarity null models, we used the mean nearest neighbour trait distance (NN and the standard deviation of nearest neighbour distances (SDNN as test statistics, and considered two common randomization algorithms: abundance independent trait shuffling (AITS, and abundance weighted trait shuffling (AWTS. Matrices as small as three × three resulted in acceptable type I error rates (p was associated with increased type I error rates, particularly for matrices with fewer than eight species. Type I error rates increased for limiting similarity tests using the AWTS randomization scheme when community matrices contained more than 35 sites; a similar randomization used in null models of phylogenetic dispersion has previously been viewed as robust. Notwithstanding other potential deficiencies related to the use of small matrices to represent communities, the application of both classes of null model should be restricted to matrices with 10 or more species to avoid the possibility of type II errors. Additionally, researchers should restrict the use of the AWTS randomization to matrices with fewer than 35 sites to avoid type I errors when testing for trait-based limiting similarity. The AITS randomization scheme performed better

  3. Nuclear energy release from fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Li, Cheng [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); Souza, S.R. [Instituto de Física, Universidade Federal do Rio de Janeiro Cidade Universitária, Caixa Postal 68528, 21945-970 Rio de Janeiro (Brazil); Tsang, M.B. [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); National Superconducting Cyclotron Laboratory and Physics and Astronomy Department, Michigan State University, East Lansing, MI 48824 (United States); Zhang, Feng-Shou, E-mail: fszhang@bnu.edu.cn [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); Center of Theoretical Nuclear Physics, National Laboratory of Heavy Ion Accelerator of Lanzhou, Lanzhou 730000 (China)

    2016-08-15

    It is well known that binary fission occurs with positive energy gain. In this article we examine the energetics of splitting uranium and thorium isotopes into various numbers of fragments (from two to eight) with nearly equal size. We find that the energy released by splitting {sup 230,232}Th and {sup 235,238}U into three equal size fragments is largest. The statistical multifragmentation model (SMM) is applied to calculate the probability of different breakup channels for excited nuclei. By weighing the probability distributions of fragment multiplicity at different excitation energies, we find the peaks of energy release for {sup 230,232}Th and {sup 235,238}U are around 0.7–0.75 MeV/u at excitation energy between 1.2 and 2 MeV/u in the primary breakup process. Taking into account the secondary de-excitation processes of primary fragments with the GEMINI code, these energy peaks fall to about 0.45 MeV/u.

  4. Nuclear energy release from fragmentation

    Science.gov (United States)

    Li, Cheng; Souza, S. R.; Tsang, M. B.; Zhang, Feng-Shou

    2016-08-01

    It is well known that binary fission occurs with positive energy gain. In this article we examine the energetics of splitting uranium and thorium isotopes into various numbers of fragments (from two to eight) with nearly equal size. We find that the energy released by splitting 230,232Th and 235,238U into three equal size fragments is largest. The statistical multifragmentation model (SMM) is applied to calculate the probability of different breakup channels for excited nuclei. By weighing the probability distributions of fragment multiplicity at different excitation energies, we find the peaks of energy release for 230,232Th and 235,238U are around 0.7-0.75 MeV/u at excitation energy between 1.2 and 2 MeV/u in the primary breakup process. Taking into account the secondary de-excitation processes of primary fragments with the GEMINI code, these energy peaks fall to about 0.45 MeV/u.

  5. The N-pact factor: evaluating the quality of empirical journals with respect to sample size and statistical power.

    Science.gov (United States)

    Fraley, R Chris; Vazire, Simine

    2014-01-01

    The authors evaluate the quality of research reported in major journals in social-personality psychology by ranking those journals with respect to their N-pact Factors (NF): the statistical power of the empirical studies they publish to detect typical effect sizes. Power is a particularly important attribute for evaluating research quality because, relative to studies that have low power, studies that have high power are more likely to (a) provide accurate estimates of effects, (b) produce literatures with low false positive rates, and (c) lead to replicable findings. The authors show that the average sample size in social-personality research is 104 and that the power to detect the typical effect size in the field is approximately 50%. Moreover, they show that there is considerable variation among journals in the sample sizes and power of the studies they publish, with some journals consistently publishing higher-power studies than others. The authors hope that these rankings will be of use to authors who are choosing where to submit their best work, provide hiring and promotion committees with a superior way of quantifying journal quality, and encourage competition among journals to improve their NF rankings.
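    The kind of power calculation underlying the NF ranking can be sketched as follows, using the Fisher z approximation for a correlation; taking r = 0.20 as the "typical" effect size is an assumption made here, and with N = 104 the approximate power indeed comes out near 50%.

```python
# Approximate two-sided power to detect a correlation r at sample size N,
# via the Fisher z transform.
import numpy as np
from scipy.stats import norm

def power_correlation(r, n, alpha=0.05):
    z_effect = np.arctanh(r) * np.sqrt(n - 3)
    z_crit = norm.ppf(1 - alpha / 2)
    # include the (tiny) probability of rejecting in the wrong direction
    return norm.cdf(z_effect - z_crit) + norm.cdf(-z_effect - z_crit)

print(f"power at N = 104, r = 0.20: {power_correlation(0.20, 104):.2f}")  # ~0.5
```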

  6. High-throughput manufacturing of size-tuned liposomes by a new microfluidics method using enhanced statistical tools for characterization.

    Science.gov (United States)

    Kastner, Elisabeth; Kaur, Randip; Lowry, Deborah; Moghaddam, Behfar; Wilkinson, Alexander; Perrie, Yvonne

    2014-12-30

    Microfluidics has recently emerged as a new method of manufacturing liposomes, allowing reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters of a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiments and multivariate data analysis were used for increased process understanding and for the development of predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes, with a strong correlation with vesicle size, demonstrating the ability to control liposome size in-process; the resulting liposome size correlated with the FRR in the microfluidics process, with liposomes of 50 nm being reproducibly manufactured. Furthermore, we demonstrate the potential of high-throughput manufacturing of liposomes using microfluidics with a four-fold increase in the volumetric flow rate while maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and was modelled using predictive modelling. Mathematical modelling identified FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust and high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design space for controlled particle characteristics. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  7. How realistic is the pore size distribution calculated from adsorption isotherms if activated carbon is composed of fullerene-like fragments?

    Science.gov (United States)

    Terzyk, Artur P; Furmaniak, Sylwester; Harris, Peter J F; Gauden, Piotr A; Włoch, Jerzy; Kowalczyk, Piotr; Rychlicki, Gerhard

    2007-11-28

    A plausible model for the structure of non-graphitizing carbon is one which consists of curved, fullerene-like fragments grouped together in a random arrangement. Although this model was proposed several years ago, there have been no attempts to calculate the properties of such a structure. Here, we determine the density, pore size distribution and adsorption properties of a model porous carbon constructed from fullerene-like elements. Using the method proposed recently by Bhattacharya and Gubbins (BG), which was tested in this study for ideal and defective carbon slits, the pore size distributions (PSDs) of the initial model and two related carbon models are calculated. The obtained PSD curves show that two structures are micro-mesoporous (with different ratio of micro/mesopores) and the third is strictly microporous. Using the grand canonical Monte Carlo (GCMC) method, adsorption isotherms of Ar (87 K) are simulated for all the structures. Finally PSD curves are calculated using the Horvath-Kawazoe, non-local density functional theory (NLDFT), Nguyen and Do, and Barrett-Joyner-Halenda (BJH) approaches, and compared with those predicted by the BG method. This is the first study in which different methods of calculation of PSDs for carbons from adsorption data can be really verified, since absolute (i.e. true) PSDs are obtained using the BG method. This is also the first study reporting the results of computer simulations of adsorption on fullerene-like carbon models.

  8. The extended statistical analysis of toxicity tests using standardised effect sizes (SESs): a comparison of nine published papers.

    Directory of Open Access Journals (Sweden)

    Michael F W Festing

    Full Text Available The safety of chemicals, drugs, novel foods and genetically modified crops is often tested using repeat-dose sub-acute toxicity tests in rats or mice. It is important to avoid misinterpretation of the results, as these tests are used to help determine safe exposure levels in humans. Treated and control groups are compared for a range of haematological, biochemical and other biomarkers which may indicate tissue damage or other adverse effects. However, the statistical analysis and presentation of such data pose problems due to the large number of statistical tests involved. Often, it is not clear whether a "statistically significant" effect is real or a false positive (type I error) due to sampling variation. The authors' conclusions appear to be reached somewhat subjectively from the pattern of statistical significances, discounting those which they judge to be type I errors and ignoring any biomarker where the p-value is greater than p = 0.05. However, by using standardised effect sizes (SESs), a range of graphical methods and an overall assessment of the mean absolute response can be made. The approach is an extension, not a replacement, of existing methods. It is intended to assist toxicologists and regulators in the interpretation of the results. Here, the SES analysis has been applied to data from nine published sub-acute toxicity tests in order to compare the findings with those of the original authors. Line plots, box plots and bar plots show the pattern of response. Dose-response relationships are easily seen. A "bootstrap" test compares the mean absolute differences across dose groups. In four out of the seven papers where the no observed adverse effect level (NOAEL) was estimated by the authors, it was set too high according to the bootstrap test, suggesting that possible toxicity is under-estimated.
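    The sketch below illustrates the core SES idea on synthetic data: each biomarker difference is expressed in pooled-standard-deviation units and a simple bootstrap summarises the mean absolute SES. It is only an illustration of the concept, not Festing's exact graphical and testing procedure.

```python
# Standardised effect sizes per biomarker and a bootstrap CI for mean |SES|.
import numpy as np

rng = np.random.default_rng(11)
n_animals, n_biomarkers = 10, 20
control = rng.normal(0.0, 1.0, size=(n_animals, n_biomarkers))
treated = rng.normal(0.3, 1.0, size=(n_animals, n_biomarkers))  # small true shift

def mean_abs_ses(a, b):
    pooled_sd = np.sqrt((a.var(0, ddof=1) + b.var(0, ddof=1)) / 2)
    return np.abs((b.mean(0) - a.mean(0)) / pooled_sd).mean()

obs = mean_abs_ses(control, treated)
boot = np.array([
    mean_abs_ses(control[rng.integers(0, n_animals, n_animals)],
                 treated[rng.integers(0, n_animals, n_animals)])
    for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"mean |SES| = {obs:.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```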

  9. Fragmentation Main Model

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The fragmentation model combines patch size and patch continuity with diversity of vegetation types per patch and rarity of vegetation types per patch. A patch was...

  10. DNA fragmentation in apoptosis

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Cleavage of chromosomal DNA into oligonucleosomal size fragments is an integral part of apoptosis. Elegant biochemical work identified the DNA fragmentation factor (DFF) as a major apoptotic endonuclease for DNA fragmentation in vitro. Genetic studies in mice support the importance of DFF in DNA fragmentation and possibly in apoptosis in vivo. Recent work also suggests the existence of additional endonucleases for DNA degradation. Understanding the roles of individual endonucleases in apoptosis, and how they might coordinate to degrade DNA in different tissues during normal development and homeostasis, as well as in various diseased states, will be a major research focus in the near future.

  11. Evaluation of a global aerosol microphysics model against size-resolved particle statistics in the marine atmosphere

    Directory of Open Access Journals (Sweden)

    D. V. Spracklen

    2007-01-01

    Full Text Available A statistical synthesis of marine aerosol measurements from experiments in four different oceans is used to evaluate a global aerosol microphysics model (GLOMAP. We compare the model against observed size resolved particle concentrations, probability distributions, and the temporal persistence of different size particles. We attempt to explain the observed sub-micrometre size distributions in terms of sulfate and sea spray and quantify the possible contributions of anthropogenic sulfate and carbonaceous material to the number and mass distribution. The model predicts a bimodal size distribution that agrees well with observations as a grand average over all regions, but there are large regional differences. Notably, observed Aitken mode number concentrations are more than a factor 10 higher than in the model for the N Atlantic but a factor 7 lower than the model in the NW Pacific. We also find that modelled Aitken mode and accumulation mode geometric mean diameters are generally smaller in the model by 10–30%. Comparison with observed free tropospheric Aitken mode distributions suggests that the model underpredicts growth of these particles during descent to the marine boundary layer (MBL. Recent observations of a substantial organic component of free tropospheric aerosol could explain this discrepancy. We find that anthropogenic continental material makes a substantial contribution to N Atlantic MBL aerosol, with typically 60–90% of sulfate across the particle size range coming from anthropogenic sources, even if we analyse air that has spent an average of >120 h away from land. However, anthropogenic primary black carbon and organic carbon particles (at the emission size and quantity assumed here do not explain the large discrepancies in Aitken mode number. Several explanations for the discrepancy are suggested. The lack of lower atmospheric particle formation in the model may explain low N Atlantic particle concentrations. However, the

  12. Evaluation of a global aerosol microphysics model against size-resolved particle statistics in the marine atmosphere

    Science.gov (United States)

    Spracklen, D. V.; Pringle, K. J.; Carslaw, K. S.; Mann, G. W.; Manktelow, P.; Heintzenberg, J.

    2007-04-01

    A statistical synthesis of marine aerosol measurements from experiments in four different oceans is used to evaluate a global aerosol microphysics model (GLOMAP). We compare the model against observed size resolved particle concentrations, probability distributions, and the temporal persistence of different size particles. We attempt to explain the observed sub-micrometre size distributions in terms of sulfate and sea spray and quantify the possible contributions of anthropogenic sulfate and carbonaceous material to the number and mass distribution. The model predicts a bimodal size distribution that agrees well with observations as a grand average over all regions, but there are large regional differences. Notably, observed Aitken mode number concentrations are more than a factor 10 higher than in the model for the N Atlantic but a factor 7 lower than the model in the NW Pacific. We also find that modelled Aitken mode and accumulation mode geometric mean diameters are generally smaller in the model by 10-30%. Comparison with observed free tropospheric Aitken mode distributions suggests that the model underpredicts growth of these particles during descent to the marine boundary layer (MBL). Recent observations of a substantial organic component of free tropospheric aerosol could explain this discrepancy. We find that anthropogenic continental material makes a substantial contribution to N Atlantic MBL aerosol, with typically 60-90% of sulfate across the particle size range coming from anthropogenic sources, even if we analyse air that has spent an average of >120 h away from land. However, anthropogenic primary black carbon and organic carbon particles (at the emission size and quantity assumed here) do not explain the large discrepancies in Aitken mode number. Several explanations for the discrepancy are suggested. The lack of lower atmospheric particle formation in the model may explain low N Atlantic particle concentrations. However, the observed and modelled

  13. Spectral statistics, finite-size scaling and multifractal analysis of quasiperiodic chain with p-wave pairing

    Science.gov (United States)

    Wang, Yucheng; Wang, Yancheng; Chen, Shu

    2016-11-01

    We study the spectral and wavefunction properties of a one-dimensional incommensurate system with p-wave pairing and unveil a series of distinctive properties in its critical region. By studying the spectral statistics, we show that the bandwidth distribution and the level spacing distribution in the critical region follow inverse power laws, which however break down in the extended and localized regions. By performing a finite-size scaling analysis, we obtain some critical exponents of the system and find that these exponents fulfil a hyperscaling law throughout the critical region. We also carry out a multifractal analysis of the system's wavefunctions using a box-counting method and show that the wavefunctions display different behaviors in the critical, extended and localized regions.
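    A bare-bones version of the box-counting estimate of a generalised dimension D_q is sketched below for two toy wavefunctions (uniform and exponentially localised); the incommensurate p-wave model of the paper is not implemented here.

```python
# Box-counting estimate of the generalised dimension D_q of |psi|^2 from the
# scaling of sum_b mu_b**q with box size l (mu_b = probability in box b).
import numpy as np

def d_q(psi, q, box_sizes=(4, 8, 16, 32, 64)):
    prob = np.abs(psi) ** 2
    prob /= prob.sum()
    L = prob.size
    xs, ys = [], []
    for l in box_sizes:
        mu = prob[: L - L % l].reshape(-1, l).sum(axis=1)   # box measures
        xs.append(np.log(l / L))
        ys.append(np.log(np.sum(mu ** q)) / (q - 1))
    return np.polyfit(xs, ys, 1)[0]          # slope = D_q estimate

L = 4096
site = np.arange(L)
extended = np.ones(L) / np.sqrt(L)                 # expect D_q -> 1
localized = np.exp(-np.abs(site - L // 2) / 8.0)   # expect D_q -> 0
for name, psi in [("extended", extended), ("localized", localized)]:
    print(name, "D_2 ~", round(d_q(psi, q=2.0), 2))
```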

  14. Relationship between Pore-size Distribution and Flexibility of Adsorbent Materials: Statistical Mechanics and Future Material Characterization Techniques.

    Science.gov (United States)

    Siderius, Daniel W; Mahynski, Nathan A; Shen, Vincent K

    2017-05-01

    Measurement of the pore-size distribution (PSD) via gas adsorption and the so-called "kernel method" is a widely used characterization technique for rigid adsorbents. Yet, standard techniques and analytical equipment are not appropriate to characterize the emerging class of flexible adsorbents that deform in response to the stress imparted by an adsorbate gas, as the PSD is a characteristic of the material that varies with the gas pressure and any other external stresses. Here, we derive the PSD for a flexible adsorbent using statistical mechanics in the osmotic ensemble to draw analogy to the kernel method for rigid materials. The resultant PSD is a function of the ensemble constraints including all imposed stresses and, most importantly, the deformation free energy of the adsorbent material. Consequently, a pressure-dependent PSD is a descriptor of the deformation characteristics of an adsorbent and may be the basis of future material characterization techniques. We discuss how, given a technique for resolving pressure-dependent PSDs, the present statistical mechanical theory could enable a new generation of analytical tools that measure and characterize certain intrinsic material properties of flexible adsorbents via otherwise simple adsorption experiments.

  15. Sample Size and Statistical Conclusions from Tests of Fit to the Rasch Model According to the Rasch Unidimensional Measurement Model (Rumm) Program in Health Outcome Measurement.

    Science.gov (United States)

    Hagell, Peter; Westergren, Albert

    Sample size is a major factor in statistical null hypothesis testing, which is the basis for many approaches to testing Rasch model fit. Few sample size recommendations for testing fit to the Rasch model concern the Rasch Unidimensional Measurement Models (RUMM) software, which features chi-square and ANOVA/F-ratio based fit statistics, including Bonferroni and algebraic sample size adjustments. This paper explores the occurrence of Type I errors with RUMM fit statistics, and the effects of algebraic sample size adjustments. Data simulated to fit the Rasch model, with 25-item dichotomous scales and sample sizes ranging from N = 50 to N = 2500, were analysed with and without algebraically adjusted sample sizes. Results suggest the occurrence of Type I errors with N less than or equal to 500, and that Bonferroni correction as well as downward algebraic sample size adjustment are useful to avoid such errors, whereas upward adjustment of smaller samples falsely signals misfit. Our observations suggest that sample sizes around N = 250 to N = 500 may provide a good balance for the statistical interpretation of the RUMM fit statistics studied here with respect to Type I errors and under the assumption of Rasch model fit within the examined frame of reference (i.e., about 25 item parameters well targeted to the sample).

  16. Improving effect size estimation and statistical power with multi-echo fMRI and its impact on understanding the neural systems supporting mentalizing.

    Science.gov (United States)

    Lombardo, Michael V; Auyeung, Bonnie; Holt, Rosemary J; Waldman, Jack; Ruigrok, Amber N V; Mooney, Natasha; Bullmore, Edward T; Baron-Cohen, Simon; Kundu, Prantik

    2016-11-15

    Functional magnetic resonance imaging (fMRI) research is routinely criticized for being statistically underpowered due to characteristically small sample sizes and much larger sample sizes are being increasingly recommended. Additionally, various sources of artifact inherent in fMRI data can have detrimental impact on effect size estimates and statistical power. Here we show how specific removal of non-BOLD artifacts can improve effect size estimation and statistical power in task-fMRI contexts, with particular application to the social-cognitive domain of mentalizing/theory of mind. Non-BOLD variability identification and removal is achieved in a biophysical and statistically principled manner by combining multi-echo fMRI acquisition and independent components analysis (ME-ICA). Without smoothing, group-level effect size estimates on two different mentalizing tasks were enhanced by ME-ICA at a median rate of 24% in regions canonically associated with mentalizing, while much more substantial boosts (40-149%) were observed in non-canonical cerebellar areas. Effect size boosting occurs via reduction of non-BOLD noise at the subject-level and consequent reductions in between-subject variance at the group-level. Smoothing can attenuate ME-ICA-related effect size improvements in certain circumstances. Power simulations demonstrate that ME-ICA-related effect size enhancements enable much higher-powered studies at traditional sample sizes. Cerebellar effects observed after applying ME-ICA may be unobservable with conventional imaging at traditional sample sizes. Thus, ME-ICA allows for principled design-agnostic non-BOLD artifact removal that can substantially improve effect size estimates and statistical power in task-fMRI contexts. ME-ICA could mitigate some issues regarding statistical power in fMRI studies and enable novel discovery of aspects of brain organization that are currently under-appreciated and not well understood.

  17. A master curve analysis of F82H using statistical and constraint loss size adjustments of small specimen data

    Science.gov (United States)

    Odette, G. R.; Yamamoto, T.; Kishimoto, H.; Sokolov, M.; Spätig, P.; Yang, W. J.; Rensman, J.-W.; Lucas, G. E.

    2004-08-01

    We assembled a fracture toughness database for the IEA heat of F82H based on a variety of specimen sizes, with a nominal ASTM E1921 master curve (MC) reference temperature T0 = -119±3 °C. However, the data are not well represented by a MC. T0 decreases systematically with a decreasing deformation limit Mlim starting at ≈200, which is much higher than the E1921 censoring limit of 30, indicating large constraint loss in small specimens. The small-scale yielding T0 at high Mlim is ≈-98±5 °C. While the scatter was somewhat larger than predicted, after model-based adjustments for the effects of constraint loss the data are in reasonably good agreement with a MC with T0 = -98 °C. This supports the use of MC methods to characterize irradiation embrittlement, as long as both constraint loss and statistical size effects are properly accounted for. Finally, we note various issues, including sources of the possible excess scatter, which remain to be fully assessed.

  18. A master curve analysis of F82H using statistical and constraint loss size adjustments of small specimen data

    Energy Technology Data Exchange (ETDEWEB)

    Odette, G.R. E-mail: odette@engineering.ucsb.edu; Yamamoto, T.; Kishimoto, H.; Sokolov, M.; Spaetig, P.; Yang, W.J.; Rensman, J.-W.; Lucas, G.E

    2004-08-01

    We assembled a fracture toughness database for the IEA heat of F82H based on a variety of specimen sizes with a nominal ASTM E1921 master curve (MC) reference temperature T0 = -119±3 °C. However, the data are not well represented by a MC. T0 decreases systematically with a decreasing deformation limit Mlim starting at ≈200, which is much higher than the E1921 censoring limit of 30, indicating large constraint loss in small specimens. The small scale yielding T0 at high Mlim is ≈-98±5 °C. While the scatter was somewhat larger than predicted, after model-based adjustments for the effects of constraint loss the data are in reasonably good agreement with a MC with T0 = -98 °C. This supports the use of MC methods to characterize irradiation embrittlement, as long as both constraint loss and statistical size effects are properly accounted for. Finally, we note various issues, including sources of the possible excess scatter, which remain to be fully assessed.

  19. Heavy meson fragmentation at LHC

    Directory of Open Access Journals (Sweden)

    M. A. Gomshi Nobary

    2003-06-01

    Full Text Available   The Large Hadron Collider (LHC) at CERN will provide an excellent opportunity to study the production and decay of heavy mesons and baryons with high statistics. In this work we focus on heavy mesons, calculate their fragmentation functions relevant to this machine, and present their total fragmentation probabilities and average fragmentation parameters.

  20. Optical/Near-IR Polarization Survey of Sh 2-29: Magnetic Fields, Dense Cloud Fragmentations and Anomalous Dust Grain Sizes

    CERN Document Server

    Santos, Fábio P; Roman-Lopes, Alexandre; Reis, Wilson; Román-Zúñiga, Carlos G

    2013-01-01

    Sh 2-29 is a conspicuous star-forming region marked by the presence of massive embedded stars as well as several notable interstellar structures. In this research, our goals were to determine the role of magnetic fields and to study the size distribution of interstellar dust particles within this turbulent environment. We have used a set of optical and near-infrared polarimetric data obtained at OPD/LNA (Brazil) and CTIO (Chile), correlated with extinction maps, 2MASS data and images from DSS and Spitzer. The region's most striking feature is a swept-out interstellar cavity whose polarimetric maps indicate that magnetic field lines were dragged outwards, piling up along its borders. This led to a higher magnetic strength value (≈400 μG) and an abrupt increase in polarization degree, probably due to an enhancement in alignment efficiency. Furthermore, dense cloud fragmentations with peak A_V between 20 and 37 mag were probably triggered by its expansion. The presence of 24 μm point-like so...

  1. Comparison of calculated and experimental results of fragmenting cylinder experiments

    Energy Technology Data Exchange (ETDEWEB)

    WILSON,L.T.; REEDAL,D.R.; KIPP,MARLIN E.; MARTINEZ,REINA R.; GRADY,D.E.

    2000-06-02

    The Grady-Kipp fragmentation model provides a physically based method for determining the fracture and breakup of materials under high loading rates. Recently, this model has been implemented into the CTH Shock Physics Code and has been used to simulate several published experiments. Materials studied in this paper are AerMet 100 steel and a 90% tungsten alloy. The experimental geometry consists of a right circular cylinder filled with an explosive main charge that is initiated at its center. The sudden expansion of the resulting detonation products causes fracture of the cylinder. Strain rates seen in the cylinder are on the order of 10^4 s^-1. The average fragment sizes calculated with the Grady-Kipp fragmentation model successfully replicate the mean fragment size obtained from the experimental fragment distribution. When Poisson statistics are applied to the calculated local average fragment sizes, good correlation is also observed with the shape of the experimental cumulative fragment distribution. The experimental fragmentation results, CTH numerical simulations, and correlation of these numerical results with the experimental data are described.
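
    The following sketch illustrates the two ingredients mentioned above: a Grady-type estimate of the characteristic brittle fragment size from strain rate and material properties, and a Poisson (exponential) cumulative fragment distribution built around a local mean fragment mass. The √20 prefactor is one commonly quoted form of Grady's estimate, and all material values are rough, assumed numbers rather than inputs from this paper.

```python
# Sketch of a Grady brittle-fragmentation length scale and a Poisson (exponential)
# cumulative fragment distribution built from a local mean fragment mass.
# Material values are rough, illustrative numbers for a steel-like metal.
import numpy as np

def grady_brittle_size(K_Ic, rho, c, strain_rate):
    """Characteristic fragment size s = (sqrt(20)*K_Ic / (rho*c*eps_dot))**(2/3)."""
    return (np.sqrt(20.0) * K_Ic / (rho * c * strain_rate)) ** (2.0 / 3.0)

s = grady_brittle_size(K_Ic=50e6, rho=7800.0, c=5000.0, strain_rate=1e4)  # Pa*sqrt(m), kg/m^3, m/s, 1/s
print(f"characteristic fragment size ~ {s * 1e3:.2f} mm")

# Poisson statistics: expected number of fragments heavier than mass m,
# assuming an exponential distribution about an assumed mean fragment mass mu.
mu = 1.0e-3                        # kg, illustrative mean fragment mass
m = np.linspace(0.0, 5.0e-3, 6)    # kg
N_total = 500
print(N_total * np.exp(-m / mu))   # expected cumulative counts N(>m)
```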

  2. A Self-Organizing Map-Based Approach to Generating Reduced-Size, Statistically Similar Climate Datasets

    Science.gov (United States)

    Cabell, R.; Delle Monache, L.; Alessandrini, S.; Rodriguez, L.

    2015-12-01

    Climate-based studies require large amounts of data in order to produce accurate and reliable results. Many of these studies have used 30-plus year data sets in order to produce stable and high-quality results, and as a result, many such data sets are available, generally in the form of global reanalyses. While the analysis of these data leads to high-fidelity results, their processing can be very computationally expensive. This computational burden prevents the utilization of these data sets for certain applications, e.g., when rapid response is needed in crisis management and disaster planning scenarios resulting from release of toxic material in the atmosphere. We have developed a methodology to reduce large climate datasets to more manageable sizes while retaining statistically similar results when used to produce ensembles of possible outcomes. We do this by employing a Self-Organizing Map (SOM) algorithm to analyze general patterns of meteorological fields over a regional domain of interest to produce a small set of "typical days" with which to generate the model ensemble. The SOM algorithm takes as input a set of vectors and generates a 2D map of representative vectors deemed most similar to the input set and to each other. Input predictors are selected that are correlated with the model output, which in our case is an Atmospheric Transport and Dispersion (T&D) model that is highly dependent on surface winds and boundary layer depth. To choose a subset of "typical days," each input day is assigned to its closest SOM map node vector and then ranked by distance. Each node vector is treated as a distribution and days are sampled from them by percentile. Using a 30-node SOM, with sampling every 20th percentile, we have been able to reduce 30 years of the Climate Forecast System Reanalysis (CFSR) data for the month of October to 150 "typical days." To estimate the skill of this approach, the "Measure of Effectiveness" (MOE) metric is used to compare area and overlap
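
    A minimal sketch of the "typical days" reduction described above is given below, using the third-party MiniSom package (an assumption; the authors do not name their SOM implementation). Each day is represented by a feature vector, assigned to its best-matching SOM node, ranked by distance to that node, and sampled by percentile.

```python
# Minimal sketch of a SOM-based "typical days" reduction. MiniSom is an assumed
# third-party implementation; the day vectors are random placeholders standing in
# for flattened meteorological fields (e.g., surface winds, boundary-layer depth).
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(0)
days = rng.normal(size=(900, 64))          # placeholder: 900 days x 64 features

som = MiniSom(6, 5, days.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)  # 30 nodes
som.train_random(days, 5000)

# Assign each day to its best-matching node and rank members by distance to that node.
nodes = {}
for i, d in enumerate(days):
    nodes.setdefault(som.winner(d), []).append(i)

selected = []
for node, members in nodes.items():
    w = som.get_weights()[node]
    members.sort(key=lambda i: np.linalg.norm(days[i] - w))
    # sample every 20th percentile of the ranked members (5 "typical days" per node)
    for q in (0.1, 0.3, 0.5, 0.7, 0.9):
        selected.append(members[int(q * (len(members) - 1))])

print(len(selected), "typical days selected")
```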

  3. A thermodynamic theory of dynamic fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Yew, Ching H. [Texas Univ., Austin, TX (United States); Taylor, P.A. [Sandia National Labs., Albuquerque, NM (United States)

    1993-08-01

    We present a theory of dynamic fragmentation of brittle materials based on thermodynamic arguments. We recover the expressions for average fragment size and number as originally derived by Grady. We extend the previous work by obtaining descriptions of fragment size distribution and compressibility change due to the fragmentation process. The size distribution is assumed to be proportional to the spectral power of the strain history and a sample distribution is presented for a fragmentation process corresponding to a constant rate strain history. The description of compressibility change should be useful in computational studies of fragmentation. These results should provide insight into the process of fragmentation of brittle materials from hypervelocity impact.

  4. DYNAMIC BREAKAGE AND FRAGMENTATION OF BRITTLE SINGLE PARTICLE OF VARIOUS SIZES AND STRENGTH

    Institute of Scientific and Technical Information of China (English)

    周锦添; 武生智; 黄凯珠

    2003-01-01

    Breakage, comminution, crushing and fragmentation of brittle solids under dynamic impacts are an important applied mechanics and rock mechanics problem. In civil engineering applications, impact-induced fragmentation relates to the crushing of rock mass during mining, tunneling, and aggregate production. The main objective of this paper is to outline some of our recent experimental, analytical and numerical efforts in studying the dynamic fragmentation process in brittle particles. First, an analytical solution for a solid sphere compressed dynamically between two rigid flat platens is derived. Secondly, a sequence of double impact tests on spheres was conducted using a Dynatup 8250 impactor at HKUST, with both impact velocity and contact force at the impactor measured accurately. Finally, a newly developed computer program, DIFAR, is used to investigate the mechanism of dynamic fragmentation. The paper will summarize and briefly discuss all three aspects of our comprehensive approach.

  5. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Energy Technology Data Exchange (ETDEWEB)

    Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. de Rennes, Rennes (France))

    2009-11-15

    Investigations led for several years at Laxemar and Forsmark reveal the large heterogeneity of geological formations and associated fracturing. This project aims at reinforcing the statistical DFN modeling framework adapted to a site scale. It therefore leads to the development of quantitative methods of characterization adapted to the nature of fracturing and data availability. We start with the hypothesis that the maximum likelihood DFN model is a power-law model with a density term depending on orientations. This is supported both by the literature and specifically here by former analyses of the SKB data. This assumption is nevertheless thoroughly tested by analyzing the fracture trace and lineament maps. Fracture traces range roughly between 0.5 m and 10 m - i.e. the usual extension of the sample outcrops. Between the raw data and the final data used to compute the fracture size distribution from which the size distribution model will arise, several steps are necessary in order to correct the data for finite-size, topographical and sampling effects. More precisely, particular attention is paid to fracture segmentation status and fracture linkage consistent with the DFN model expected. The fracture scaling trend observed over both sites finally displays a shape parameter k_t close to 1.2 with a density term (alpha_2d) between 1.4 and 1.8. Only two outcrops clearly display a different trend with k_t close to 3 and a density term (alpha_2d) between 2 and 3.5. The fracture lineaments spread over the range between 100 meters and a few kilometers. When compared with fracture trace maps, these datasets are already interpreted and the linkage process developed previously does not have to be done. Except for the subregional lineament map from Forsmark, lineaments display a clear power-law trend with a shape parameter k_t equal to 3 and a density term between 2 and 4.5. The apparent variation in scaling exponent, from the outcrop scale (k_t = 1.2) on one side, to

  6. A sensitivity analysis of the modified chi-square ratio statistic for equivalence testing of aerodynamic particle size distribution.

    Science.gov (United States)

    Weber, Benjamin; Lee, Sau L; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther

    2013-04-01

    Demonstration of equivalence in aerodynamic particle size distribution (APSD) is one key component for establishing bioequivalence of orally inhaled drug products. We previously proposed a modified version of the Chi-square ratio statistic (mCSRS) for APSD equivalence testing and demonstrated that the median of the distribution of the mCSRS (MmCSRS) is a robust metric when test (T) and reference (R) cascade impactor (CI) profiles are identical. Here, we systematically evaluate the behavior of the MmCSRS when T and R CI profiles differ from each other in their mean deposition and variability on a single and multiple sites. All CI profiles were generated by Monte-Carlo simulations based upon modified actual CI data. Twenty thousand sets of 30 T and 30 R CI profiles were simulated for each scenario, and the behavior of the MmCSRS was correlated to metrics that characterize the difference between T and R product in mean deposition and variability. The two key findings were, first, that the MmCSRS is more sensitive to difference between T and R CI profiles on high deposition sites, and second, that a cut-off value for APSD equivalence testing based on the MmCSRS needs to be scaled on the variability of the R product. The former is considered as beneficial for equivalence testing of CI profiles as it decreases the likelihood of failing identical CI profiles by chance, in part, due to increasing analytical variability associated with lower deposition sites. The latter is expected to be important for consistently being able to discriminate equivalent from inequivalent CI profiles.

  7. Quantum fragmentation

    CERN Document Server

    Peschanski, R

    1993-01-01

    Phenomenological and theoretical aspects of fragmentation for elementary particles (resp. nuclei) are discussed. It is shown that some concepts of classical fragmentation remain relevant in a microscopic framework, exhibiting non-trivial properties of quantum relativistic field theory (resp. lattice percolation). Email contact: pesch@amoco.saclay.cea.fr

  8. Fragmentation trees reloaded.

    Science.gov (United States)

    Böcker, Sebastian; Dührkop, Kai

    2016-01-01

    Untargeted metabolomics commonly uses liquid chromatography mass spectrometry to measure abundances of metabolites; subsequent tandem mass spectrometry is used to derive information about individual compounds. One of the bottlenecks in this experimental setup is the interpretation of fragmentation spectra to accurately and efficiently identify compounds. Fragmentation trees have become a powerful tool for the interpretation of tandem mass spectrometry data of small molecules. These trees are determined from the data using combinatorial optimization, and aim at explaining the experimental data via fragmentation cascades. Fragmentation tree computation does not require spectral or structural databases. To obtain biochemically meaningful trees, one needs an elaborate optimization function (scoring). We present a new scoring for computing fragmentation trees, transforming the combinatorial optimization into a Maximum A Posteriori estimator. We demonstrate the superiority of the new scoring for two tasks: both for the de novo identification of molecular formulas of unknown compounds and for searching a database for structurally similar compounds, our method, SIRIUS 3, performs significantly better than the previous version of our method, as well as other methods for this task. SIRIUS 3 can be a part of an untargeted metabolomics workflow, allowing researchers to investigate unknowns using automated computational methods. Graphical abstract: We present a new scoring for computing fragmentation trees from tandem mass spectrometry data based on Bayesian statistics. The best scoring fragmentation tree most likely explains the molecular formula of the measured parent ion.

  9. CONTROL OF FRAGMENTATION BY BLASTING

    Directory of Open Access Journals (Sweden)

    Branko Božić

    1998-12-01

    Full Text Available The degree of fragmentation influences the economy of the excavation operations. Characteristics of blasted rock such as fragment size, volume and mass are fundamental variables affecting the economics of a mining operation and are in effect the basis for evaluating the quality of a blast. The properties of fragmentation, such as size and shape, provide very important information for the optimization of production. Three factors control the fragment size distribution: the rock structure, the quantity of explosive and its distribution within the rock mass. Over the last decade there have been considerable advances in our ability to measure and analyze blasting performance. These can now be combined with the continuing growth in computing power to develop a more effective description of rock fragmentation for use by future blasting practitioners. The paper describes a view of the fragmentation problem by blasting and the need for a new generation of engineering tools to guide the design and implementation of blasting operations.
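
    As a concrete example of the kind of fragment size distribution description used in blast engineering (not necessarily the one advocated by the author), the sketch below evaluates the Rosin-Rammler cumulative form employed in the Kuz-Ram model; the mean fragment size x50 and uniformity index n are assumed values.

```python
# Illustrative sketch of the Rosin-Rammler description widely used for blast
# fragmentation (e.g., in the Kuz-Ram model). x50 and the uniformity index n
# below are assumed values, not results from this paper.
import numpy as np

def percent_passing(x, x50, n):
    """Cumulative fraction of fragments finer than size x (Rosin-Rammler form)."""
    return 1.0 - np.exp(-np.log(2.0) * (x / x50) ** n)

x50, n = 0.30, 1.4                     # m, assumed mean fragment size and uniformity index
for x in (0.1, 0.3, 0.5, 1.0):
    print(f"x = {x:.1f} m  fraction passing = {percent_passing(x, x50, n):.2f}")
```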

  10. Fragmented Authoritarianism or Integrated Fragmentation

    DEFF Research Database (Denmark)

    Brødsgaard, Kjeld Erik

    of these business leaders prompts the question of whether we are seeing the development of distinct interest groups that could challenge Party and state authority and create a fragmented polity. However, through the nomenklatura system the Party has an important instrument of control to wield over business groups...... and the Party-state, I suggest the notion of integrated fragmentation....

  11. STATISTICS OF MICROLENSING CAUSTIC CROSSINGS IN Q 2237+0305: PECULIAR VELOCITY OF THE LENS GALAXY AND ACCRETION DISK SIZE

    Energy Technology Data Exchange (ETDEWEB)

    Mediavilla, E. [Instituto de Astrofísica de Canarias, Vía Láctea S/N, La Laguna E-38200 Tenerife (Spain); Jimenez-Vicente, J. [Departamento de Física Teórica y del Cosmos, Universidad de Granada, Campus de Fuentenueva E-18071 Granada (Spain); Muñoz, J. A. [Departamento de Astronomía y Astrofísica, Universidad de Valencia E-46100 Burjassot, Valencia (Spain); Mediavilla, T.; Ariza, O. [Departamento de Estadística e Investigación Operativa, Universidad de Cádiz, Avda Ramón Puyol s/n E-11202, Algeciras, Cádiz (Spain)

    2015-01-10

    We use the statistics of caustic crossings induced by microlensing in the lens system Q 2237+0305 to study the lens galaxy peculiar velocity. We calculate the caustic crossing rates for a comprehensive family of stellar mass functions and find a dependence of the average number of caustic crossings with the effective transverse velocity and the average mass, 〈n〉 ∝ v_eff/√〈m〉, equivalent to the theoretical prediction for the case of microlenses with identical masses. We explore the possibilities of the method to measure v_eff using the ∼12 yr of Optical Gravitational Lensing Experiment monitoring of the four images of Q 2237+0305. To determine a lower limit for v_eff, we count, conservatively, a single caustic crossing for each one of the four high magnification events identified in the literature (plus one additional proposed by us), obtaining v_eff ≳ 240√(〈m〉/0.17 M_⊙) km s^-1 at 68% confidence. From this value and the average FWHM of the four high magnification events, we obtain a lower limit of r_s ≳ 1.4√(〈m〉/0.17 M_⊙) light-days for the radius of the source (r_s = FWHM/2.35). Tentative identification of three additional caustic crossing events leads to estimates of v_eff ≃ (493±246)√(〈m〉/0.17 M_⊙) km s^-1 for the effective transverse velocity and of r_s ≃ (2.7±1.3)√(〈m〉/0.17 M_⊙) light-days for the source size. The estimated transverse peculiar velocity of the galaxy is v_t ≃ (429±246)√(〈m〉/0.17 M_⊙) km s^-1.

  12. Generic behaviours in impact fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Sator, N.; Mechkov, S.; Sausset, F. [Paris-6 Univ. Pierre et Marie Curie, Lab. de Physique Theorique de la Matiere Condensee, UMR CNRS 7600, 75 - Paris (France); Mechkov, S. [Ecole Normale Superieure, Lab. de Physique Statistique, 75 - Paris (France)

    2008-02-15

    From atomic nuclei to supernovae, including plates and rocks, every cohesive system can be broken into fragments, provided that the deposited energy is sufficiently large compared to its cohesive energy. We present a simple numerical model for investigating the general properties of fragmentation. By use of molecular dynamics simulations, we study the impact fragmentation of a solid disk of interacting particles with a wall. Regardless of the particular form of the interaction potential, the fragment size distribution exhibits a power law behaviour with an exponent that increases logarithmically with the energy deposited in the system, in agreement with experiments. We expect this behaviour to be generic in fragmentation phenomena. (authors)

  13. Evidence for a Global Sampling Process in Extraction of Summary Statistics of Item Sizes in a Set.

    Science.gov (United States)

    Tokita, Midori; Ueda, Sachiyo; Ishiguchi, Akira

    2016-01-01

    Several studies have shown that our visual system may construct a "summary statistical representation" over groups of visual objects. Although there is a general understanding that human observers can accurately represent sets of a variety of features, many questions on how summary statistics, such as an average, are computed remain unanswered. This study investigated sampling properties of visual information used by human observers to extract two types of summary statistics of item sets, average and variance. We presented three models of ideal observers to extract the summary statistics: a global sampling model without sampling noise, global sampling model with sampling noise, and limited sampling model. We compared the performance of an ideal observer of each model with that of human observers using statistical efficiency analysis. Results suggest that summary statistics of items in a set may be computed without representing individual items, which makes it possible to discard the limited sampling account. Moreover, the extraction of summary statistics may not necessarily require the representation of individual objects with focused attention when the sets of items are larger than 4.

  14. Fragmentation in Biaxial Tension

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, G H; Archbold, G C; Hurricane, O A; Miller, P L

    2006-06-13

    We have carried out an experiment that places a ductile stainless steel in a state of biaxial tension at a high rate of strain. The loading of the ductile metal spherical cap is performed by the detonation of a high explosive layer with a conforming geometry to expand the metal radially outwards. Simulations of the loading and expansion of the metal predict strain rates that compare well with experimental observations. A high percentage of the HE loaded material was recovered through a soft capture process and characterization of the recovered fragments provided high quality data, including uniform strain prior to failure and fragment size. These data were used with a modified fragmentation model to determine a fragmentation energy.

  15. A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data

    DEFF Research Database (Denmark)

    Nielsen, J. Rasmus; Kristensen, Kasper; Lewy, Peter

    2014-01-01

    Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes...

  16. The ability of apolipoprotein E fragments to promote intraneuronal accumulation of amyloid beta peptide 42 is both isoform and size-specific

    Science.gov (United States)

    Dafnis, Ioannis; Argyri, Letta; Sagnou, Marina; Tzinia, Athina; Tsilibary, Effie C.; Stratikos, Efstratios; Chroni, Angeliki

    2016-01-01

    The apolipoprotein (apo) E4 isoform is the strongest risk factor for late-onset Alzheimer’s disease (AD). ApoE4 is more susceptible to proteolysis than apoE2 and apoE3 isoforms and carboxyl-terminal truncated apoE4 forms have been found in AD patients’ brain. We have previously shown that a specific apoE4 fragment, apoE4-165, promotes amyloid-peptide beta 42 (Aβ42) accumulation in human neuroblastoma SK-N-SH cells and increased intracellular reactive oxygen species formation, two events considered to occur early in AD pathogenesis. Here, we show that these effects are allele-dependent and absolutely require the apoE4 background. Furthermore, the exact length of the fragment is critical since longer or shorter length carboxyl-terminal truncated apoE4 forms do not elicit the same effects. Structural and thermodynamic analyses showed that apoE4-165 has a compact structure, in contrast to other carboxyl-terminal truncated apoE4 forms that are instead destabilized. Compared however to other allelic backgrounds, apoE4-165 is structurally distinct and less thermodynamically stable suggesting that the combination of a well-folded structure with structural plasticity is a unique characteristic of this fragment. Overall, our findings suggest that the ability of apoE fragments to promote Aβ42 intraneuronal accumulation is specific for both the apoE4 isoform and the particular structural and thermodynamic properties of the fragment. PMID:27476701

  17. Thermodynamics of fragment binding.

    Science.gov (United States)

    Ferenczy, György G; Keserű, György M

    2012-04-23

    The ligand binding pockets of proteins have a preponderance of hydrophobic amino acids and are typically within the apolar interior of the protein; nevertheless, they are able to bind low complexity, polar, water-soluble fragments. In order to understand this phenomenon, we analyzed high resolution X-ray data of protein-ligand complexes from the Protein Data Bank and found that fragments bind to proteins with two near optimal geometry H-bonds on average. The linear extent of the fragment binding site was found not to be larger than 10 Å, and the H-bonding region was found to be restricted to about 5 Å on average. The number of conserved H-bonds in proteins cocrystallized with multiple different fragments is also close to 2. These fragment binding sites that are able to form a limited number of strong H-bonds in a hydrophobic environment are identified as hot spots. An estimate of the free-energy gain of H-bond formation versus apolar desolvation supports the view that fragment-sized compounds need H-bonds to achieve detectable binding. This suggests that fragment binding is mostly enthalpic, which is in line with their observed binding thermodynamics documented in Isothermal Titration Calorimetry (ITC) data sets and gives a thermodynamic rationale for fragment-based approaches. The binding of larger compounds tends to rely more on apolar desolvation, with a corresponding increase of the entropy content of their binding free-energy. These findings explain the reported size-dependence of maximal available affinity and ligand efficiency, both behaving differently in the small molecule region featured by strong H-bond formation and in the larger molecule region featured by apolar desolvation.
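
    The size-dependence argument above can be put in numbers with a small sketch relating a dissociation constant to binding free energy and ligand efficiency per heavy atom; the Kd values and heavy-atom counts are illustrative assumptions, not data from the paper.

```python
# Sketch of binding free energy from a dissociation constant and ligand
# efficiency per heavy atom. The Kd values and atom counts are illustrative
# assumptions, not values from the paper.
import math

R = 1.987e-3      # kcal/(mol*K)
T = 298.15        # K

def delta_g(kd_molar):
    """Binding free energy (kcal/mol); negative values mean favourable binding."""
    return R * T * math.log(kd_molar)

def ligand_efficiency(kd_molar, n_heavy_atoms):
    """Ligand efficiency = -dG per heavy atom (kcal/mol/atom)."""
    return -delta_g(kd_molar) / n_heavy_atoms

# A fragment with 1 mM affinity and 12 heavy atoms vs. a lead with 10 nM and 35 atoms
print(ligand_efficiency(1e-3, 12))   # ~0.34 kcal/mol/atom
print(ligand_efficiency(1e-8, 35))   # ~0.31 kcal/mol/atom
```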

  18. Statistical Inference on Stochastic Dominance Efficiency. Do Omitted Risk Factors Explain the Size and Book-to-Market Effects?

    OpenAIRE

    Post, Thierry

    2003-01-01

    This paper discusses statistical inference on the second-order stochastic dominance (SSD) efficiency of a given portfolio relative to all portfolios formed from a set of assets. We derive the asymptotic sampling distribution of the Post test statistic for SSD efficiency. Unfortunately, a test procedure based on this distribution involves low power in small samples. Bootstrapping is a more powerful approach to sampling error. We use the bootstrap to test if the Fama and French valu...

  19. Application of Fragment Ion Information as Further Evidence in Probabilistic Compound Screening Using Bayesian Statistics and Machine Learning: A Leap Toward Automation.

    Science.gov (United States)

    Woldegebriel, Michael; Zomer, Paul; Mol, Hans G J; Vivó-Truyols, Gabriel

    2016-08-02

    In this work, we introduce an automated, efficient, and elegant model to combine all pieces of evidence (e.g., expected retention times, peak shapes, isotope distributions, fragment-to-parent ratio) obtained from liquid chromatography-tandem mass spectrometry (LC-MS/MS) data for screening purposes. Combining all these pieces of evidence requires a careful assessment of the uncertainties in the analytical system as well as all possible outcomes. To date, the majority of the existing algorithms are highly dependent on user input parameters. Additionally, the screening process is tackled as a deterministic problem. In this work we present a Bayesian framework to deal with the combination of all these pieces of evidence. Contrary to conventional algorithms, the information is treated in a probabilistic way, and a final probability assessment of the presence/absence of a compound feature is computed. Additionally, all the necessary parameters except the chromatographic band broadening are learned from the data in the training and learning phase of the algorithm, avoiding the introduction of a large number of user-defined parameters. The proposed method was validated with a large data set and has shown improved sensitivity and specificity in comparison to a threshold-based commercial software package.
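
    A schematic of this kind of probabilistic evidence combination is sketched below, assuming (for illustration only) that each piece of evidence contributes an independent likelihood ratio; this is the generic Bayesian idea, not the authors' actual model or its learned parameters.

```python
# Schematic of probabilistic evidence combination for compound screening,
# assuming (for illustration only) conditional independence of the evidence
# sources; this is not the authors' actual model, just the Bayesian idea.
import math

def posterior_presence(prior, likelihood_ratios):
    """Combine likelihood ratios P(evidence|present)/P(evidence|absent) with a prior."""
    log_odds = math.log(prior / (1.0 - prior)) + sum(math.log(lr) for lr in likelihood_ratios)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Hypothetical evidence: retention time match, plausible peak shape,
# isotope pattern match, fragment-to-parent ratio within expectation.
lrs = [8.0, 3.0, 5.0, 2.5]
print(posterior_presence(prior=0.01, likelihood_ratios=lrs))   # ~0.75
```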

  20. Synthesis of MnCO3 nanoparticles by microemulsions: statistical evaluation of the effects of operating conditions on particle size distribution

    Science.gov (United States)

    Pagnanelli, Francesca; Granata, Giuseppe; Moscardini, Emanuela; Toro, Luigi

    2013-09-01

    Manganese carbonate nanoparticles were produced by microemulsion method. The effects of different operating conditions on nanoparticles' morphology, size, and polydispersity were determined. The reference system was made of cetyl trimethylammonium bromide, cyclohexane (as solvent), pentanol (as cosurfactant), and reactants MnCl2 and (NH4)2CO3. Investigated operating conditions were reaction time (30 and 60 min), solvent's type (isooctane and hexane), cosurfactant's dosage (3 and 4.5 g), and reactant's concentration (1.0 and 0.25 mol/L). Produced nanoparticles were characterized by X-ray diffraction, scanning electron microscope, while particle size distribution was determined by image analysis. Significance of investigated factors was assessed by statistical analysis. Cubic-rhombohedral stacked particles with average size of 69 nm and standard deviations of 31 nm were obtained. Morphology was maintained in all the investigated conditions except when cosurfactant's dosage was raised to 4.5 g. Statistical analysis showed that nanoparticles' average size can be significantly increased with respect to the reference condition by augmenting reaction time (+49 nm), by increasing the cosurfactant's dosage (+67 nm) and also by decreasing the reactant's concentration when using hexane as solvent (+110 nm). Conversely, significant diminution of average size was obtained by using isooctane (-30 nm) and hexane (-26 nm) as solvents. Polydispersity of particle size distribution can be significantly diminished by using isooctane (8 nm) and hexane (13 nm) instead of cyclohexane, by decreasing reactant's concentration in the reference condition (10 nm) and by raising cosurfactant's dosage (12 nm).
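
    The statistical evaluation described above is, in spirit, a two-level factorial analysis; the sketch below shows how main effects on particle size can be estimated from a coded design matrix. The factors, design and responses are made-up numbers for illustration, not the paper's data.

```python
# Sketch of main-effect estimation in a two-level factorial design, in the spirit
# of the statistical evaluation described above. The design matrix and responses
# below are made-up numbers, not the paper's data.
import numpy as np

# Coded factors (illustrative): reaction time, cosurfactant dosage, reactant concentration (-1/+1)
X = np.array([[-1, -1, -1],
              [+1, -1, -1],
              [-1, +1, -1],
              [-1, -1, +1],
              [+1, +1, -1],
              [+1, -1, +1],
              [-1, +1, +1],
              [+1, +1, +1]], dtype=float)
size_nm = np.array([69, 118, 136, 60, 170, 105, 125, 160], dtype=float)  # responses

# Main effect of each factor = mean(response at +1) - mean(response at -1)
effects = [size_nm[X[:, j] > 0].mean() - size_nm[X[:, j] < 0].mean() for j in range(X.shape[1])]
print(np.round(effects, 1))
```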

  1. Habitat Fragmentation and Native Bees: a Premature Verdict?

    Directory of Open Access Journals (Sweden)

    James H. Cane

    2001-06-01

    Full Text Available Few studies directly address the consequences of habitat fragmentation for communities of pollinating insects, particularly for the key pollinator group, bees (Hymenoptera: Apiformes). Bees typically live in habitats where nesting substrates and bloom are patchily distributed and spatially dissociated. Bee studies have all defined habitat fragments as remnant patches of floral hosts or forests, overlooking the nesting needs of bees. Several authors conclude that habitat fragmentation is broadly deleterious, but their own data show that some native species proliferate in sampled fragments. Other studies report greater densities and comparable diversities of native bees at flowers in some fragment size classes relative to undisrupted habitats, but find dramatic shifts in species composition. Insightful studies of habitat fragmentation and bees will consider fragmentation, alteration, and loss of nesting habitats, not just patches of forage plants, as well as the permeability of the surrounding matrix to interpatch movement. Inasmuch as the floral associations and nesting habits of bees are often attributes of species or subgenera, ecological interpretations hinge on authoritative identifications. Study designs must accommodate statistical problems associated with bee community samples, especially non-normal data and frequent zero values. The spatial scale of fragmentation must be appreciated: bees of medium body size can regularly fly 1-2 km from nest site to forage patch. Overall, evidence for prolonged persistence of substantial diversity and abundances of native bee communities in habitat fragments of modest size promises practical solutions for maintaining bee populations. Provided that reserve selection, design, and management can address the foraging and nesting needs of bees, networks of even small reserves may hold hope for sustaining considerable pollinator diversity and the ecological services pollinators provide.

  2. Magma Fragmentation

    Science.gov (United States)

    Gonnermann, Helge M.

    2015-05-01

    Magma fragmentation is the breakup of a continuous volume of molten rock into discrete pieces, called pyroclasts. Because magma contains bubbles of compressible magmatic volatiles, decompression of low-viscosity magma leads to rapid expansion. The magma is torn into fragments, as it is stretched into hydrodynamically unstable sheets and filaments. If the magma is highly viscous, resistance to bubble growth will instead lead to excess gas pressure and the magma will deform viscoelastically by fracturing like a glassy solid, resulting in the formation of a violently expanding gas-pyroclast mixture. In either case, fragmentation represents the conversion of potential energy into the surface energy of the newly created fragments and the kinetic energy of the expanding gas-pyroclast mixture. If magma comes into contact with external water, the conversion of thermal energy will vaporize water and quench magma at the melt-water interface, thus creating dynamic stresses that cause fragmentation and the release of kinetic energy. Lastly, shear deformation of highly viscous magma may cause brittle fractures and release seismic energy.

  3. A statistical model for estimation of fish density including correlation in size, space, time and between species from research survey data.

    Directory of Open Access Journals (Sweden)

    J Rasmus Nielsen

    Full Text Available Trawl survey data with high spatial and seasonal coverage were analysed using a variant of the Log Gaussian Cox Process (LGCP) statistical model to estimate unbiased relative fish densities. The model estimates correlations between observations according to time, space, and fish size and includes zero observations and over-dispersion. The model utilises the fact that the correlation between numbers of fish caught increases when the distance in space and time between the fish decreases, and the correlation between size groups in a haul increases when the difference in size decreases. Here the model is extended in two ways. Instead of assuming a natural scale size correlation, the model is further developed to allow for a transformed length scale. Furthermore, in the present application, the spatial- and size-dependent correlation between species was included. For cod (Gadus morhua) and whiting (Merlangius merlangus), a common structured size correlation was fitted, and a separable structure between the time and space-size correlation was found for each species, whereas more complex structures were required to describe the correlation between species (and space-size). The within-species time correlation is strong, whereas the correlations between the species are weaker over time but strong within the year.

  4. Framing Fragmentation

    DEFF Research Database (Denmark)

    Bundgaard, Charlotte

    2009-01-01

    , contain distinctive architectural traits, not only based on rational repetition, but also supporting composition and montage as dynamic concepts. Prefab architecture is an architecture of fragmentation, individualization and changeability, and this sets up new challenges for the architect. This paper...... into separate parts or systems: skeleton, skin, services, internal cladding, etc. Each building part/system is being conceived, produced, delivered and maintained by different construction companies. Basically the building is being fragmented into separate parts living their separate lives. The architect has...... to create architectural meaning and give character to an architecture of fragmentation. Layers are both seen as conceptual as well as material frames which define certain strong properties or meanings in the architectural work. Defining layers is a way of separating and organizing; it both defines...

  5. Population pressure and farm fragmentation:

    African Journals Online (AJOL)

    user

    small but farms are further fragmented into diminutive size fields due to ... terms of household characteristics; land use and performance indicators; technology adoption .... 'best' unit of measurement of farm size, and size of enterprises within farms will ..... less common, accounting for 18 percent (3 percent) and 10 percent (7.

  6. Fluctuations of fragment observables

    CERN Document Server

    Gulminelli, F

    2006-01-01

    This contribution presents a review of our present theoretical as well as experimental knowledge of different fluctuation observables relevant to nuclear multifragmentation. The possible connection between the presence of a fluctuation peak and the occurrence of a phase transition or a critical phenomenon is critically analyzed. Many different phenomena can lead both to the creation and to the suppression of a fluctuation peak. In particular, the role of constraints due to conservation laws and to data sorting is shown to be essential. From the experimental point of view, a comparison of the available fragmentation data reveals that there is a good agreement between different data sets of basic fluctuation observables, if the fragmenting source is of comparable size. This compatibility suggests that the fragmentation process is largely independent of the reaction mechanism (central versus peripheral collisions, symmetric versus asymmetric systems, light ions versus heavy ion induced reactions). Configurationa...

  7. Enhancing the interpretation of statistical P values in toxicology studies: implementation of linear mixed models (LMMs) and standardized effect sizes (SESs).

    Science.gov (United States)

    Schmidt, Kerstin; Schmidtke, Jörg; Kohl, Christian; Wilhelm, Ralf; Schiemann, Joachim; van der Voet, Hilko; Steinberg, Pablo

    2016-03-01

    In this paper, we compare the traditional ANOVA approach to analysing data from 90-day toxicity studies with a more modern LMM approach, and we investigate the use of standardized effect sizes. The LMM approach is used to analyse weight or feed consumption data. When compared to the week-by-week ANOVA with multiple test results per week, this approach results in only one statement on differences in weight development between groups. Standardized effect sizes are calculated for the endpoints: weight, relative organ weights, haematology and clinical biochemistry. The endpoints are standardized, allowing different endpoints of the same study to be compared and providing an overall picture of group differences at a glance. Furthermore, in terms of standardized effect sizes, statistical significance and biological relevance are displayed simultaneously in a graph.
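
    A minimal sketch of the LMM-plus-standardized-effect-size idea is shown below using the statsmodels package; the synthetic body-weight data, group sizes and model formula are illustrative assumptions rather than a reproduction of the authors' analysis.

```python
# Minimal sketch: one linear mixed model over a whole weight trajectory plus a
# standardized effect size at study end. The synthetic data are assumptions for
# illustration, not data from a real 90-day toxicity study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
animals, weeks = 20, 13
rows = []
for a in range(animals):
    group = "treated" if a < animals // 2 else "control"
    base = rng.normal(200.0, 10.0)
    for w in range(weeks):
        gain = 5.0 * w - (2.0 * w if group == "treated" else 0.0)
        rows.append({"animal": a, "group": group, "week": w,
                     "weight": base + gain + rng.normal(0.0, 4.0)})
df = pd.DataFrame(rows)

# One LMM over the whole trajectory (random intercept per animal) instead of a
# separate ANOVA per week.
fit = smf.mixedlm("weight ~ week * group", df, groups=df["animal"]).fit()
print(fit.params)

# Standardized effect size at study end: group difference / pooled SD
final = df[df.week == weeks - 1]
t, c = final[final.group == "treated"].weight, final[final.group == "control"].weight
pooled_sd = np.sqrt((t.var(ddof=1) + c.var(ddof=1)) / 2.0)
print("SES:", (t.mean() - c.mean()) / pooled_sd)
```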

  8. Impact of pulse duration on Ho:YAG laser lithotripsy: fragmentation and dusting performance.

    Science.gov (United States)

    Bader, Markus J; Pongratz, Thomas; Khoder, Wael; Stief, Christian G; Herrmann, Thomas; Nagele, Udo; Sroka, Ronald

    2015-04-01

    In vitro investigations of Ho:YAG laser-induced stone fragmentation were performed to identify potential impacts of different pulse durations on stone fragmentation characteristics. A Ho:YAG laser system (Swiss LaserClast, EMS S.A., Nyon, Switzerland) with selectable long or short pulse mode was tested with regard to its fragmentation and laser hardware compatibility properties. The pulse duration depends on the specific laser parameters. Fragmentation tests (hand-held, hands-free, single-pulse-induced crater) on artificial BEGO stones were performed under reproducible experimental conditions (fibre sizes: 365 and 200 µm; laser settings: 10 W through combinations of 0.5, 1, 2 J/pulse and 20, 10, 5 Hz, respectively). Differences in fragmentation rates between the two pulse duration regimes were detected with statistical significance for defined settings. Hand-held and motivated Ho:YAG laser-assisted fragmentation of BEGO stones showed no significant difference between short pulse mode and long pulse mode, neither in fragmentation rates nor in number of fragments and fragment sizes. Similarly, the results of the hands-free fragmentation tests (with and without anti-repulsion device) showed no statistical differences between long pulse and short pulse modes. The study showed that fragmentation rates for long and short pulse durations at identical power settings remain at a comparable level. Longer holmium laser pulse duration reduces stone pushback. Therefore, longer laser pulses may result in a better clinical outcome of laser lithotripsy and more convenient handling during clinical use without compromising fragmentation effectiveness.

  9. Genetics of Euglossini bees (Hymenoptera) in fragments of the Atlantic Forest in the region of Viçosa, MG

    Directory of Open Access Journals (Sweden)

    A. M. Waldschmidt

    Full Text Available With uncontrolled deforestation, forest fragments remain, which in most cases are in different stages of regeneration and present isolated populations. In the present study we analyzed the genetic patterns of Eulaema nigrita populations in seven Atlantic Forest fragments of different sizes and successional stages in the region of Viçosa, MG. This was done by RAPD molecular markers. We observed that the area of the fragments had no effect on the genetic variability of E. nigrita in the direction predicted by meta-population models. Medium-sized well-preserved woods presented the lowest variability, whereas large and small woods were statistically identical. The evidence supports the notion that rural areas present greater dispersal among fragments, implying greater similarity between the populations of fragments located in rural areas when compared to fragments in urban areas.

  10. Synthesis of MnCO3 nanoparticles by microemulsions: statistical evaluation of the effects of operating conditions on particle size distribution

    Energy Technology Data Exchange (ETDEWEB)

    Pagnanelli, Francesca, E-mail: francesca.pagnanelli@uniroma1.it; Granata, Giuseppe; Moscardini, Emanuela; Toro, Luigi [Sapienza University, Department of Chemistry (Italy)

    2013-09-15

    Manganese carbonate nanoparticles were produced by microemulsion method. The effects of different operating conditions on nanoparticles' morphology, size, and polydispersity were determined. The reference system was made of cetyl trimethylammonium bromide, cyclohexane (as solvent), pentanol (as cosurfactant), and reactants MnCl2 and (NH4)2CO3. Investigated operating conditions were reaction time (30 and 60 min), solvent's type (isooctane and hexane), cosurfactant's dosage (3 and 4.5 g), and reactant's concentration (1.0 and 0.25 mol/L). Produced nanoparticles were characterized by X-ray diffraction, scanning electron microscope, while particle size distribution was determined by image analysis. Significance of investigated factors was assessed by statistical analysis. Cubic-rhombohedral stacked particles with average size of 69 nm and standard deviations of 31 nm were obtained. Morphology was maintained in all the investigated conditions except when cosurfactant's dosage was raised to 4.5 g. Statistical analysis showed that nanoparticles' average size can be significantly increased with respect to the reference condition by augmenting reaction time (+49 nm), by increasing the cosurfactant's dosage (+67 nm) and also by decreasing the reactant's concentration when using hexane as solvent (+110 nm). Conversely, significant diminution of average size was obtained by using isooctane (-30 nm) and hexane (-26 nm) as solvents. Polydispersity of particle size distribution can be significantly diminished by using isooctane (8 nm) and hexane (13 nm) instead of cyclohexane, by decreasing reactant's concentration in the reference condition (10 nm) and by raising cosurfactant's dosage (12 nm).

  11. The relative effects of habitat loss and fragmentation on population genetic variation in the red-cockaded woodpecker (Picoides borealis).

    Science.gov (United States)

    Bruggeman, Douglas J; Wiegand, Thorsten; Fernández, Néstor

    2010-09-01

    The relative influence of habitat loss, fragmentation and matrix heterogeneity on the viability of populations is a critical area of conservation research that remains unresolved. Using simulation modelling, we provide an analysis of the influence both patch size and patch isolation have on abundance, effective population size (N_e) and F_ST. An individual-based, spatially explicit population model based on 15 years of field work on the red-cockaded woodpecker (Picoides borealis) was applied to different landscape configurations. The variation in landscape patterns was summarized using spatial statistics based on O-ring statistics. By regressing demographic and genetic attributes that emerged across the landscape treatments against proportion of total habitat and O-ring statistics, we show that O-ring statistics provide an explicit link between population processes, habitat area, and critical thresholds of fragmentation that affect those processes. Spatial distances among land cover classes that affect biological processes translated into critical scales at which the measures of landscape structure correlated best with genetic indices. Therefore our study infers pattern from process, which contrasts with past studies of landscape genetics. We found that population genetic structure was more strongly affected by fragmentation than population size, which suggests that examining only population size may limit recognition of fragmentation effects that erode genetic variation. If effective population size is used to set recovery goals for endangered species, then habitat fragmentation effects may be sufficiently strong to prevent evaluation of recovery based on the ratio of census:effective population size alone.
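
    For readers unfamiliar with the genetic index used above, the sketch below computes Wright's F_ST for a single biallelic locus as (H_T - H_S)/H_T; the per-patch allele frequencies are illustrative assumptions, and real analyses would average over many loci.

```python
# Sketch of the genetic differentiation index F_ST for a single biallelic locus:
# F_ST = (H_T - H_S) / H_T, where H_S is the mean within-patch expected
# heterozygosity and H_T uses the pooled allele frequency. The allele
# frequencies below are illustrative assumptions.
import numpy as np

p_sub = np.array([0.15, 0.40, 0.60, 0.85])    # allele frequency in each habitat patch
H_S = np.mean(2.0 * p_sub * (1.0 - p_sub))    # mean within-patch heterozygosity
p_bar = p_sub.mean()                           # pooled allele frequency (equal patch sizes)
H_T = 2.0 * p_bar * (1.0 - p_bar)              # total expected heterozygosity
print("F_ST =", round((H_T - H_S) / H_T, 3))
```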

  12. Small scattered fragments do not a dwarf make: biological and archaeological data indicate that prehistoric inhabitants of Palau were normal sized.

    Directory of Open Access Journals (Sweden)

    Scott M Fitzpatrick

    Full Text Available UNLABELLED: Current archaeological evidence from Palau in western Micronesia indicates that the archipelago was settled around 3000-3300 BP by normal sized populations; contrary to recent claims, they did not succumb to insular dwarfism. BACKGROUND: Previous and ongoing archaeological research of both human burial and occupation sites throughout the Palauan archipelago during the last 50 years has produced a robust data set to test hypotheses regarding initial colonization and subsequent adaptations over the past three millennia. PRINCIPAL FINDINGS: Close examination of human burials at the early (ca. 3000 BP) and stratified site of Chelechol ra Orrak indicates that these were normal sized individuals. This is contrary to the recent claim of contemporaneous "small-bodied" individuals found at two cave sites by Berger et al. (2008). As we argue, their analyses are flawed on a number of different analytical levels. First, their sample size is too small and fragmentary to adequately address the variation inherent in modern humans within and outside of Palau. Second, the size and stature of all other prehistoric (both older and contemporaneous) skeletal assemblages found in Palau fall within the normal parameters of modern human variation in the region, indicating this was not a case of insular dwarfism or a separate migratory group. Third, measurements taken on several skeletal elements by Berger et al. may appear to be from smaller-bodied individuals, but the sizes of these people compare well with samples from Chelechol ra Orrak. Last, archaeological, linguistic, and historical evidence demonstrates a great deal of cultural continuity in Palau through time as expected if the same population was inhabiting the archipelago. CONCLUSIONS: Prehistoric Palauan populations were normal sized and exhibit traits that fall within the normal variation for Homo sapiens; they do not support the claims by Berger et al. (2008) that there were smaller

  13. Assessing multi-taxa sensitivity to the human footprint, habitat fragmentation and loss by exploring alternative scenarios of dispersal ability and population size: A simulation approach

    Science.gov (United States)

    Brian K. Hand; Samuel A. Cushman; Erin L. Landguth; John Lucotch

    2014-01-01

    Quantifying the effects of landscape change on population connectivity is compounded by uncertainties about population size and distribution and a limited understanding of dispersal ability for most species. In addition, the effects of anthropogenic landscape change and sensitivity to regional climatic conditions interact to strongly affect habitat...

  14. Bespoke Fragments

    DEFF Research Database (Denmark)

    Kruse Aagaard, Anders

    2016-01-01

    The Ph.D. -project Bespoke Fragments seeks to explore and utilise the space emerging between the potentials of digital drawing and fabrication and the field of materials and their properties and capacities. Within this span, the project is situated in a shuttling between the virtual and the actual......, the emergence of virtual space is no longer limited to the computer's digital world, but extends into the materials' world. Creation and uncertainty are allowed as virtual parameters in both the digital and reality. Based on this notion the project suggests utilising that exact potential to develop...

  15. Organelles and chromatin fragmentation of human umbilical vein endothelial cell influence by the effects of zeta potential and size of silver nanoparticles in different manners.

    Science.gov (United States)

    Tavakol, Shima; Hoveizi, Elham; Kharrazi, Sharmin; Tavakol, Behnaz; Karimi, Shabnam; Rezayat Sorkhabadi, Seyed Mahdi

    2017-06-01

    Recently, it has been disclosed that silver nanoparticles (AgNPs) have the potential to inhibit infection and cancerous cells and eventually penetrate through the injected site into the capillaries due to their small size. This study focuses on the effect of size and zeta potential of bare and citrate-coated AgNPs on human umbilical vein endothelial cells (HUVECs) as main capillary cells. AgNPs with high and low citrate concentrations and with no citrate coating were synthesized using a simple wet chemical method and named AgNP/HC, AgNP/LC, and AgNP, respectively. Citrate-coated particles showed a larger zeta potential of -22 mV and AgNP/HC showed the smallest size of 13.2 nm. UV-Visible spectroscopy and dynamic light scattering (DLS) were performed to evaluate particle size and hydrodynamic diameter of NPs in water and cell culture media. Results indicated that higher concentrations of citrate decreased hydrodynamic diameter and NP agglomeration. Reactive oxygen species (ROS) production of all AgNPs was similar at 28 ppm, although it was significantly higher than in the control group. Their effects on cell membrane and chromosomal structure were studied using LDH measurement and 4',6-diamidino-2-phenylindole (DAPI) staining as well. Results demonstrated that AgNP/LC was less toxic to cells owing to higher IC50 and minimum inhibitory concentration (MIC) values and less release of LDH. Cancerous (human Caucasian neuroblastoma) and immortal cells (mouse embryonic fibroblast cell line) were about twice as sensitive as HUVECs to the toxic effects of AgNPs. DAPI staining results showed that AgNP and AgNP/HC induced the highest and lowest levels of chromosome breakage, respectively. Overall, the results suggest that the viability of HUVECs will be higher than 90% when the viability of cancerous cells is 50% in AgNP chemotherapy.

  16. Increased litter size and super-ovulation rate in congenic C57BL mice carrying a polymorphic fragment of NFR/N origin at the Fecq4 locus of chromosome 9

    DEFF Research Database (Denmark)

    Liljander, Maria; Andersson, Åsa Inga Maria; Holmdahl, Rikard

    2009-01-01

    By analysing N2 mice from a cross between the inbred C57BL strain B10.Q and the NMRI-related NFR/N strain, we recently identified a quantitative trait locus (QTL) influencing litter size. This locus is now denoted Fecq4, and it is present on the murine chromosome 9. In the present paper, we....... In addition, embryos containing the Fecq4 fragment were easy to cultivate in vitro, resulting in a higher yield of embryos reaching the blastocyst stage. We propose that B10.Q.NFR/N-Fecq4 congenic mice may be used to improve breeding or super-ovulation rate in different types of genetically modified mice (on...

  17. Effects of population size on reproductive success of endangered plant Euonymus chloranthoides Yang in fragmented habitat

    Institute of Scientific and Technical Information of China (English)

    胡世俊; 何平; 张春平; 张益锋

    2013-01-01

    Habitat fragmentation can cause a decline in population size, and understanding the effects of population size on reproductive success is of significance for species conservation. Euonymus chloranthoides Yang is an endemic and endangered plant in Chongqing, Southwest China. At present, the E. chloranthoides population has suffered from severe habitat fragmentation, with small population sizes and mostly in an isolated state. In this paper, six E. chloranthoides populations at Beibei in Chongqing were selected to study the effects of population size on their reproductive success. Smaller populations had a lower natural fruiting rate, and the differences in the fruiting rate among the six populations were extremely significant (P = 0.002). There was a significant correlation between population size and fruiting rate (r = 0.837, P = 0.038). Population size also had a significant correlation with the population's seedling ratio (P = 0.045), and the first age-class seedling ratio of the smaller populations was also smaller. The present study indicated that habitat fragmentation induced the decrease of the fruiting rate and seedling ratio of smaller populations of

  18. Isoscaling of projectile-like fragments

    Institute of Scientific and Technical Information of China (English)

    Zhong Chen; Chen Jin-Hui; Guo Wei; Ma Chun-Wang; Ma Guo-Liang; Su Qian-Min; Yan Ting-Zhi; Zuo Jia-Xu; Ma Yu-Gang; Fang De-Qing; Cai Xiang-Zhou; Chen Jin-Gen; Shen Wen-Qing; Tian Wen-Dong; Wang Kun; Wei Yi-Bin

    2006-01-01

    In this paper, the isotopic and isotonic distributions of projectile fragmentation products have been simulated by a modified statistical abrasion-ablation model and the isoscaling behaviour of projectile-like fragments has been discussed. The isoscaling parameters α and β have been extracted, respectively, for hot fragments before evaporation and cold fragments after evaporation. It appears that evaporation has a stronger effect on α than on β. For cold fragments, a monotonic increase of α and |β| with the increase of Z and N is observed. The relation between the isoscaling parameters and the change of isospin content is discussed.
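
    For illustration, the isoscaling parameters are conventionally obtained by fitting the yield ratio R21(N,Z) = Y2/Y1 ∝ exp(αN + βZ) of two reactions; the sketch below performs such a fit on synthetic ratios (the data and the "true" parameter values are assumptions for illustration only).

```python
# Sketch of extracting the isoscaling parameters alpha and beta by fitting
# ln[R21(N,Z)] = ln C + alpha*N + beta*Z to yield ratios of two reactions.
# The yield ratios below are synthetic numbers generated for illustration only.
import numpy as np

rng = np.random.default_rng(2)
N = np.array([5, 6, 7, 8, 9, 10, 6, 7, 8, 9])
Z = np.array([5, 5, 5, 5, 5, 5, 6, 6, 6, 6])
alpha_true, beta_true, lnC = 0.45, -0.52, 0.1
lnR21 = lnC + alpha_true * N + beta_true * Z + rng.normal(0.0, 0.02, N.size)

# Least-squares fit of ln(R21) against N and Z
A = np.column_stack([np.ones(N.size), N, Z])
coef, *_ = np.linalg.lstsq(A, lnR21, rcond=None)
print("alpha =", round(coef[1], 3), " beta =", round(coef[2], 3))
```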

  19. Fragmentation and shear band formation by slow compression of brittle porous media

    Science.gov (United States)

    Pál, Gergő; Jánosi, Zoltán; Kun, Ferenc; Main, Ian G.

    2016-11-01

    Localized fragmentation is an important phenomenon associated with the formation of shear bands and faults in granular media. It can be studied by empirical observation, by laboratory experiment, or by numerical simulation. Here we investigate the spatial structure and statistics of fragmentation using discrete element simulations of the strain-controlled uniaxial compression of cylindrical samples of different finite size. As the system approaches failure, damage localizes in a narrow shear band or synthetic fault "gouge" containing a large number of poorly sorted noncohesive fragments on a broad bandwidth of scales, with properties similar to those of natural and experimental faults. We determine the position and orientation of the central fault plane, the width of the shear band, and the spatial and mass distribution of fragments. The relative width of the shear band decreases as a power law of the system size, and the probability distribution of the angle of the central fault plane converges to around 30 degrees, representing an internal coefficient of friction of 0.7 or so. The mass of fragments is power law distributed, with an exponent that does not depend on scale, and is near that inferred for experimental and natural fault gouges. The fragments are in general angular, with a clear self-affine geometry. The consistency of this model with experimental and field results confirms the critical roles of preexisting heterogeneity, elastic interactions, and finite system size to grain size ratio on the development of shear bands and faults in porous media.

  20. Controllable fabrication of large-area 2D colloidal crystal masks with large size defect-free domains based on statistical experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Yajuan, E-mail: yajuan@kth.se; Jönsson, Pär Göran; Zhao, Zhe, E-mail: zhezhao@kth.se

    2014-09-15

    Highlights: • A 3000 μm² defect-free HCP domain was successfully synthesized. • The relative humidity (RH) and the first rotational speed (v_a) of the dual-speed procedure were identified as the quality-control parameters in spin coating. • 23% RH and v_a = 1000 rpm were identified as the optimal spin-coating processing parameters for the SiO₂ HCP monolayer. • Statistical experimental design was demonstrated to be an efficient strategy for multi-factor processing optimization. - Abstract: A large-area hexagonally packed monolayer of silica spheres with consistent defect-free domains larger than 3000 μm² was prepared by spin coating on glass substrates with the assistance of experimental design and statistical analysis. The ratio of the defect-free monolayer area to the square of the sphere diameter is nearly two times the previously reported maximum value. Several parameters involved in the spin-coating system were investigated. The results indicated that the relative humidity and the rotational speed of the first step of the spin coating had the most important impact on the ordering degree of the prepared monolayer. Furthermore, the ordering degree of the obtained monolayer increased with decreasing relative humidity. In addition, it reached an optimal value when the first rotational speed during spin coating reached 1000 rpm. From this study, it can be concluded that statistical experimental design is an efficient strategy, especially for multi-factor phenomenon studies.

  1. Merging, spinning and bouncing in catastrophic collisions: Consequences for final fragment properties

    Science.gov (United States)

    Michel, P.; Benz, W.; Tanga, P.; Richardson, D. C.

    2001-11-01

    We present new simulations of collisions between asteroids which take into account the production of gravitationally reaccumulated spinning bodies, using a procedure which divides the process into two phases. Using a 3D SPH hydrocode, the fragmentation of the solid target through crack propagation is first computed. Then the simulation of the gravitational evolution and possible reaccumulation of the resulting new fragments is performed using the parallel N-body code pkdgrav. Our first simulations succeeded in reproducing fundamental properties of some well-identified asteroid families. We have now included the possibility of fragments bouncing (instead of strictly merging) when collisions occur at high speed during the gravitational phase. We present comparisons of simulations in three different impact regimes, from highly catastrophic to barely disruptive, using different values of the coefficient of restitution. The largest fragment mass resulting from the reaccumulation of smaller fragments and the ejection velocities of these fragments remain statistically similar for each regime despite the different values of the coefficient of restitution. The final fragment size distribution is also unchanged in the barely disruptive regime, whereas fewer fragments at intermediate sizes seem to be produced at higher impact energy, due to high-speed collisions between fragments during the gravitational phase which prevent merging. Distributions of fragment spins have been analyzed and results are consistent with observations, which supports the idea that disruptive impacts destroy the memory of initial spin. We also observe the natural production of satellite systems around some fragments. We plan to continue our investigations using this procedure and to improve upon the modelling of fundamental physical effects during collisions.

  2. Coal char fragmentation during pulverized coal combustion

    Energy Technology Data Exchange (ETDEWEB)

    Baxter, L.L.

    1995-07-01

    A series of investigations of coal and char fragmentation during pulverized coal combustion is reported for a suite of coals ranging in rank from lignite to low-volatile (lv) bituminous coal under combustion conditions similar to those found in commercial-scale boilers. Experimental measurements are described that utilize identical particle sizing characteristics to determine initial and final size distributions. Mechanistic interpretation of the data suggests that coal fragmentation is an insignificant event and that char fragmentation is controlled by char structure. Chars forming cenospheres fragment more extensively than solid chars. Among the chars that fragment, large particles produce more fine material than small particles. In all cases, coal and char fragmentation are seen to be sufficiently minor as to be relatively insignificant factors influencing fly ash size distribution, particle loading, and char burnout.

  3. Long-term effects of fragmentation and fragment properties on bird species richness in Hawaiian forests

    Science.gov (United States)

    David J. Flaspohler; Christian P. Giardina; Gregory P. Asner; Patrick Hart; Jonathan Price; Cassie Ka’apu Lyons; Xeronimo. Castaneda

    2010-01-01

    Forest fragmentation is a common disturbance affecting biological diversity, yet the impacts of fragmentation on many forest processes remain poorly understood. Forest restoration is likely to be more successful when it proceeds with an understanding of how native and exotic vertebrates utilize forest patches of different size. We used a system of forest fragments...

  4. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics which until now have remained unsolved. Traditional statistical methods based on the idea of infinite sampling often break down when applied to real problems and, depending on the data, can be inefficient, unstable, or even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  5. Fragmentation and Hadronization

    OpenAIRE

    Webber, B. R.

    1999-01-01

    Experimental data, theoretical ideas and models concerning jet fragmentation and the hadronization process are reviewed, concentrating on the following topics: factorization and small-x resummation of fragmentation functions, hadronization models, single-particle yields and spectra in Z decay, comparisons between quark and gluon jets, current and target fragmentation in deep inelastic scattering, heavy quark fragmentation, Bose-Einstein correlations and WW fragmentation.

  6. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  7. A stability analysis of a modified version of the chi-square ratio statistic: implications for equivalence testing of aerodynamic particle size distribution.

    Science.gov (United States)

    Weber, Benjamin; Hochhaus, Guenther; Adams, Wallace; Lionberger, Robert; Li, Bing; Tsong, Yi; Lee, Sau L

    2013-01-01

    Demonstration of equivalence in aerodynamic particle size distribution (APSD; e.g., by comparing cascade impactor (CI) profiles) constitutes one of the key in vitro tests for supporting bioequivalence between test (T) and reference (R) orally inhaled drug products (OIDPs). A chi-square ratio statistic (CSRS) was previously proposed for equivalence testing of CI profiles. However, it was reported that the CSRS could not consistently discriminate between equivalent and inequivalent CI profiles. The objective of the overall project was to develop a robust and sensitive methodology for assessing equivalence of APSD profiles of T and R OIDPs. We propose here a modified version of the CSRS (mCSRS) and systematically evaluated its behavior when T and R CI profiles were identical. Different scenarios comprising CI profiles with different numbers of deposition sites and shapes were generated by Monte-Carlo simulation. For each scenario, the mCSRS was applied to 20,000 independent sets of 30 T and 30 R CI profiles that were identical. Different metrics (including mean and median) of the distribution of 900 mCSRSs (30 T × 30 R) were then evaluated for their suitability as a test statistic (i.e., independent of the number of sites and shape of the CI profile) for APSD equivalence testing. The median of the distribution of 900 mCSRSs (MmCSRS) was one regardless of the number of sites and shape of the CI profile. Hence, the MmCSRS is a robust metric for CI profile equivalence testing when T and R CI profiles are identical and potentially useful for APSD equivalence testing.

  8. Comparative analyses of glass fragments from brittle fracture experiments and volcanic ash particles

    Science.gov (United States)

    Dürig, Tobias; Mele, Daniela; Dellino, Pierfrancesco; Zimanowski, Bernd

    2012-04-01

    Explosive volcanic eruptions are characterized by the rapid fragmentation of a magmatic melt into ash particles. In order to describe the energy dissipation during fragmentation it is important to understand the mechanism of material failure. A quantitative description of fragmentation is only possible under controlled laboratory conditions. Industrial silicate glasses have a high structural affinity with magmatic melts and have the advantage of being transparent, which allows the study of the evolution of fractures by optical methods on a time scale relevant for explosive volcanism. With this aim, a series of low speed edge-on hammer impact experiments on silicate glass targets has been conducted, leading to the generation of fragments in the grain-size spectra of volcanic ash. In order to verify the general transferability of the experimentally generated fragmentation dynamics to volcanic processes, the resulting products were compared, by means of statistical particle-shape analyses, to particles produced by standardized magma fragmentation experiments and to natural ash particles coming from deposits of basaltic and rhyolitic compositions from the 2004 Grimsvötn and the Quaternary Tepexitl tuff-ring eruptions, respectively. Natural ash particles from both Grimsvötn and Tepexitl show significant similarities with experimental fragments of thermally pre-stressed float glasses, indicating a dominant influence of preexisting stresses on particle shape and suggesting analogous fragmentation processes within the studied materials.

  9. A statistical comparison of cirrus particle size distributions measured using the 2-D stereo probe during the TC4, SPARTICUS, and MACPEX flight campaigns with historical cirrus datasets

    Science.gov (United States)

    Schwartz, M. Christian

    2017-08-01

    This paper addresses two straightforward questions. First, how similar are the statistics of cirrus particle size distribution (PSD) datasets collected using the Two-Dimensional Stereo (2D-S) probe to cirrus PSD datasets collected using older Particle Measuring Systems (PMS) 2-D Cloud (2DC) and 2-D Precipitation (2DP) probes? Second, how similar are the datasets when shatter-correcting post-processing is applied to the 2DC datasets? To answer these questions, a database of measured and parameterized cirrus PSDs - constructed from measurements taken during the Small Particles in Cirrus (SPARTICUS); Mid-latitude Airborne Cirrus Properties Experiment (MACPEX); and Tropical Composition, Cloud, and Climate Coupling (TC4) flight campaigns - is used.Bulk cloud quantities are computed from the 2D-S database in three ways: first, directly from the 2D-S data; second, by applying the 2D-S data to ice PSD parameterizations developed using sets of cirrus measurements collected using the older PMS probes; and third, by applying the 2D-S data to a similar parameterization developed using the 2D-S data themselves. This is done so that measurements of the same cloud volumes by parameterized versions of the 2DC and 2D-S can be compared with one another. It is thereby seen - given the same cloud field and given the same assumptions concerning ice crystal cross-sectional area, density, and radar cross section - that the parameterized 2D-S and the parameterized 2DC predict similar distributions of inferred shortwave extinction coefficient, ice water content, and 94 GHz radar reflectivity. However, the parameterization of the 2DC based on uncorrected data predicts a statistically significantly higher number of total ice crystals and a larger ratio of small ice crystals to large ice crystals than does the parameterized 2D-S. The 2DC parameterization based on shatter-corrected data also predicts statistically different numbers of ice crystals than does the parameterized 2D-S, but the

  10. Size of the accretion disk in the gravitationally lensed quasar SDSS J1004+4112 from the statistics of microlensing magnifications

    CERN Document Server

    Fian, C; Hanslmeier, A; Oscoz, A; Serra-Ricart, M; Muñoz, J A; Jiménez-Vicente, J

    2016-01-01

    We present eight monitoring seasons of the four brightest images of the gravitational lens SDSS J1004+4112 observed between December 2003 and October 2010. Using measured time delays for the images A, B and C and the model predicted time delay for image D we have removed the intrinsic quasar variability, finding microlensing events of about 0.5 and 0.7 mag of amplitude in the images C and D. From the statistics of microlensing amplitudes in images A, C, and D, we have inferred the half-light radius (at $\lambda_{\rm rest} = 2407$ Å) for the accretion disk using two different methods, $R_{1/2}=8.7^{+18.5}_{-5.5} \sqrt{M/0.3 M_\odot}$ (histograms product) and $R_{1/2} = 4.2^{+3.2}_{-2.2} \sqrt{M/0.3 M_\odot}$ light-days ($\chi^2$). The results are in agreement within uncertainties with the size predicted from the black hole mass in SDSS J1004+4112 using the thin disk theory.

  11. Statistical study of the location and size of the electron edge of the Low-Latitude Boundary Layer as observed by Cluster at mid-altitudes

    Directory of Open Access Journals (Sweden)

    Y. V. Bogdanova

    2006-10-01

    Full Text Available The nature of particle precipitations at dayside mid-altitudes can be interpreted in terms of the evolution of reconnected field lines. Due to the difference between electron and ion parallel velocities, two distinct boundary layers should be observed at mid-altitudes between the boundary between open and closed field lines and the injections in the cusp proper. At lowest latitudes, the electron-dominated boundary layer, named the "electron edge" of the Low-Latitude Boundary Layer (LLBL), contains soft-magnetosheath electrons but only high-energy ions of plasma sheet origin. A second layer, the LLBL proper, is a mixture of both ions and electrons with characteristic magnetosheath energies. The Cluster spacecraft frequently observe these two boundary layers. We present an illustrative example of a Cluster mid-altitude cusp crossing with an extended electron edge of the LLBL. This electron edge contains 10–200 eV, low-density, isotropic electrons, presumably originating from the solar wind halo population. These are occasionally observed with bursts of parallel and/or anti-parallel-directed electron beams with higher fluxes, which are possibly accelerated near the magnetopause X-line. We then use 3 years of data from mid-altitude cusp crossings (327 events) to carry out a statistical study of the location and size of the electron edge of the LLBL. We find that the equatorward boundary of the LLBL electron edge is observed at 10:00–17:00 magnetic local time (MLT) and is located typically between 68° and 80° invariant latitude (ILAT). The location of the electron edge shows a weak, but significant, dependence on some of the external parameters (solar wind pressure and IMF B_Z component), in agreement with expectations from previous studies of the cusp location. The latitudinal extent of the electron edge has been estimated using new multi-spacecraft techniques. The Cluster tetrahedron crosses the electron and ion boundaries of

  12. A First Survey on the Abundance of Plastics Fragments and Particles on Two Sandy Beaches in Kuching, Sarawak, Malaysia

    Science.gov (United States)

    Noik, V. James; Mohd Tuah, P.

    2015-04-01

    Plastic fragments and particles, as an emerging environmental contaminant and pollutant, have gained scientific attention in recent decades due to their potential threats to biota. This study aims to elucidate the presence, abundance and temporal change of plastic fragments and particles on two selected beaches, namely Santubong and Trombol in Kuching, at two sampling times. Morphological and polymer identification assessments of the recovered plastics were also conducted. Overall statistical comparison revealed that the abundance of plastic fragments/debris at the two sampling stations was not significantly different (p>0.05). Likewise, statistical analysis of the temporal changes in abundance yielded no significant difference for most of the sampling sites at each respective station, except STB-S2. Morphological studies revealed that the physical features of the plastic fragments and debris were diverse in shape, size, color and surface fatigue. FTIR fingerprinting analysis shows that polypropylene and polyethylene were the dominant polymers in the plastic debris on both beaches.

  13. Pollen and gene flow in fragmented habitats

    NARCIS (Netherlands)

    Kwak, Manja M.; Velterop, Odilia; van Andel, Jelte

    1998-01-01

    . Habitat fragmentation affects both plants and pollinators. Habitat fragmentation leads to changes in species richness, population number and size, density, and shape, thus to changes in the spatial arrangement of flowers. These changes influence the amount of food for flower-visiting insects and t

  15. Analysis of Transmissions Scheduling with Packet Fragmentation

    Directory of Open Access Journals (Sweden)

    Nir Menakerman

    2001-12-01

    Full Text Available We investigate a scheduling problem in which packets, or datagrams, may be fragmented. While there are a few applications of scheduling with datagram fragmentation, our model of the problem is derived from a scheduling problem present in data-over-CATV networks. In the scheduling problem, datagrams of variable lengths must be assigned (packed) into fixed-length time slots. One of the capabilities of the system is the ability to break a datagram into several fragments. When a datagram is fragmented, extra bits are added to the original datagram to enable the reassembly of all the fragments. We convert the scheduling problem into the problem of bin packing with item fragmentation, which we define in the following way: we are asked to pack a list of items into a minimum number of unit-capacity bins. Each item may be fragmented, in which case overhead units are added to the size of every fragment. The cost associated with fragmentation renders the problem NP-hard; therefore, an approximation algorithm is needed. We define a version of the well-known Next-Fit algorithm, capable of fragmenting items, and investigate its performance. We present both worst-case and average-case results and compare them to the case where fragmentation is not allowed.
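
    As a concrete illustration of the packing model described in this record, the following Python sketch implements a Next-Fit-style rule that splits an item across bins and charges a fixed overhead to every fragment of a split item. It is a minimal reconstruction written for this summary, not the authors' algorithm; the function name next_fit_frag, the default overhead of 0.05, and the unit capacity are illustrative assumptions.

      def next_fit_frag(items, overhead=0.05, capacity=1.0):
          """Next-Fit style packing: an item that does not fit in the open bin
          is split across bins, and every fragment of a split item is charged
          `overhead` extra capacity (an assumed reassembly header)."""
          bins, free = [[]], capacity
          for size in items:
              if size <= free:                   # fits whole: no overhead added
                  bins[-1].append(size)
                  free -= size
                  continue
              remaining = size                   # item must be fragmented
              while remaining > 0:
                  if free <= overhead:           # no useful space left: open a new bin
                      bins.append([])
                      free = capacity
                  piece = min(remaining + overhead, free)
                  bins[-1].append(piece)
                  free -= piece
                  remaining -= piece - overhead  # only the payload reduces the item
          return bins

      # toy usage: datagram sizes given as fractions of a time slot
      print(len(next_fit_frag([0.6, 0.7, 0.3, 0.9, 0.2], overhead=0.05)))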

  16. The role of pebble fragmentation in planetesimal formation II. Numerical simulations

    CERN Document Server

    Jansson, Karl Wahlberg; Syed, Mohtashim Bukhari; Blum, Jürgen

    2016-01-01

    Some scenarios for planetesimal formation go through a phase of collapse of gravitationally bound clouds of mm-cm-sized pebbles. Such clouds can form for example through the streaming instability in protoplanetary disks. We model the collapse process with a statistical model to obtain the internal structure of planetesimals with solid radii between 10 and 1,000 km. In the collapse, pebbles collide and, depending on relative speed, collisions have different outcomes. A mixture of particle sizes inside a planetesimal leads to better packing capabilities and higher densities. In this paper we apply results from new laboratory experiments of dust aggregate collisions (presented in a companion paper) to model collision outcomes. We find that the internal structure of a planetesimal is strongly dependent on both its mass and the applied fragmentation model. Low-mass planetesimals have no/few fragmenting pebble collisions in the collapse phase and end up as porous pebble-piles. The amount of fragmenting collisions i...

  17. Complexity, Diminishing Marginal Returns, and Serial Mesopotamian Fragmentation

    Directory of Open Access Journals (Sweden)

    William R. Thompson

    2015-08-01

    Full Text Available Following up on an earlier paper demonstrating statistically significant relationships between measures of recurring political-economic crises (hinterland incursions, trade collapses, economic contractions, and regime transitions) and a measure of climate deterioration (the interaction of falling Tigris-Euphrates river levels and years of warming/drying), the inter-relationships among these variables are examined more closely for the 3400–1000 BCE period. Theoretically focused on a test of Tainter’s diminishing marginal return theory of societal collapse, additional indicators are introduced encompassing population (urban population size, urban population growth rate as a proxy for diminishing marginal returns), two measures of centralization/fragmentation (including imperial size), and the indicators used for the climate interaction term in the earlier paper. The multivariate logit outcome for interactions among and between the 11 variables reinforces the earlier findings linking climate deterioration to political-economic crises, extends the climate deterioration linkage to fragmentation and population decline, and finds relatively strong support for the Tainter-derived expectation that diminishing marginal returns and fragmentation are closely linked but that both are less closely linked to recurring political-economic crises than might otherwise have been anticipated.

  18. Single chain Fab (scFab fragment

    Directory of Open Access Journals (Sweden)

    Brenneis Mariam

    2007-03-01

    Full Text Available Abstract Background The connection of the variable part of the heavy chain (VH) and the variable part of the light chain (VL) by a peptide linker to form a consecutive polypeptide chain (single-chain antibody, scFv) was a breakthrough for the functional production of antibody fragments in Escherichia coli. Being double the size of fragment variable (Fv) fragments and requiring assembly of two independent polypeptide chains, functional Fab fragments are usually produced with significantly lower yields in E. coli. An antibody design combining the stability and assay compatibility of the fragment antigen binding (Fab) with the high-level bacterial expression of single-chain Fv fragments would be desirable. The desired antibody fragment should be suitable both for expression as a soluble antibody in E. coli and for antibody phage display. Results Here, we demonstrate that the introduction of a polypeptide linker between the fragment difficult (Fd) and the light chain (LC), resulting in the formation of a single-chain Fab fragment (scFab), can lead to improved production of functional molecules. We tested the impact of various linker designs and modifications of the constant regions on both phage display efficiency and the yield of soluble antibody fragments. A scFab variant without cysteines (scFabΔC) connecting the constant part 1 of the heavy chain (CH1) and the constant part of the light chain (CL) was best suited for phage display and production of soluble antibody fragments. Besides the expression system E. coli, the new antibody format was also expressed in Pichia pastoris. Monovalent and divalent fragments (DiFabodies) as well as multimers were characterised. Conclusion A new antibody design offers the generation of bivalent Fab derivatives for antibody phage display and production of soluble antibody fragments. This antibody format is of particular value for high-throughput proteome binder generation projects, due to the avidity effect and the possible use of

  19. Fragmentation of relativistic oxygen nuclei in interactions with a proton

    CERN Document Server

    Glagolev, V V; Lipin, V D; Lutpullaev, S L; Olimov, K K; Yuldashev, A A; Yuldashev, B S; Olimov, Kh.K.

    2001-01-01

    Data from an investigation of inelastic interactions of 16O nuclei with a proton at 3.25 A GeV/c momentum, obtained with the bubble chamber method, are presented. Individual characteristics such as the isotopic composition of fragments and the topological cross sections of the fragmentation channels are given. The processes of light-fragment formation and the breakup of the 16O nucleus into multicharged fragments have been investigated. The experimental data are compared with calculations by the statistical multifragmentation model.

  20. Fragmentation measurement using image processing

    Directory of Open Access Journals (Sweden)

    Farhang Sereshki

    2016-12-01

    Full Text Available In this research, the existing problems in fragmentation measurement are first reviewed with a view to its fast and reliable evaluation, and the available methods for evaluating blast results are outlined. The errors produced in recognizing rock fragments in computer-aided methods, as well as the importance of determining fragment sizes in image analysis methods, are described. After reviewing the previous work, an algorithm is proposed for the automated determination of rock particle boundaries in the Matlab software. This method can automatically determine particle boundaries in minimal time. The results of the proposed method are compared with those of the Split Desktop and GoldSize software in both automated and manual modes. Comparing the curves extracted from the different methods reveals that the proposed approach is accurate for measuring the size distribution of laboratory samples, whereas the manual determination of boundaries in the conventional software is very time-consuming, and the results of automated netting of fragments differ considerably from the real values due to errors in separating the objects.
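
    For readers who want to experiment with this kind of automated boundary detection, the sketch below reproduces the standard distance-transform-plus-watershed recipe in Python with scikit-image rather than the Matlab implementation used in the paper; the file path, the min_distance marker spacing, and the simple Otsu threshold are illustrative assumptions.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import io, filters, feature, segmentation, measure

      def fragment_sizes(image_path, min_distance=20):
          """Threshold the image, seed markers at distance-transform maxima,
          run watershed, and return per-fragment equivalent diameters (pixels)."""
          gray = io.imread(image_path, as_gray=True)
          binary = gray > filters.threshold_otsu(gray)       # fragments as foreground
          distance = ndi.distance_transform_edt(binary)
          coords = feature.peak_local_max(distance, min_distance=min_distance,
                                          labels=binary)
          seeds = np.zeros(distance.shape, dtype=bool)
          seeds[tuple(coords.T)] = True
          markers, _ = ndi.label(seeds)
          labels = segmentation.watershed(-distance, markers, mask=binary)
          return [r.equivalent_diameter for r in measure.regionprops(labels)]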

  1. Scaling properties of crack branching and brittle fragmentation

    Directory of Open Access Journals (Sweden)

    Uvarov S.

    2011-01-01

    Full Text Available The present study is focused on the correlation of scaling properties of crack branching and brittle fragmentation with damage accumulation and a change in the fracture mechanism. The experimental results obtained from the glass fragmentation tests indicate that the size distribution of fragments has a fractal character and is described by a power law.
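
    A common way to quantify the power-law character reported here is a maximum-likelihood fit of the tail exponent. The sketch below uses the standard continuous MLE, alpha = 1 + n / sum(ln(x_i/x_min)), in Python; the choice of x_min is left to the user and the function name is ours, not the authors'.

      import numpy as np

      def power_law_mle(sizes, x_min):
          """Continuous maximum-likelihood estimate of the exponent alpha in
          p(x) ~ x**(-alpha) for x >= x_min; returns (alpha, standard error)."""
          x = np.asarray(sizes, dtype=float)
          x = x[x >= x_min]
          n = x.size
          alpha = 1.0 + n / np.log(x / x_min).sum()
          return alpha, (alpha - 1.0) / np.sqrt(n)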

  2. Molecular energies from an incremental fragmentation method

    Science.gov (United States)

    Meitei, Oinam Romesh; Heßelmann, Andreas

    2016-02-01

    The systematic molecular fragmentation method by Collins and Deev [J. Chem. Phys. 125, 104104 (2006)] has been used to calculate total energies and relative conformational energies for a number of small and extended molecular systems. In contrast to the original approach by Collins, we have tested the accuracy of the fragmentation method by utilising an incremental scheme in which the energies at the lowest level of the fragmentation are calculated at an accurate quantum chemistry level while lower-cost methods are used to correct the low-level energies through a high-level fragmentation. In this work, the fragment energies at the lowest level of fragmentation were calculated using the random-phase approximation (RPA) and two recently developed extensions to the RPA while the incremental corrections at higher levels of the fragmentation were calculated using standard density functional theory (DFT) methods. The complete incremental fragmentation method has been shown to reproduce the supermolecule results with very good accuracy, almost independently of the molecular type, size, or type of decomposition. The fragmentation method has also been used in conjunction with the DFT-SAPT (symmetry-adapted perturbation theory) method which enables a breakdown of the total nonbonding energy contributions into individual interaction energy terms. Finally, the potential problems of the method connected with the use of capping hydrogen atoms are analysed and two possible solutions are supplied.

  3. CLP-based protein fragment assembly

    CERN Document Server

    Palu', Alessandro Dal; Fogolari, Federico; Pontelli, Enrico; 10.1017/S1471068410000372

    2010-01-01

    The paper investigates a novel approach, based on Constraint Logic Programming (CLP), to predict the 3D conformation of a protein via fragment assembly. The fragments are extracted by a preprocessor (also developed for this work) from a database of known protein structures that clusters and classifies the fragments according to similarity and frequency. The problem of assembling fragments into a complete conformation is mapped to a constraint solving problem and solved using CLP. The constraint-based model uses a medium-discretization-degree Ca-side chain centroid protein model that offers efficiency and a good approximation for space filling. The approach adapts existing energy models to the protein representation used and applies a large neighborhood search strategy. The results show the feasibility and efficiency of the method. The declarative nature of the solution allows future extensions to be included, e.g., fragments of different sizes for better accuracy.

  4. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    Science.gov (United States)

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  5. Habitat fragmentation effects depend on complex interactions between population size and dispersal ability: Modeling influences of roads, agriculture and residential development across a range of life-history characteristics [chapter 20

    Science.gov (United States)

    Samuel A. Cushman; Bradley W. Compton; Kevin McGarigal

    2010-01-01

    Habitat loss and fragmentation are widely believed to be the most important drivers of extinction (Leakey and Lewin 1995). The habitats in which organisms live are spatially structured at a number of scales, and these patterns interact with organism perception and behavior to drive population dynamics and community structure (Johnson et al. 1992). Anthropogenic habitat...

  6. Mechanisms in Impact Fragmentation

    OpenAIRE

    Wittel, Falk K.; Carmona, Humberto A.; Kun, Ferenc; Herrmann, Hans J.

    2015-01-01

    The brittle fragmentation of spheres is studied numerically by a 3D Discrete Element Model. Large scale computer simulations are performed with models that consist of agglomerates of many spherical particles, interconnected by beam-truss elements. We focus on a detailed description of the fragmentation process and study several fragmentation mechanisms involved. The evolution of meridional cracks is studied in detail. These cracks are found to initiate in the inside of the specimen with quasi...

  7. Contextual analysis of fragmentation of the anthropomorphic figurines from the Late Neolithic site of Selevac

    Directory of Open Access Journals (Sweden)

    Marko Porčić

    2016-03-01

    Full Text Available The biographical approach to material culture and the hypothesis of deliberate fragmentation of anthropomorphic figurines are used in this paper to deduce a hypothesis that there should be an association between particular fragmentation categories and context types in the archaeological record of the Late Neolithic settlements in Central Balkans. This hypothesis is tested using published data from the site of Selevac by performing correspondence analysis and chi-square test on a contingency table in which categories of fragmentation are cross-tabulated with context types. The results are statistically significant, suggesting that complete figurines are associated with houses while transversely broken figurines are associated with pits. There is also evidence that figurines were broken differentially in respect to their original size.
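
    The contingency-table test mentioned here can be reproduced in a few lines; the sketch below uses scipy with made-up counts purely to show the mechanics (the real cross-tabulation of fragmentation categories and context types is in the published data, not reproduced here).

      import numpy as np
      from scipy.stats import chi2_contingency

      # Hypothetical counts: rows = fragmentation category, columns = context type
      # (houses, pits); the numbers are illustrative only.
      table = np.array([
          [30,  5],   # complete figurines
          [12, 25],   # transversely broken figurines
          [18, 20],   # other fragmentation categories
      ])

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")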

  8. Multivariate statistic and time series analyses of grain-size data in Quaternary sediments of Lake El'gygytgyn, NE Russia

    Directory of Open Access Journals (Sweden)

    A. Francke

    2013-01-01

    Full Text Available Lake El'gygytgyn, located in the Far East Russian Arctic, was formed by a meteorite impact about 3.58 Ma ago. In 2009, the ICDP Lake El'gygytgyn Drilling Project obtained a continuous sediment sequence of the lacustrine deposits and the upper part of the impact breccia. Here, we present grain-size data of the past 2.6 Ma. General downcore grain-size variations yield coarser sediments during warm periods and finer ones during cold periods. According to Principal Component Analyses (PCA), the climate-dependent variations in grain-size distributions mainly occur in the coarse silt and very fine silt fraction. During interglacial periods, accumulation of coarser grain sizes in the lake center is supposed to be caused by redistribution of clastic material by a wind-induced current pattern during the ice-free period. Sediment supply to the lake is triggered by the thickness of the active layer in the catchment, and the availability of water as transport medium. During glacial periods, sedimentation at Lake El'gygytgyn is hampered by the occurrence of a perennial ice-cover with sedimentation being restricted to seasonal moats and vertical conduits through the ice. Thus, the summer temperature predominantly triggers transport of coarse material into the lake center. Time series analysis carried out to gain insight into the frequency content of the grain-size data showed grain-size variations predominantly in the Milankovitch eccentricity, obliquity and precession bands. Variations in the relative power of these three oscillation bands during the Quaternary imply that climate conditions at Lake El'gygytgyn are mainly triggered by global glacial/interglacial variations (eccentricity, obliquity) and local insolation forcing (precession), respectively.
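
    As an illustration of the PCA step described above, the sketch below applies scikit-learn's PCA to a samples-by-grain-size-class matrix; the random matrix stands in for the measured volume-percent distributions, and the choice of three components is an assumption made for the example.

      import numpy as np
      from sklearn.decomposition import PCA

      # Placeholder data: rows = sediment samples, columns = grain-size classes
      # (volume percent per class); real input would be the downcore measurements.
      rng = np.random.default_rng(0)
      grain_size = rng.random((500, 40))
      grain_size /= grain_size.sum(axis=1, keepdims=True)   # normalise each sample

      pca = PCA(n_components=3)
      scores = pca.fit_transform(grain_size)     # downcore PC scores (climate signal)
      loadings = pca.components_                 # which size fractions drive each PC
      print(pca.explained_variance_ratio_)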

  9. A Cost-Benefit Analysis of a Collections Inventory Project: A Statistical Analysis of Inventory Data from a Medium-Sized Academic Library

    Science.gov (United States)

    Sung, Jan S.; Whisler, John A.; Sung, Nackil

    2009-01-01

    Using an electronic shelf-reading system a cost-benefit analysis was conducted of an inventory/shelf-reading project in a medium-sized academic library. Analyses include time spent, cataloging discrepancies, books found with active statuses, mis-shelving rate and distance, and subsequent use of found books. Correctly re-shelving "missing"…

  10. Enhancing the interpretation of statistical P values in toxicology studies: implementation of linear mixed models (LMMs) and standardized effect sizes (SESs)

    NARCIS (Netherlands)

    Schmidt, Kerstin; Schmidtke, Jörg; Kohl, Christian; Wilhelm, Ralf; Schiemann, Joachim; Voet, van der Hilko; Steinberg, Pablo

    2016-01-01

    In this paper, we compare the traditional ANOVA approach to analysing data from 90-day toxicity studies with a more modern LMM approach, and we investigate the use of standardized effect sizes. The LMM approach is used to analyse weight or feed consumption data. When compared to the week-by-week

  11. A statistical mixture model for estimating the proportion of unreduced pollen grains in perennial ryegrass (Lolium perenne L.) via the size of pollen grains

    NARCIS (Netherlands)

    Jansen, R.C.; Nijs, A.P.M. den

    1993-01-01

    The size of pollen grains is commonly used to indicate the ploidy level of pollen grains. In this paper observations of the diameter of pollen grains are evaluated from one diploid accession of perennial ryegrass (Lolium perenne L.), which was expected to produce diploid (unreduced) pollen grains in
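
    A minimal version of the idea (not the authors' exact estimator) is a two-component mixture fitted to the pollen diameters, with the weight of the larger-diameter component read as the proportion of unreduced pollen; the sketch below uses a Gaussian mixture from scikit-learn, and the function name is ours.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      def unreduced_pollen_fraction(diameters):
          """Fit a two-component 1-D Gaussian mixture to pollen diameters and
          return the mixing weight of the larger-diameter component, used here
          as a rough estimate of the proportion of unreduced (2n) pollen."""
          x = np.asarray(diameters, dtype=float).reshape(-1, 1)
          gm = GaussianMixture(n_components=2, random_state=0).fit(x)
          larger = int(np.argmax(gm.means_.ravel()))
          return float(gm.weights_[larger])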

  12. Fragmentation in the biopharmaceutical industry.

    Science.gov (United States)

    Goldsmith, Andrew D; Varela, Francisco E

    2017-02-01

    The large number of biopharmaceutical mergers and acquisitions (M&A) that occurred over the past decade has generated questions about whether the industry is consolidating around too-few players, negatively impacting both the number of medicines developed and overall innovation. However, closer examination of the level of biopharmaceutical consolidation by prescription sales shows that the industry was more fragmented in 2015 than in 2003. The trend towards increasing fragmentation is also observed across noncommercial and independent metrics over the same time period. The number and size of M&A deals has masked an active and competitive marketplace in which market growth and the number of companies entering the market exceeded the apparent reduction in the number of players caused by acquisitions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Fragmentation of neutral carbon clusters formed by high velocity atomic collision; Fragmentation d'agregats de carbone neutres formes par collision atomique a haute vitesse

    Energy Technology Data Exchange (ETDEWEB)

    Martinet, G

    2004-05-01

    The aim of this work is to understand the fragmentation of small neutral carbon clusters formed by high-velocity atomic collisions on an atomic gas. In this experiment, the main deexcitation channel of the neutral clusters formed by electron capture with ionic species is fragmentation. To measure the fragmentation channels, a new detection tool based on the shape analysis of the current pulses delivered by semiconductor detectors has been developed. For the first time, all branching ratios of neutral carbon clusters are measured unambiguously for cluster sizes up to 10 atoms. The measurements have been compared to a statistical model in the microcanonical ensemble (Microcanonical Metropolis Monte Carlo). This model requires various structural properties of the carbon clusters. These data have been calculated with Density Functional Theory (DFT-B3LYP) to find the geometries of the clusters and then with the Coupled Cluster (CCSD(T)) formalism to obtain the dissociation energies and other quantities needed for the fragmentation calculations. The experimental branching ratios have been compared to the fragmentation model, which made it possible to determine the energy distribution deposited in the collision. Finally, a specific cluster effect has been found, namely a large population of excited states. This behaviour is completely different from the atomic carbon case, for which electron capture into the ground state predominates. (author)

  14. Arc Statistics

    CERN Document Server

    Meneghetti, M; Dahle, H; Limousin, M

    2013-01-01

    The existence of an arc statistics problem was at the center of a strong debate in the last fifteen years. With the aim to clarify if the optical depth for giant gravitational arcs by galaxy clusters in the so called concordance model is compatible with observations, several studies were carried out which helped to significantly improve our knowledge of strong lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and the frequency of strong lensing events like gravitational arcs turned out to be a potentially very powerful tool to trace the structure formation. However, given the limited size of observational and theoretical data-sets, the power of arc statistics as a cosmological tool has been only minimally exploited so far. On the other hand, the last years were characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...

  15. Injury Statistics

    Science.gov (United States)


  16. Cosmic Statistics of Statistics

    OpenAIRE

    Szapudi, I.; Colombi, S.; Bernardeau, F.

    1999-01-01

    The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...

  17. Gut microbiome diversity influenced more by the Westernized dietary regime than the body mass index as assessed using effect size statistic.

    Science.gov (United States)

    Davis, Shannon C; Yadav, Jagjit S; Barrow, Stephanie D; Robertson, Boakai K

    2017-08-01

    Human gut microbiome dysbiosis has been associated with the onset of metabolic diseases and disorders. However, the critical factors leading to dysbiosis are poorly understood. In this study, we provide further evidence of the association of diet type and body mass index (BMI) with the taxonomic structure of the gut microbiota, and of their relative influence on the causation of gut microbiome dysbiosis. The study included randomly selected Alabama residents (n = 81), including females (n = 45) and males (n = 36). The demographic data included age (33 ± 13.3 years), height (1.7 ± 0.11 meters), and weight (82.3 ± 20.6 kg). The mean BMI was 28.3 ± 7.01, equating to an overweight BMI category. A cross-sectional case-control design encompassing the newly recognized effect size approach to bioinformatics analysis was used to analyze data from donated stool samples and accompanying nutrition surveys. We investigated the microbiome variations in the Bacteroidetes-Firmicutes ratio relative to BMI, food categories, and dietary groups at stratified abundance percentages of BMI, food categories, and dietary groups (Westernized or healthy). The Pearson correlation coefficient, as an indication of effect size across alpha diversity indices, was used to test the hypothesis (H0): increased BMI has a greater effect on taxonomic diversity than Westernized diet type, against (Ha): increased BMI does not have a greater effect on taxonomic diversity than Westernized diet type. In conclusion, we rejected H0, as our results demonstrated that Westernized diet type had an effect size of 0.22, posing a greater impact upon gut microbiota diversity than an increased BMI with an effect size of 0.16. This implies that a Westernized diet is a critical factor in causing dysbiosis as compared to an overweight or obese body mass index. © 2017 The Authors. MicrobiologyOpen published by John Wiley & Sons Ltd.
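
    The effect-size comparison reported here boils down to two Pearson correlations against an alpha-diversity index; the sketch below shows that computation with scipy, with variable names (shannon, bmi, westernized) chosen for illustration rather than taken from the study's data.

      import numpy as np
      from scipy.stats import pearsonr

      def diversity_effect_sizes(shannon, bmi, westernized):
          """Pearson r of an alpha-diversity index against BMI and against a
          0/1 Westernized-diet indicator; |r| is read as the effect size."""
          r_bmi, _ = pearsonr(shannon, bmi)
          r_diet, _ = pearsonr(shannon, westernized)
          return {"BMI": abs(r_bmi), "Westernized diet": abs(r_diet)}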

  18. Coagulation–fragmentation for a finite number of particles and application to telomere clustering in the yeast nucleus

    Energy Technology Data Exchange (ETDEWEB)

    Hozé, Nathanaël [Ecole Normale Supérieure, Institute of Biology (IBENS), Group of Computational Biology and Applied Mathematics, 46 rue d' Ulm, 75005 Paris (France); Holcman, David, E-mail: holcman@biologie.ens.fr [Ecole Normale Supérieure, Institute of Biology (IBENS), Group of Computational Biology and Applied Mathematics, 46 rue d' Ulm, 75005 Paris (France); Department of Applied Mathematics, UMR 7598, Université Pierre et Marie Curie 187, 75252 Paris Cedex 05 (France)

    2012-01-30

    We develop a coagulation–fragmentation model to study a system composed of a small number of stochastic objects moving in a confined domain, that can aggregate upon binding to form local clusters of arbitrary sizes. A cluster can also dissociate into two subclusters with a uniform probability. To study the statistics of clusters, we combine a Markov chain analysis with a partition number approach. Interestingly, we obtain explicit formulas for the size and the number of clusters in terms of hypergeometric functions. Finally, we apply our analysis to study the statistical physics of telomeres (ends of chromosomes) clustering in the yeast nucleus and show that the diffusion–coagulation–fragmentation process can predict the organization of telomeres. -- Highlights: ► We develop a coagulation–fragmentation model to study a system composed of a small number of stochastic particles. ► The stochastic objects are moving in a confined domain. ► We apply our analysis to study the statistical physics of telomeres (ends of chromosomes) clustering in the yeast nucleus. ► We show that the diffusion–coagulation–fragmentation process can predict the organization of telomeres in yeast.

  19. String fragmentation; La fragmentation des cordes

    Energy Technology Data Exchange (ETDEWEB)

    Drescher, H.J.; Werner, K. [Laboratoire de Physique Subatomique et des Technologies Associees - SUBATECH, Centre National de la Recherche Scientifique, 44 - Nantes (France)

    1997-10-01

    The classical string model is used in VENUS as a fragmentation model. In the soft domain, simple 2-parton strings are sufficient, whereas at higher energies up to LHC the perturbative regime of QCD gives additional soft gluons, which are mapped onto the string as so-called kinks, energy singularities between the leading partons. The kinky string model is chosen to handle the fragmentation of these strings by applying the Lorentz-invariant area law. The 'kinky strings' model, corresponding to the perturbative gluons coming from pQCD, takes this effect into account by treating the partons and gluons on the same footing. The decay law is always the Artru-Mennessier area law, which is the most realistic since it is invariant under Lorentz and gauge transformations. For low-mass strings, a manipulation of the rupture point is necessary if the string already corresponds to an elementary particle determined by its mass and flavor content. By means of this fragmentation model it will be possible to simulate data from future experiments at the LHC and RHIC.

  20. Limit theorems for fragmentation processes with immigration

    CERN Document Server

    Knobloch, Robert

    2012-01-01

    In this paper we extend two limit theorems which were recently obtained for fragmentation processes to such processes with immigration. More precisely, in the setting with immigration we consider a limit theorem for the process counted with a random characteristic as well as the asymptotic behaviour of an empirical measure associated with the stopping line corresponding to the first blocks, in their respective line of descent, that are smaller than a given size. In addition, we determine the asymptotic decay rate of the size of the largest block in a homogeneous fragmentation process with immigration. The techniques used to prove these results are based on submartingale arguments.

  1. Fragmentation of monoclonal antibodies

    Science.gov (United States)

    Vlasak, Josef

    2011-01-01

    Fragmentation is a degradation pathway ubiquitously observed in proteins despite the remarkable stability of peptide bond; proteins differ only by how much and where cleavage occurs. The goal of this review is to summarize reports regarding the non-enzymatic fragmentation of the peptide backbone of monoclonal antibodies (mAbs). The sites in the polypeptide chain susceptible to fragmentation are determined by a multitude of factors. Insights are provided on the intimate chemical mechanisms that can make some bonds prone to cleavage due to the presence of specific side-chains. In addition to primary structure, the secondary, tertiary and quaternary structures have a significant impact in modulating the distribution of cleavage sites by altering local flexibility, accessibility to solvent or bringing in close proximity side chains that are remote in sequence. This review focuses on cleavage sites observed in the constant regions of mAbs, with special emphasis on hinge fragmentation. The mechanisms responsible for backbone cleavage are strongly dependent on pH and can be catalyzed by metals or radicals. The distribution of cleavage sites are different under acidic compared to basic conditions, with fragmentation rates exhibiting a minimum in the pH range 5–6; therefore, the overall fragmentation pattern observed for a mAb is a complex result of structural and solvent conditions. A critical review of the techniques used to monitor fragmentation is also presented; usually a compromise has to be made between a highly sensitive method with good fragment separation and the capability to identify the cleavage site. The effect of fragmentation on the function of a mAb must be evaluated on a case-by-case basis depending on whether cleavage sites are observed in the variable or constant regions, and on the mechanism of action of the molecule. PMID:21487244

  2. Fragmentation of Care in Ectopic Pregnancy.

    Science.gov (United States)

    Stulberg, Debra B; Dahlquist, Irma; Jarosch, Christina; Lindau, Stacy T

    2016-05-01

    Ectopic pregnancy is an important cause of maternal morbidity and mortality. Women who experience fragmented care may undergo unnecessary delays to diagnosis and treatment. Based on ectopic pregnancy cases observed in clinical practice that raised our concern about fragmentation of care, we designed an exploratory study to describe the number, characteristics, and outcomes of fragmented care among patients with ectopic pregnancy at one urban academic hospital. Chart review with descriptive statistics. Fragmented care was defined as a patient being evaluated at an outside facility for possible ectopic pregnancy and transferred, referred, or discharged before receiving care at the study institution. Of 191 women seen for possible or definite ectopic pregnancy during the study period, 42 (22 %) met the study definition of fragmented care. The study was under-powered to observe statistically significant differences across groups, but we found concerning, non-significant trends: patients with fragmented care were more likely to be Medicaid recipients (65.9 vs. 58.8 %) and to experience a complication (23.8 vs. 18.1 %) compared to those with non-fragmented care. Most patients (n = 37) received no identifiable treatment prior to transfer and arrived to the study hospital with no communication to the receiving hospital from the outside provider (n = 34). Nine patients (21 %) presented with ruptured ectopic pregnancies. The fragmentation we observed in our study may contribute to previously identified socio-economic disparities in ectopic pregnancy outcomes. If future research confirms these findings, health information exchanges and regional coordination of care may be important strategies for reducing maternal mortality.

  3. Coagulation-fragmentation for a finite number of particles and application to telomere clustering in the yeast nucleus

    Science.gov (United States)

    Hozé, Nathanaël; Holcman, David

    2012-01-01

    We develop a coagulation-fragmentation model to study a system composed of a small number of stochastic objects moving in a confined domain, that can aggregate upon binding to form local clusters of arbitrary sizes. A cluster can also dissociate into two subclusters with a uniform probability. To study the statistics of clusters, we combine a Markov chain analysis with a partition number approach. Interestingly, we obtain explicit formulas for the size and the number of clusters in terms of hypergeometric functions. Finally, we apply our analysis to study the statistical physics of telomeres (ends of chromosomes) clustering in the yeast nucleus and show that the diffusion-coagulation-fragmentation process can predict the organization of telomeres.
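
    To make the dynamics concrete, the following toy Monte Carlo sketch mimics the finite-particle coagulation-fragmentation process described in this record: clusters merge pairwise, and a cluster of size k >= 2 dissociates at a uniformly chosen split point. The step rule, rates, and parameter values are illustrative assumptions; the paper itself derives the cluster statistics analytically.

      import random

      def simulate_clusters(n_particles=100, steps=20000, p_coag=0.5, seed=1):
          """Toy Markov chain: at each step either two randomly chosen clusters
          merge (probability p_coag) or a randomly chosen cluster of size >= 2
          splits into two subclusters at a uniformly chosen point."""
          random.seed(seed)
          clusters = [1] * n_particles              # start from free particles
          for _ in range(steps):
              if random.random() < p_coag and len(clusters) >= 2:
                  i, j = random.sample(range(len(clusters)), 2)
                  merged = clusters[i] + clusters[j]
                  clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
                  clusters.append(merged)
              else:
                  splittable = [k for k, c in enumerate(clusters) if c >= 2]
                  if splittable:
                      k = random.choice(splittable)
                      cut = random.randint(1, clusters[k] - 1)   # uniform split point
                      clusters.append(clusters[k] - cut)
                      clusters[k] = cut
          return clusters   # inspect len(clusters), max(clusters), etc.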

  4. Water cluster fragmentation probed by pickup experiments

    Science.gov (United States)

    Huang, Chuanfu; Kresin, Vitaly V.; Pysanenko, Andriy; Fárník, Michal

    2016-09-01

    Electron ionization is a common tool for the mass spectrometry of atomic and molecular clusters. Any cluster can be ionized efficiently by sufficiently energetic electrons, but concomitant fragmentation can seriously obstruct the goal of size-resolved detection. We present a new general method to assess the original neutral population of the cluster beam. Clusters undergo a sticking collision with a molecule from a crossed beam, and the velocities of neat and doped cluster ion peaks are measured and compared. By making use of longitudinal momentum conservation, one can reconstruct the sizes of the neutral precursors. Here this method is applied to H2O and D2O clusters in the detected ion size range of 3-10. It is found that water clusters do fragment significantly upon electron impact: the deduced neutral precursor size is ˜3-5 times larger than the observed cluster ions. This conclusion agrees with beam size characterization by another experimental technique: photoionization after Na-doping. Abundant post-ionization fragmentation of water clusters must therefore be an important factor in the interpretation of experimental data; interestingly, there is at present no detailed microscopic understanding of the underlying fragmentation dynamics.

  5. Embedded Fragments Registry (EFR)

    Data.gov (United States)

    Department of Veterans Affairs — In 2009, the Department of Defense estimated that approximately 40,000 service members who served in OEF/OIF may have embedded fragment wounds as the result of small...

  6. DNA fragmentation in spermatozoa

    DEFF Research Database (Denmark)

    Rex, A S; Aagaard, J.; Fedder, J

    2017-01-01

    Sperm DNA fragmentation has been extensively studied for more than a decade. In the 1940s the uniqueness of the spermatozoa protein complex which stabilizes the DNA was discovered. In the fifties and sixties, the association between unstable chromatin structure and subfertility was investigated. In the seventies, the impact of induced DNA damage was investigated. In the 1980s the concept of sperm DNA fragmentation as related to infertility was introduced, as well as the first DNA fragmentation test: the Sperm Chromatin Structure Assay (SCSA). The terminal deoxynucleotidyl transferase nick end labelling (TUNEL) test, followed by others, was introduced in the nineties. The association between DNA fragmentation in spermatozoa and pregnancy loss has been extensively investigated, spurring the need for a therapeutic tool for these patients. This gave rise to an increased interest in the aetiology of DNA damage...

  7. Energy efficiency of consecutive fragmentation processes

    CERN Document Server

    Fontbona, Joaquin; Martinez, Servet

    2010-01-01

    We present a first study on the energy required to reduce a unit mass fragment by consecutively using several devices, as it happens in the mining industry. Two devices are considered, which we represent as different stochastic fragmentation processes. Following the self-similar energy model introduced by Bertoin and Martinez, we compute the average energy required to attain a size x with this two-device procedure. We then asymptotically compare, as x goes to 0 or 1, its energy requirement with that of individual fragmentation processes. In particular, we show that for certain range of parameters of the fragmentation processes and of their energy cost-functions, the consecutive use of two devices can be asymptotically more efficient than using each of them separately, or conversely.

  8. Analytical Predictions of Fragment Penetration through Hollow Concrete Masonry Units

    Directory of Open Access Journals (Sweden)

    David Bogosian

    2008-01-01

    Modeling steel casing fragment impacts on hollow CMU poses some problems, since the fragments will typically penetrate through the front face and may also penetrate the back face. Techniques are needed for predicting (a) the size of the hole created by the penetration, (b) the size of the annular region of damaged concrete around the hole, and (c) the residual velocity of the fragment. A series of calculations using the AUTODYN code were performed to investigate the accuracy and reliability of the model. The model uses the smooth particle hydrodynamics (SPH) approach to represent the CMU. A variety of steel fragment sizes were projected at a layer of CMU, and the resulting hole size, damage, and fragment residual velocity were tabulated. Results were validated against existing empirical relationships to ensure the model's applicability, while additional analyses demonstrated trends and parametric sensitivity.

  9. The Role of Pebble Fragmentation in Planetesimal Formation. II. Numerical Simulations

    Science.gov (United States)

    Wahlberg Jansson, Karl; Johansen, Anders; Bukhari Syed, Mohtashim; Blum, Jürgen

    2017-01-01

    Some scenarios for planetesimal formation go through a phase of collapse of gravitationally bound clouds of millimeter- to centimeter-size pebbles. Such clouds can form, for example, through the streaming instability in protoplanetary disks. We model the collapse process with a statistical model to obtain the internal structure of planetesimals with solid radii between 10 and 1000 km. During the collapse, pebbles collide, and depending on their relative speeds, collisions have different outcomes. A mixture of particle sizes inside a planetesimal leads to better packing capabilities and higher densities. In this paper we apply results from new laboratory experiments of dust aggregate collisions (presented in a companion paper) to model collision outcomes. We find that the internal structure of a planetesimal is strongly dependent on both its mass and the applied fragmentation model. Low-mass planetesimals have no/few fragmenting pebble collisions in the collapse phase and end up as porous pebble piles. The number of fragmenting collisions increases with increasing cloud mass, resulting in wider particle size distributions and higher density. The collapse is nevertheless “cold” in the sense that collision speeds are damped by the high collision frequency. This ensures that a significant fraction of large pebbles survive the collapse in all but the most massive clouds. Our results are in broad agreement with the observed increase in density of Kuiper Belt objects with increasing size, as exemplified by the recent characterization of the highly porous comet 67P/Churyumov–Gerasimenko.

  10. Fragment Length of Circulating Tumor DNA.

    Directory of Open Access Journals (Sweden)

    Hunter R Underhill

    2016-07-01

    Malignant tumors shed DNA into the circulation. The transient half-life of circulating tumor DNA (ctDNA) may afford the opportunity to diagnose, monitor recurrence, and evaluate response to therapy solely through a non-invasive blood draw. However, detecting ctDNA against the normally occurring background of cell-free DNA derived from healthy cells has proven challenging, particularly in non-metastatic solid tumors. In this study, distinct differences in fragment length size between ctDNAs and normal cell-free DNA are defined. Human ctDNA in rat plasma derived from human glioblastoma multiforme stem-like cells in the rat brain and human hepatocellular carcinoma in the rat flank were found to have a shorter principal fragment length than the background rat cell-free DNA (134-144 bp vs. 167 bp, respectively). Subsequently, a similar shift in the fragment length of ctDNA in humans with melanoma and lung cancer was identified compared to healthy controls. Comparison of fragment lengths from cell-free DNA between a melanoma patient and healthy controls found that the BRAF V600E mutant allele occurred more commonly at a shorter fragment length than the fragment length of the wild-type allele (132-145 bp vs. 165 bp, respectively). Moreover, size-selecting for shorter cell-free DNA fragment lengths substantially increased the EGFR T790M mutant allele frequency in human lung cancer. These findings provide compelling evidence that experimental or bioinformatic isolation of a specific subset of fragment lengths from cell-free DNA may improve detection of ctDNA.

  11. Fragment Length of Circulating Tumor DNA.

    Science.gov (United States)

    Underhill, Hunter R; Kitzman, Jacob O; Hellwig, Sabine; Welker, Noah C; Daza, Riza; Baker, Daniel N; Gligorich, Keith M; Rostomily, Robert C; Bronner, Mary P; Shendure, Jay

    2016-07-01

    Malignant tumors shed DNA into the circulation. The transient half-life of circulating tumor DNA (ctDNA) may afford the opportunity to diagnose, monitor recurrence, and evaluate response to therapy solely through a non-invasive blood draw. However, detecting ctDNA against the normally occurring background of cell-free DNA derived from healthy cells has proven challenging, particularly in non-metastatic solid tumors. In this study, distinct differences in fragment length size between ctDNAs and normal cell-free DNA are defined. Human ctDNA in rat plasma derived from human glioblastoma multiforme stem-like cells in the rat brain and human hepatocellular carcinoma in the rat flank were found to have a shorter principal fragment length than the background rat cell-free DNA (134-144 bp vs. 167 bp, respectively). Subsequently, a similar shift in the fragment length of ctDNA in humans with melanoma and lung cancer was identified compared to healthy controls. Comparison of fragment lengths from cell-free DNA between a melanoma patient and healthy controls found that the BRAF V600E mutant allele occurred more commonly at a shorter fragment length than the fragment length of the wild-type allele (132-145 bp vs. 165 bp, respectively). Moreover, size-selecting for shorter cell-free DNA fragment lengths substantially increased the EGFR T790M mutant allele frequency in human lung cancer. These findings provide compelling evidence that experimental or bioinformatic isolation of a specific subset of fragment lengths from cell-free DNA may improve detection of ctDNA.
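
    A minimal sketch of the in silico size selection suggested by these findings is shown below. It assumes per-fragment lengths and mutant/wild-type calls are already available (e.g., from paired-end alignments), and it uses an arbitrary 90-150 bp window and synthetic data; neither is the protocol of the study.

        import random

        def mutant_allele_fraction(fragments):
            """fragments: list of (length_bp, is_mutant) tuples."""
            if not fragments:
                return 0.0
            return sum(1 for _, is_mutant in fragments if is_mutant) / len(fragments)

        def size_select(fragments, lo=90, hi=150):
            """Keep only fragments whose length falls within [lo, hi] bp."""
            return [f for f in fragments if lo <= f[0] <= hi]

        if __name__ == "__main__":
            # Toy data: mutant ctDNA fragments centred near 140 bp, wild type near 167 bp.
            rng = random.Random(1)
            reads = ([(round(rng.gauss(140, 8)), True) for _ in range(50)] +
                     [(round(rng.gauss(167, 8)), False) for _ in range(5000)])
            print(f"mutant fraction before selection: {mutant_allele_fraction(reads):.4f}")
            print(f"mutant fraction after  selection: {mutant_allele_fraction(size_select(reads)):.4f}")

    In this toy example the window enriches the mutant fraction simply because most wild-type fragments fall outside it.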

  12. Observations of Titan IIIC Transtage Fragmentation Debris

    Science.gov (United States)

    Cowardin, Heather; Seitzer, P.; Abercromby, K.; Barker, E.; Buckalew, B.; Cardona, T.; Krisko, P.; Lederer, S.

    2013-01-01

    The fragmentation of a Titan IIIC Transtage (1968-081) on 21 February 1992 is one of only two known break-ups in or near geosynchronous orbit. The original rocket body and 24 pieces of debris are currently being tracked by the U. S. Space Surveillance Network (SSN). The rocket body (SSN# 3432) and several of the original fragments (SSN# 25000, 25001, 30000, and 33511) were observed in survey mode during 2004-2010 using the 0.6-m Michigan Orbital DEbris Survey Telescope (MODEST) in Chile using a broad R filter. This paper presents a size distribution for all calibrated magnitude data acquired on MODEST. Size distribution plots are also shown using historical models for small fragmentation debris (down to 10 cm) thought to be associated with the Titan Transtage break-up. In November 2010, visible broadband photometry (Johnson/Kron-Cousins BVRI) was acquired with the 0.9-m Small and Moderate Aperture Research Telescope System (SMARTS) at the Cerro Tololo Inter-American Observatory (CTIO) in Chile on several Titan fragments (SSN 25001, 33509, and 33510) and the parent rocket body (SSN 3432). Color index data are used to determine the fragment brightness distribution and how the data compares to spacecraft materials measured in the laboratory using similar photometric measurement techniques. In order to better characterize the break-up fragments, spectral measurements were acquired on three Titan fragments (one fragment observed over two different time periods) using the 6.5-m Magellan telescopes at Las Campanas Observatory in Chile. The telescopic spectra of SSN 25000 (May 2012 and January 2013), SSN 38690, and SSN 38699 are compared with laboratory acquired spectra of materials (e.g., aluminum and various paints) to determine the surface material.

  13. Determination of the archaeological origin of ceramic fragments characterized by neutron activation analysis, by means of the application of multivariable statistical analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Rodriguez G, N. L. [Instituto Mexiquense de Cultura, Subdireccion de Restauracion y Conservacion, Hidalgo poniente No. 1013, 50080 Toluca, Estado de Mexico (Mexico)

    2009-07-01

    The elementary composition of archaeological ceramic fragments obtained during the explorations in San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. The samples were irradiated in the research reactor TRIGA Mark III with a neutron flux of 1x10^13 n cm^-2 s^-1. The irradiation time was 2 hours. Prior to acquisition of the gamma-ray spectra, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal component analysis, allowed three different origins of the archaeological ceramics to be identified, designated as local, foreign, and regional. (Author)
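
    A sketch of the kind of multivariate treatment described (cluster analysis followed by principal component analysis of elemental concentrations) is given below. It assumes scikit-learn and SciPy are available and uses randomly generated placeholder concentrations rather than the reported measurements.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        # Rows = ceramic fragments, columns = element concentrations (placeholder data).
        rng = np.random.default_rng(0)
        concentrations = np.vstack([
            rng.normal(10.0, 1.0, size=(8, 15)),   # hypothetical "local" group
            rng.normal(13.0, 1.0, size=(8, 15)),   # hypothetical "regional" group
            rng.normal(18.0, 1.0, size=(8, 15)),   # hypothetical "foreign" group
        ])

        # Log-transform and standardise, a common pre-treatment for provenance data.
        X = np.log10(concentrations)
        X = (X - X.mean(axis=0)) / X.std(axis=0)

        scores = PCA(n_components=2).fit_transform(X)               # principal components
        groups = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
        print("cluster assignment per fragment:", groups)
        print("first two principal-component scores of fragment 0:", scores[0])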

  14. Biological and environmental conditionings for a sperm DNA fragmentation.

    Science.gov (United States)

    Bojar, Iwona; Witczak, Mariusz; Wdowiak, Artur

    2013-01-01

    The objective of the presented study was to determine the effect of selected factors on sperm DNA fragmentation: superoxide dismutase activity in seminal plasma, patient age, and tobacco smoking. An attempt was also made to evaluate the effect of DNA fragmentation on the effectiveness of infertility treatment. The study covered 186 men who received treatment due to infertility. The database and statistical analyses were performed using the computer software STATISTICA 7.1. A relationship was observed between sperm DNA fragmentation and superoxide dismutase activity: the higher the SOD activity, the lower the percentage of sperm fragmentation (rs=-0.324; P=0.000; r=-0.2110). A statistical relationship was found between sperm DNA fragmentation and the percentage of pregnancies obtained during the first year of treatment: patients with lower DFI more frequently became fathers during the first year of trying, compared to the remainder (t=2.51; P=0.013). A statistically significant relationship was also confirmed (rs=-0.370; P=0.000), consisting of an increase in the DFI with respondents' age. No significant differences were noted between the DFI and the tobacco smoking habit (Chi2=0.29; P=0.926). The percentage of sperm DNA fragmentation was inversely proportional to superoxide dismutase activity in seminal plasma. DNA fragmentation becomes intensified with patients' age. Cigarette smoking has no effect on sperm DNA fragmentation. DNA fragmentation exerts an effect on the effectiveness of infertility treatment.

  15. Fragmentation characteristics analysis of sandstone fragments based on impact rockburst test

    Directory of Open Access Journals (Sweden)

    Dongqiao Liu

    2014-06-01

    Impact rockburst test on sandstone samples with a central hole is carried out under true triaxial static loads and vertical dynamic load conditions, and rock fragments after the test are collected. The fragments of sandstone generated from strain rockburst test and uniaxial compression test are also collected. The fragments are weighed, and the length, width and thickness of each fragment are measured. The fragment quantities with coarse, medium, fine and micro grains in different size ranges, and the mass and particle distributions, are also analyzed. Then, the fractal dimension of the fragments is calculated by the methods of size-frequency, mass-frequency and length-to-thickness ratio-frequency. It is found that the crushing degree of impact rockburst fragments is higher, with observably blocky characteristics. The mass percentage of small grains, including fine and micro grains, in the impact rockburst test is higher than in the strain rockburst test and the uniaxial compression test. Energy dissipation in the rockburst tests is greater than in the uniaxial compression test, as is the quantity of micro grains generated.
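
    As an illustration of the size-frequency method mentioned above, the fractal dimension D can be estimated from the slope of the cumulative size-frequency relation N(>r) ~ r^(-D) on a log-log plot. The Python sketch below uses synthetic fragment sizes; it is not the authors' data or code.

        import numpy as np

        def fractal_dimension(sizes):
            """Estimate D from the cumulative relation N(>=r) ~ r**(-D)."""
            sizes = np.sort(np.asarray(sizes, dtype=float))
            n_at_least = np.arange(len(sizes), 0, -1)      # count of fragments >= each size
            slope, _ = np.polyfit(np.log(sizes), np.log(n_at_least), 1)
            return -slope

        if __name__ == "__main__":
            # Synthetic sizes drawn from a power law with true exponent 2.5.
            rng = np.random.default_rng(42)
            sizes = 1.0 + rng.pareto(2.5, size=5000)
            print(f"estimated fractal dimension D ~ {fractal_dimension(sizes):.2f}")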

  16. Fragmentation characteristics analysis of sandstone fragments based on impact rockburst test

    Institute of Scientific and Technical Information of China (English)

    Dongqiao Liu; Dejian Li; Fei Zhao; Chengchao Wang

    2014-01-01

    An impact rockburst test on sandstone samples with a central hole is carried out under true triaxial static loads and vertical dynamic load conditions, and the rock fragments after the test are collected. The fragments of sandstone generated from a strain rockburst test and a uniaxial compression test are also collected. The fragments are weighed, and the length, width and thickness of each fragment are measured. The fragment quantities with coarse, medium, fine and micro grains in different size ranges, and the mass and particle distributions, are also analyzed. Then, the fractal dimension of the fragments is calculated by the methods of size-frequency, mass-frequency and length-to-thickness ratio-frequency. It is found that the crushing degree of impact rockburst fragments is higher, with observably blocky characteristics. The mass percentage of small grains, including fine and micro grains, in the impact rockburst test is higher than in the strain rockburst test and the uniaxial compression test. Energy dissipation in the rockburst tests is greater than in the uniaxial compression test, as is the quantity of micro grains generated.

  17. DNA fragmentation status in patients with necrozoospermia.

    Science.gov (United States)

    Brahem, Sonia; Jellad, Sonia; Ibala, Samira; Saad, Ali; Mehdi, Meriem

    2012-12-01

    The aim of this study was to determine if a relationship exists between the levels of sperm DNA fragmentation and necrospermia in infertile men. Semen samples obtained from 70 men consulting for infertility evaluation were analyzed according to World Health Organization (WHO) guidelines. Patients were subdivided into three groups according to the percentage of necrotic spermatozoa: normozoospermia, moderate necrozoospermia, and severe necrozoospermia (>80%; n = 20). DNA fragmentation was detected by the terminal deoxynucleotidyl transferase-mediated deoxyuridine triphosphate biotin nick-end labeling (TUNEL) assay. The sperm DNA fragmentation index (DFI) was 9.28 ± 2.98% in patients with a normal level of necrotic spermatozoa, 20.25 ± 3.21% in patients with moderate necrozoospermia, and 35.31 ± 5.25% in patients with severe necrozoospermia. There was a statistically significant increase of DNA fragmentation in the necrozoospermic groups (P < 0.05), indicating an association between necrozoospermia and sperm DNA fragmentation. We concluded that patients with necrozoospermia showed a high level of DNA fragmentation compared to normozoospermic men. Severe necrozoospermia (>80%) is a predictive factor for increased sperm DNA damage.

  18. Fragmentation and the Nuclear Equation of State

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Wolfgang [National Superconducting Cyclotron Laboratory and Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824-2320 (United States)]. E-mail: bauer@pa.msu.edu

    2007-05-01

    Progress on the determination of the order of the fragmentation phase transition, the location of its critical point in the nuclear matter phase diagram, the values of the critical exponents that determine the universality class of the transition, and finite size scaling effects is discussed. Evidence for the presence of Zipf-Mandelbrot scaling in the relative size of the largest clusters is examined, and the connection to the value of the critical exponent τ is established.
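
    A check of Zipf-type scaling of the largest clusters can be sketched as follows: the rank-ordered fragment sizes are fit to s_r ~ r^(-lambda) by log-log regression (the Zipf-Mandelbrot form adds a rank offset, s_r ~ (r + q)^(-lambda), and would need a nonlinear fit). The event data below are synthetic, purely to exercise the fit; this is not the analysis performed in the paper.

        import numpy as np

        def zipf_exponent(fragment_sizes):
            """Fit s_r ~ r**(-lam) to rank-ordered fragment sizes; returns lam."""
            s = np.sort(np.asarray(fragment_sizes, dtype=float))[::-1]   # descending
            ranks = np.arange(1, len(s) + 1)
            slope, _ = np.polyfit(np.log(ranks), np.log(s), 1)
            return -slope

        if __name__ == "__main__":
            rng = np.random.default_rng(3)
            # One synthetic "event" whose fragment sizes roughly follow rank**-1 scaling.
            event = [100.0 / r * rng.uniform(0.8, 1.2) for r in range(1, 21)]
            print(f"fitted Zipf exponent ~ {zipf_exponent(event):.2f}")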

  19. The role of planetesimal fragmentation on giant planet formation

    CERN Document Server

    Guilera, O M; Brunini, A; Santamaría, P J

    2014-01-01

    In the standard scenario of planet formation, terrestrial planets and the cores of the giant planets are formed by accretion of planetesimals. As planetary embryos grow, the planetesimal velocity dispersion increases owing to gravitational excitation by the embryos. The increase of planetesimal relative velocities causes their fragmentation through mutual collisions. We study the role of planetesimal fragmentation on giant planet formation. We analyze how planetesimal fragmentation modifies the growth of giant planets' cores for a wide range of planetesimal sizes and disk masses. We incorporate a model of planetesimal fragmentation into our model of in situ giant planet formation. We calculate the evolution of the solid surface density (planetesimals plus fragments) due to the accretion by the planet, migration and fragmentation. The incorporation of planetesimal fragmentation significantly modifies the process of planetary formation. If most of the mass loss in planetesimal collisions is distributed ...

  20. Fragmentation of weak non-ablating objects during entry

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.

    1997-04-01

    Fragmentation of objects during entry can be treated with developed models of breakup, fragment cascade, separation, and deceleration. The results reduce to those derived earlier for strong objects. Model predictions are in agreement with the key features of numerical simulations. Model equations can be inverted analytically to infer object size, entry speed, and strength from measurements of peak power and altitude and fragmentation altitude or time.

  1. Short read DNA fragment anchoring algorithm.

    Science.gov (United States)

    Wang, Wendi; Zhang, Peiheng; Liu, Xinchun

    2009-01-30

    Emerging next-generation sequencing methods based on PCR technology boost genome sequencing speed considerably while also reducing cost, and they have been utilized to address a broad range of bioinformatics problems. Limited by the reliable read length of next-generation sequencing technologies, studies are generally confined to gene fragments of 30-50 bp, relatively shorter than traditional gene fragment lengths. Anchoring gene fragments in a long reference sequence is an essential prerequisite step for further assembly and analysis work. Due to the sheer number of fragments produced by next-generation sequencing technologies and the huge size of reference sequences, anchoring rapidly becomes a computational bottleneck. We compared algorithm efficiency for BLAT, SOAP and EMBF, where efficiency is defined as the total number of output results divided by the time consumed to retrieve them. The data show that our algorithm EMBF has a 3-4 times efficiency advantage over SOAP, and at least 150 times over BLAT. Moreover, when the reference sequence size is increased, the efficiency of SOAP degrades by as much as 30%, while EMBF shows a favorable increasing trend. In conclusion, we deem EMBF more suitable for the short-fragment anchoring problem, where result completeness and accuracy are predominant and the reference sequences are relatively large.
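
    The anchoring task itself can be illustrated with a generic exact-match seed index; the sketch below is not the EMBF algorithm, only a baseline that hashes fixed-length substrings of the reference and reports the efficiency measure used in the abstract (results per unit time). The reference and reads are randomly generated toy data.

        import random
        import time
        from collections import defaultdict

        def build_index(reference, k=12):
            """Hash every k-mer of the reference to its start positions."""
            index = defaultdict(list)
            for i in range(len(reference) - k + 1):
                index[reference[i:i + k]].append(i)
            return index

        def anchor(read, reference, index, k=12):
            """Positions where the read matches the reference exactly, seeded by its first k-mer."""
            return [pos for pos in index.get(read[:k], ())
                    if reference[pos:pos + len(read)] == read]

        if __name__ == "__main__":
            rng = random.Random(0)
            reference = "".join(rng.choice("ACGT") for _ in range(100_000))   # toy reference
            reads = [reference[i:i + 36] for i in range(0, 50_000, 500)]      # 100 short reads
            index = build_index(reference)
            start = time.perf_counter()
            hits = sum(len(anchor(read, reference, index)) for read in reads)
            elapsed = time.perf_counter() - start
            print(f"{hits} anchor hits, efficiency ~ {hits / elapsed:.0f} results per second")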

  2. Forest fragmentation and bird community dynamics: inference at regional scales

    Science.gov (United States)

    Boulinier, T.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Flather, C.H.; Pollock, K.H.

    2001-01-01

    With increasing fragmentation of natural areas and a dramatic reduction of forest cover in several parts of the world, quantifying the impact of such changes on species richness and community dynamics has been a subject of much concern. Here, we tested whether in more fragmented landscapes there was a lower number of area-sensitive species and higher local extinction and turnover rates, which could explain higher temporal variability in species richness. To investigate such potential landscape effects at a regional scale, we merged two independent, large-scale monitoring efforts: the North American Breeding Bird Survey (BBS) and the Land Use and Land Cover Classification data from the U.S. Geological Survey. We used methods that accounted for heterogeneity in the probability of detecting species to estimate species richness and temporal changes in the bird communities for BBS routes in three mid-Atlantic U.S. states. Forest breeding bird species were grouped prior to the analyses into area-sensitive and non-area-sensitive species according to previous studies. We tested predictions relating measures of forest structure at one point in time (1974) to species richness at that time and to parameters of forest bird community change over the following 22-yr-period (1975-1996). We used the mean size of forest patches to characterize landscape structure, as high correlations among landscape variables did not allow us to disentangle the relative roles of habitat fragmentation per se and habitat loss. As predicted, together with lower species richness for area-sensitive species on routes surrounded by landscapes with lower mean forest-patch size, we found higher mean year-to-year rates of local extinction. Moreover, the mean year-to-year rates of local turnover (proportion of locally new species) for area-sensitive species were also higher in landscapes with lower mean forest-patch size. These associations were not observed for the non-area-sensitive species group. These

  3. Fragments of Time

    DEFF Research Database (Denmark)

    Christiansen, Steen Ledet

    Time travel films necessarily fragment linear narratives, as scenes are revisited with differences from the first time we saw them. Popular films such as Back to the Future mine comedy from these visitations, but there are many different approaches. One extreme is Chris Marker's La Jetée - a film... made almost completely of still images, recounting the end of the world. These stills can be viewed as fragments that have survived the end of the world and now provide the only access to the events that occurred. Shane Carruth's Primer has a different approach to time travel, the narrative diegesis...

  4. The Serendipity of Fragmentation

    DEFF Research Database (Denmark)

    Leixnering, Stephan; Meyer, Renate E.

    , it was the central government’s task to coordinate, steer and control the newly emerged decentralized organizations. This raises questions about the overall design of the public sector at present. Our paper engages with the prevalent public governance phenomenon of fragmentation from a design perspective in order...... form of organizing between networks and formal organization: lacking a single center and featuring multiplex and multifaceted relations within the politico-administrative apparatus and between government and PSOs, high fragmentation, local and robust action, but latent structures of significant formal...

  5. IMPACT fragmentation model developments

    Science.gov (United States)

    Sorge, Marlon E.; Mains, Deanna L.

    2016-09-01

    The IMPACT fragmentation model has been used by The Aerospace Corporation for more than 25 years to analyze orbital altitude explosions and hypervelocity collisions. The model is semi-empirical, combining mass, energy and momentum conservation laws with empirically derived relationships for fragment characteristics such as number, mass, area-to-mass ratio, and spreading velocity as well as event energy distribution. Model results are used for several types of analysis including assessment of short-term risks to satellites from orbital altitude fragmentations, prediction of the long-term evolution of the orbital debris environment and forensic assessments of breakup events. A new version of IMPACT, version 6, has been completed and incorporates a number of advancements enabled by a multi-year long effort to characterize more than 11,000 debris fragments from more than three dozen historical on-orbit breakup events. These events involved a wide range of causes, energies, and fragmenting objects. Special focus was placed on the explosion model, as the majority of events examined were explosions. Revisions were made to the mass distribution used for explosion events, increasing the number of smaller fragments generated. The algorithm for modeling upper stage large fragment generation was updated. A momentum conserving asymmetric spreading velocity distribution algorithm was implemented to better represent sub-catastrophic events. An approach was developed for modeling sub-catastrophic explosions, those where the majority of the parent object remains intact, based on estimated event energy. Finally, significant modifications were made to the area-to-mass ratio distribution to incorporate the tendencies of different materials to fragment into different shapes. This ability enabled better matches between the observed area-to-mass ratios and those generated by the model. It also opened up additional possibilities for post-event analysis of breakups. The paper will discuss

  6. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  7. Species traits influence the genetic consequences of river fragmentation on two co-occurring redhorse (Moxostoma) species

    Science.gov (United States)

    Reid, S.M.; Wilson, C.C.; Carl, L.M.; Zorn, T.G.

    2008-01-01

    We used microsatellite DNA markers to test whether fragmentation of the Trent River (Ontario, Canada) has reduced genetic diversity and increased genetic differentiation among populations of river redhorse (Moxostoma carinatum) and shorthead redhorse (Moxostoma macrolepidotum). Allelic richness of both species was significantly greater along the free-flowing Muskegon River (Michigan, USA) than along the fragmented Trent River. Contrary to expectations, there was no evidence of a fragment length effect on genetic diversity, recent population bottlenecks, or increased relatedness among individuals in fragmented populations. High levels of linkage disequilibrium indicate extinction-recolonization population dynamics along the Trent River. For both species, pairwise FST tests identified weak but statistically significant population differentiation. In the Trent River, differentiation was significantly greater for river redhorse than for shorthead redhorse and, for both species, greater than in the Muskegon River. Moderate fragmentation effects likely reflect the permeability of the dam-lock system to redhorse movement. Differences between species indicate that as a result of smaller effective population sizes, habitat specialists and species at the periphery of their geographic range are more sensitive to river fragmentation. ?? 2008 NRC.

  8. Algebraic Statistics

    OpenAIRE

    Norén, Patrik

    2013-01-01

    Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges and therefore new algebraic results need often be developed. This way of interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...

  9. Quantum fluctuation effects on nuclear fragment and atomic cluster formation

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Akira [Hokkaido Univ., Sapporo (Japan). Dept. of Physics; Randrup, J.

    1997-05-01

    We investigate nuclear fragmentation and atomic cluster formation by means of the recently proposed quantal Langevin treatment. It is shown that the effect of the quantal fluctuations acts in opposite directions on the nuclear fragment and atomic cluster size distributions. This tendency is understood through the effective classical temperature for the observables. (author)

  10. Fractal Fragmentation triggered by meteor impact: The Ries Crater (Germany)

    Science.gov (United States)

    Paredes Marino, Joali; Perugini, Diego; Rossi, Stefano; Kueppers, Ulrich

    2015-04-01

    The Nördlinger Ries is a large circular depression in western Bavaria, Germany. The depression was caused by a meteor impact, which occurred about 14.3-14.5 million years ago. The original crater rim had an estimated diameter of 24 kilometers. Computer modeling of the impact event indicates that the impactor probably had a diameter of about 1.5 kilometers and impacted the target area at an angle of around 30 to 50 degrees from the surface in a west-southwest to east-northeast direction. The impact velocity is thought to have been about 20 km/s. The meteor impact generated extensive fragmentation of preexisting rocks. In addition, melting of these rocks also occurred. The impact melt was ejected at high speed, provoking its extensive fragmentation. Quenched melt fragments are ubiquitous in the outcrops. Here we study melt fragment size distributions with the aim of understanding the style of melt fragmentation during ejection and to constrain the rheological properties of such melts. Digital images of suevite (i.e. the rock generated after deposition and diagenesis of ash and fragments produced by the meteor impact) were obtained using a high-resolution optical scanner. Subsequently, melt fragments were traced by image analysis and the images were segmented to obtain binary images in which impact melt fragments appear in black, embedded in a white background. The size of the fragments was then determined by image analysis. Fractal fragmentation theory has been applied to the fragment size distributions of melt fragments in the Ries crater. Results indicate that melt fragments follow fractal distributions, indicating that fragmentation of melt generated by the

  11. Intermediate fragmentation per se provides stable predator-prey metapopulation dynamics.

    Science.gov (United States)

    Cooper, Jennifer K; Li, Jiqiu; Montagnes, David J S

    2012-08-01

    The extent to which a landscape is fragmented affects persistence of predator-prey dynamics. Increasing fragmentation concomitantly imposes conditions that stabilise and destabilise metapopulations. For the first time, we explicitly assessed the hypothesis that intermediate levels provide optimal conditions for stability. We examine four structural changes arising from increased fragmentation: increased fragment number; decreased fragment size; increased connectedness (corridors scaled to fragment); increased fragment heterogeneity (based on connectedness). Using the model predator-prey system (Didinium-Paramecium) we support our hypothesis, by examining replicated metapopulations dynamics at five fragmentation levels. Although both species became extinct without fragmentation, prey survived at low and high levels, and both survived at intermediate levels. By examining time to extinction, maximum abundances, and population asynchrony we conclude that fragmentation produces structural heterogeneity (independent of environmental heterogeneity), which influences stability. Our analysis suggests why some theoretical, field and microcosm studies present conflicting views of fragmentation effects on population persistence.

  12. Fragmented Work Stories

    DEFF Research Database (Denmark)

    Humle, Didde Maria; Reff Pedersen, Anne

    2015-01-01

    by exploring how different types of fragmentation create meanings. This is done by studying the work stories of job and personnel consultants and by drawing on the results of a narrative, ethnographic study of a consultancy. The analysis demonstrates how work stories are social practices negotiated, retold...

  13. Picking Up (On) Fragments

    NARCIS (Netherlands)

    Ellis, Phil

    2015-01-01

    This article discusses the implications for archival and media archaeological research and reenactment artwork relating to a recent arts practice project: reenacttv: 30 lines / 60 seconds. It proposes that archival material is unstable but has traces and fragments that are full of creative p

  14. Fragments of the Past

    Directory of Open Access Journals (Sweden)

    Peter Szende

    2016-10-01

    With travel being made more accessible throughout the decades, the hospitality industry has constantly evolved its practices as society and technology progressed. Hotels looked for new ways to serve their customers, which led to the invention of the Servidor in 1918. Once-revolutionary innovations have gone extinct, merely becoming fragments of the past.

  15. Cryobiology of coral fragments.

    Science.gov (United States)

    Hagedorn, Mary; Farrell, Ann; Carter, Virginia L

    2013-02-01

    Around the world, coral reefs are dying due to human influences, and saving habitat alone may not stop this destruction. This investigation focused on the biological processes that will provide the first steps in understanding the cryobiology of whole coral fragments. Coral fragments are a partnership of coral tissue and endosymbiotic algae, Symbiodinium sp., commonly called zooxanthellae. These data reflected their separate sensitivities to chilling and a cryoprotectant (dimethyl sulfoxide) for the coral Pocillopora damicornis, as measured by tissue loss and Pulse Amplitude Modulated fluorometry 3 weeks post-treatment. Five cryoprotectant treatments maintained the viability of the coral tissue and zooxanthellae at control values (1 M dimethyl sulfoxide at 1.0, 1.5 and 2.0 h exposures, and 1.5 M dimethyl sulfoxide at 1.0 and 1.5 h exposures, P>0.05, ANOVA), whereas 2 M concentrations did not (P<0.05), an effect observed in the coral tissue but not in the zooxanthellae. During the winter when the fragments were chilled, the coral tissue remained relatively intact (∼25% loss) post-treatment, but the zooxanthellae numbers in the tissue declined after 5 min of chilling (P<0.05). In contrast, losses were greater in the coral tissue (∼75% loss), and zooxanthellae numbers declined in response to chilling alone (P<0.05). Cryoprotectant treatment protected the coral against tissue loss after 45 min of cryoprotectant exposure (P>0.05, ANOVA), but it did not protect against the loss of zooxanthellae (P<0.05). These data reveal the differing sensitivities within the coral fragment complex, and future cryopreservation protocols must be guided by their greater sensitivity. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Wildlife habitat fragmentation.

    Science.gov (United States)

    John. Lehmkuhl

    2005-01-01

    A primary issue in forest wildlife management is habitat fragmentation and its effects on viability, which is the "bottom line" for plant and animal species of conservation concern. Population viability is the likelihood that a population will be able to maintain itself (remain viable) over a long period of time-usually 100 years or more. Though it is true...

  18. New information on photon fragmentation functions

    Science.gov (United States)

    Klasen, Michael; König, Florian

    2014-08-01

    Thermal photons radiated in heavy-ion collisions represent an important signal for a recently discovered new state of matter, the deconfined quark-gluon plasma. However, a clean identification of this signal requires precise knowledge of the prompt photons produced simultaneously in hard collisions of quarks and gluons, mostly through their fragmentation. In this paper, we demonstrate that PHENIX data on photons produced in proton-proton collisions with low transverse momenta allow one to extract new information on this fragmentation process. While existing data do not yet convincingly favor one parameterization (BFG II) over the two other frequently used photon fragmentation functions (BFG I and GRV NLO), the data sets recorded by PHENIX and STAR at BNL RHIC in 2013 with tenfold higher statistics should allow for such an analysis.

  19. New information on photon fragmentation functions

    Energy Technology Data Exchange (ETDEWEB)

    Klasen, Michael; Koenig, Florian [Westfaelische Wilhelms-Universitaet Muenster, Institut fuer Theoretische Physik, Muenster (Germany)

    2014-08-15

    Thermal photons radiated in heavy-ion collisions represent an important signal for a recently discovered new state of matter, the deconfined quark-gluon plasma. However, a clean identification of this signal requires precise knowledge of the prompt photons produced simultaneously in hard collisions of quarks and gluons, mostly through their fragmentation. In this paper, we demonstrate that PHENIX data on photons produced in proton-proton collisions with low transverse momenta allow one to extract new information on this fragmentation process. While existing data do not yet convincingly favor one parameterization (BFG II) over the two other frequently used photon fragmentation functions (BFG I and GRV NLO), the data sets recorded by PHENIX and STAR at BNL RHIC in 2013 with tenfold higher statistics should allow for such an analysis. (orig.)

  20. Research on the photoelectric measuring method of warhead fragment velocity

    Science.gov (United States)

    Liu, Ji; Yu, Lixia; Zhang, Bin; Liu, Xiaoyan

    2016-09-01

    The velocity of a warhead fragment is the key criterion for determining its mutilation efficiency. However, owing to the small size, large quantity, irregular shape, high speed, arbitrary direction and large dispersion of warhead fragments, and the adverse test environment, measuring the fragment velocity parameter is very difficult. This paper describes an optoelectronic system designed to measure the average velocity of warhead fragments accurately. The apparatus includes two parallel laser screens spaced apart at a known fixed distance, providing the time measurement between start and stop signals. The large effective screen area is composed of a laser source, a retro-reflector and a large-area photodiode. Whenever a moving fragment interrupts the two optical screens, the system generates a target signal. Because of partial obscuration of the incident energy and the poor test conditions during the explosion, the fragment target signal is easily disturbed; fragment signal processing has therefore become a key technology of the system. The noise of the signal is reduced by wavelet decomposition and reconstruction, and the time at which a fragment passes through each screen is obtained by a peak-detection algorithm. By searching for peaks at different width scales and following the waveform trend with an optimal wavelet, the problem of rolling waveforms is solved. Numerous fragment experiments with different types of warheads were conducted. Experimental results show that the fragment capture rate of the system is better than 98%, and that the system can give the velocity of each fragment at densities of less than 20 fragments per m2.
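
    The underlying time-of-flight computation can be sketched as follows: the fragment velocity is the screen spacing divided by the interval between the two screen-interruption times. Simple moving-average smoothing and threshold crossing stand in for the wavelet denoising and peak detection described in the paper; the signals, spacing and threshold below are synthetic placeholders.

        import numpy as np

        def crossing_time(signal, t, threshold):
            """Time of the first sample exceeding the threshold (screen interruption)."""
            return t[np.argmax(signal > threshold)]

        def fragment_velocity(sig_start, sig_stop, t, spacing_m, threshold):
            """Average velocity between two laser screens a known distance apart."""
            return spacing_m / (crossing_time(sig_stop, t, threshold) -
                                crossing_time(sig_start, t, threshold))

        if __name__ == "__main__":
            # Synthetic noisy pulses: screen 1 interrupted at 1.0 ms, screen 2 (2 m away) at 2.0 ms.
            t = np.linspace(0.0, 4e-3, 4000)
            rng = np.random.default_rng(0)
            pulse = lambda t0: np.exp(-((t - t0) / 2e-5) ** 2)
            s1 = pulse(1.0e-3) + rng.normal(0.0, 0.02, t.size)
            s2 = pulse(2.0e-3) + rng.normal(0.0, 0.02, t.size)
            kernel = np.ones(15) / 15.0                      # moving-average smoothing
            s1, s2 = np.convolve(s1, kernel, "same"), np.convolve(s2, kernel, "same")
            v = fragment_velocity(s1, s2, t, spacing_m=2.0, threshold=0.5)
            print(f"measured fragment velocity ~ {v:.0f} m/s")   # expected ~2000 m/s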

  1. Usage Statistics: MedlinePlus

    Science.gov (United States)

    Quarterly usage statistics for the MedlinePlus website (https://medlineplus.gov/usestatistics.html), reporting page views and unique visitors by quarter since October-December 1998.

  2. Critical Features of Fragment Libraries for Protein Structure Prediction.

    Science.gov (United States)

    Trevizani, Raphael; Custódio, Fábio Lima; Dos Santos, Karina Baptista; Dardenne, Laurent Emmanuel

    2017-01-01

    The use of fragment libraries is a popular approach among protein structure prediction methods and has proven to substantially improve the quality of predicted structures. However, some vital aspects of a fragment library that influence the accuracy of modeling a native structure remain to be determined. This study investigates some of these features. Particularly, we analyze the effect of using secondary structure prediction guiding fragments selection, different fragments sizes and the effect of structural clustering of fragments within libraries. To have a clearer view of how these factors affect protein structure prediction, we isolated the process of model building by fragment assembly from some common limitations associated with prediction methods, e.g., imprecise energy functions and optimization algorithms, by employing an exact structure-based objective function under a greedy algorithm. Our results indicate that shorter fragments reproduce the native structure more accurately than the longer. Libraries composed of multiple fragment lengths generate even better structures, where longer fragments show to be more useful at the beginning of the simulations. The use of many different fragment sizes shows little improvement when compared to predictions carried out with libraries that comprise only three different fragment sizes. Models obtained from libraries built using only sequence similarity are, on average, better than those built with a secondary structure prediction bias. However, we found that the use of secondary structure prediction allows greater reduction of the search space, which is invaluable for prediction methods. The results of this study can be critical guidelines for the use of fragment libraries in protein structure prediction.

  3. Critical Features of Fragment Libraries for Protein Structure Prediction

    Science.gov (United States)

    dos Santos, Karina Baptista

    2017-01-01

    The use of fragment libraries is a popular approach among protein structure prediction methods and has proven to substantially improve the quality of predicted structures. However, some vital aspects of a fragment library that influence the accuracy of modeling a native structure remain to be determined. This study investigates some of these features. Particularly, we analyze the effect of using secondary structure prediction guiding fragments selection, different fragments sizes and the effect of structural clustering of fragments within libraries. To have a clearer view of how these factors affect protein structure prediction, we isolated the process of model building by fragment assembly from some common limitations associated with prediction methods, e.g., imprecise energy functions and optimization algorithms, by employing an exact structure-based objective function under a greedy algorithm. Our results indicate that shorter fragments reproduce the native structure more accurately than the longer. Libraries composed of multiple fragment lengths generate even better structures, where longer fragments show to be more useful at the beginning of the simulations. The use of many different fragment sizes shows little improvement when compared to predictions carried out with libraries that comprise only three different fragment sizes. Models obtained from libraries built using only sequence similarity are, on average, better than those built with a secondary structure prediction bias. However, we found that the use of secondary structure prediction allows greater reduction of the search space, which is invaluable for prediction methods. The results of this study can be critical guidelines for the use of fragment libraries in protein structure prediction. PMID:28085928

  4. Aerodynamic characteristics and respiratory deposition of fungal fragments

    Science.gov (United States)

    Cho, Seung-Hyun; Seo, Sung-Chul; Schmechel, Detlef; Grinshpun, Sergey A.; Reponen, Tiina

    The purpose of this study was to investigate the aerodynamic characteristics of fungal fragments and to estimate their respiratory deposition. Fragments and spores of three different fungal species (Aspergillus versicolor, Penicillium melinii, and Stachybotrys chartarum) were aerosolized by the fungal spore source strength tester (FSSST). An electrical low-pressure impactor (ELPI) measured the size distribution in real-time and collected the aerosolized fungal particles simultaneously onto 12 impactor stages in the size range of 0.3-10 μm, utilizing water-soluble ZEF-X10 coating of the impaction stages to prevent spore bounce. For S. chartarum, the average concentration of released fungal fragments was 380 particles cm^-3, which was about 514 times higher than that of spores. A. versicolor was found to release comparable amounts of spores and fragments. Microscopic analysis confirmed that S. chartarum and A. versicolor did not show any significant spore bounce, whereas the size distribution of P. melinii fragments was masked by spore bounce. Respiratory deposition was calculated using a computer-based model, LUDEP 2.07, for an adult male and a 3-month-old infant utilizing the database on the concentration and size distribution of S. chartarum and A. versicolor aerosols measured by the ELPI. Total deposition fractions for fragments and spores were 27-46% and 84-95%, respectively, showing slightly higher values in an infant than in an adult. For S. chartarum, fragments demonstrated 230-250-fold higher respiratory deposition than spores, while the numbers of deposited fragments and spores of A. versicolor were comparable. It was revealed that the deposition ratio (the number of deposited fragments divided by that of deposited spores) in the lower airways for an infant was 4-5 times higher than that for an adult. As fungal fragments have been shown to contain mycotoxins and antigens, further exposure assessment should include the measurement of fungal fragments for

  5. Sample size determination and power

    CERN Document Server

    Ryan, Thomas P, Jr

    2013-01-01

    THOMAS P. RYAN, PhD, teaches online advanced statistics courses for Northwestern University and The Institute for Statistics Education in sample size determination, design of experiments, engineering statistics, and regression analysis.

  6. Primary and secondary fragmentation of crystal-bearing intermediate magma

    Science.gov (United States)

    Jones, Thomas J.; McNamara, Keri; Eychenne, Julia; Rust, Alison C.; Cashman, Katharine V.; Scheu, Bettina; Edwards, Robyn

    2016-11-01

    Crystal-rich intermediate magmas are subjected to both primary and secondary fragmentation processes, each of which may produce texturally distinct tephra. Of particular interest for volcanic hazards is the extent to which each process contributes ash to volcanic plumes. One way to address this question is by fragmenting pyroclasts under controlled conditions. We fragmented pumice samples from Soufriere Hills Volcano (SHV), Montserrat, by three methods: rapid decompression in a shock tube-like apparatus, impact by a falling piston, and milling in a ball mill. Grain size distributions of the products reveal that all three mechanisms produce fractal breakage patterns, and that the fractal dimension increases from a minimum of 2.1 for decompression fragmentation (primary fragmentation) to a maximum of 2.7 by repeated impact (secondary fragmentation). To assess the details of the fragmentation process, we quantified the shape, texture and components of constituent ash particles. Ash shape analysis shows that the axial ratio increases during milling and that particle convexity increases with repeated impacts. We also quantify the extent to which the matrix is separated from the crystals, which shows that secondary processes efficiently remove adhering matrix from crystals, particularly during milling (abrasion). Furthermore, measurements of crystal size distributions before (using x-ray computed tomography) and after (by componentry of individual grain size classes) decompression-driven fragmentation show not only that crystals influence particular size fractions across the total grain size distribution, but also that free crystals are smaller in the fragmented material than in the original pumice clast. Taken together, our results confirm previous work showing both the control of initial texture on the primary fragmentation process and the contributions of secondary processes to ash formation. Critically, however, our extension of previous analyses to characterisation

  7. Bayesian statistics

    OpenAIRE

    新家, 健精

    2013-01-01

    © 2012 Springer Science+Business Media, LLC. All rights reserved. Article outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography.

  8. Fragmentation of colliding planetesimals with water content

    CERN Document Server

    Maindl, Thomas I; Schäfer, Christoph; Speith, Roland

    2014-01-01

    We investigate the outcome of collisions of Ceres-sized planetesimals composed of a rocky core and a shell of water ice. These collisions are not only relevant for explaining the formation of planetary embryos in early planetary systems, but also provide insight into the formation of asteroid families and possible water transport via colliding small bodies. Earlier studies show characteristic collision velocities exceeding the bodies' mutual escape velocity which - along with the distribution of the impact angles - cover the collision outcome regimes 'partial accretion', 'erosion', and 'hit-and-run' leading to different expected fragmentation scenarios. Existing collision simulations use bodies composed of strengthless material; we study the distribution of fragments and their water contents considering the full elasto-plastic continuum mechanics equations also including brittle failure and fragmentation.

  9. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  10. Electroeluting DNA fragments.

    Science.gov (United States)

    Zarzosa-Alvarez, Ana L; Sandoval-Cabrera, Antonio; Torres-Huerta, Ana L; Bermudez-Cruz, Rosa M

    2010-09-05

    Purified DNA fragments are used for different purposes in Molecular Biology, and they can be prepared by several procedures. Most of them require a prior electrophoresis of the DNA fragments in order to separate the band of interest. This band is then excised from an agarose or acrylamide gel and purified using either: binding and elution from glass or silica particles, DEAE-cellulose membranes, the "crush and soak" method, electroelution, or, very often, expensive commercial purification kits. Thus, selecting a method will depend mostly on what is available in the laboratory. The electroelution procedure allows one to purify very clean DNA to be used in a large number of applications (sequencing, radiolabeling, enzymatic restriction, enzymatic modification, cloning, etc.). This procedure consists of placing the DNA band-containing agarose or acrylamide slices into the sample wells of the electroeluter; applying current then makes the DNA fragment leave the gel and become trapped in a salt cushion, from which it is later recovered by ethanol precipitation.

  11. Kimberlite Wall Rock Fragmentation: Venetia K08 Pipe Development

    Science.gov (United States)

    Barnett, W.; Kurszlaukis, S.; Tait, M.; Dirks, P.

    2009-05-01

    Volcanic systems impose powerful disrupting forces on the country rock into which they intrude. The nature of the induced brittle deformation or fragmentation can be characteristic of the volcanic processes ongoing within the volcanic system, but is most typically partially removed or obscured by repeated, overprinting volcanic activity in mature pipes. Incompletely evolved pipes may therefore provide important evidence for the types and stages of wall rock fragmentation, and the mechanical processes responsible for the fragmentation. Evidence for preserved stages of fragmentation is presented from a detailed study of the K08 pipe within the Cambrian Venetia kimberlite cluster, South Africa. This paper investigates the growth history of the K08 pipe and the mechanics of pipe development based on observations in the pit, drill core and thin sections, from geochemical analyses, particle size distribution analyses, and 3D modeling. Present open pit exposures of the K08 pipe comprise greater than 90% mega-breccia of country rock clasts (gneiss and schist); fractal statistics on particle size distributions (PSD) are used to quantify sheared and non-sheared breccia zones. The calculated energy required to form the non-sheared breccia PSD implies an explosive early stage of fragmentation that pre-conditions the rock mass. The pre-conditioning would have been caused by explosions that are either phreatic or phreatomagmatic in nature. The explosions are likely to have been centered on a dyke, or pulses of preceding volatile-fluid phases, which have encountered a local hydrologically active fault. The explosions were inadequate in mechanical energy release (72% of a mine production blast) to eject material from the pipe, and the pipe may not have breached surface. The next stage of fragmentation is interpreted to have been an upward-moving collapse of the pre-conditioned hanging wall of a subterranean volcanic excavation. This would explain the mega-scale layering across

  12. Does reaction-diffusion support the duality of fragmentation effect?

    CERN Document Server

    Roques, Lionel

    2009-01-01

    There is a gap between single-species model predictions and empirical studies regarding the effect of habitat fragmentation per se, i.e., a process involving the breaking apart of habitat without loss of habitat. Empirical studies indicate that fragmentation can have positive as well as negative effects, whereas, traditionally, single-species models predict a negative effect of fragmentation. Within the class of reaction-diffusion models, studies almost unanimously predict such a detrimental effect. In this paper, considering a single-species reaction-diffusion model with a removal -- or similarly harvesting -- term, in two dimensions, we find both positive and negative effects of fragmentation of the reserves, i.e. the protected regions where no removal occurs. Fragmented reserves lead to higher population sizes for time-constant removal terms. On the other hand, when the removal term is proportional to the population density, higher population sizes are obtained on aggregated reserves, but maximum yields ar...

  13. Fragmentation of a viscoelastic food by human mastication

    CERN Document Server

    Kobayashi, Naoki; Shiozawa, Kouichi

    2010-01-01

    Fragment-size distributions have been studied experimentally in masticated viscoelastic food (fish sausage). The mastication experiment was carried out with seven subjects. We classified the obtained results into two groups, namely, a single lognormal distribution group and a lognormal distribution with exponential tail group. These results suggest that individual variability might affect the fragmentation pattern when the food sample has more complicated physical properties. In particular, the latter result (lognormal distribution with exponential tail) indicates that the fragmentation pattern by human mastication for fish sausage is different from the fragmentation pattern for raw carrot shown in our previous study. The excellent data fitting by the lognormal distribution with exponential tail implies that the fragmentation process has a size-segregation structure between large and small parts. In order to explain this structure, we propose a mastication model for fish sausage based on stochastic processes.
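
    The lognormal fit described above can be reproduced on any list of measured fragment sizes. The sketch below is a minimal illustration, not the authors' analysis: it uses synthetic fragment sizes in place of the mastication data, fits a lognormal distribution by maximum likelihood with SciPy, and checks the fit with a Kolmogorov-Smirnov statistic.

```python
# Minimal sketch: fit a lognormal distribution to fragment sizes and test the fit.
# The fragment sizes below are synthetic stand-ins, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
fragment_sizes = rng.lognormal(mean=1.0, sigma=0.6, size=500)  # e.g. fragment areas in mm^2

# Maximum-likelihood fit of a lognormal (location fixed at 0).
shape, loc, scale = stats.lognorm.fit(fragment_sizes, floc=0)

# Goodness of fit via the Kolmogorov-Smirnov statistic.
ks_stat, p_value = stats.kstest(fragment_sizes, "lognorm", args=(shape, loc, scale))

print(f"sigma = {shape:.3f}, median = {scale:.3f}")
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
```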

  14. Rock fragmentation control in opencast blasting

    Directory of Open Access Journals (Sweden)

    P.K. Singh

    2016-04-01

    Full Text Available The blasting operation plays a pivotal role in the overall economics of opencast mines. The blasting sub-system affects all the other associated sub-systems, i.e. loading, transport, crushing and milling operations. Fragmentation control through effective blast design, and its effect on productivity, are challenging tasks for the practicing blasting engineer due to inadequate knowledge of the actual explosive energy released in the borehole, and of varying initiation practice in blast design and its effect on explosive energy release characteristics. This paper describes the results of a systematic study on the impact of blast design parameters on rock fragmentation at three mines in India. The mines use draglines and a shovel–dumper combination for removal of overburden. Despite its pivotal role in controlling the overall economics of a mining operation, the expected blasting performance is often judged almost exclusively on the basis of poorly defined parameters such as powder factor, and the judgement is often qualitative, which results in a very subjective assessment of blasting performance. Such an approach is a very poor substitute for accurate assessment of explosive and blasting performance. Ninety-one blasts were conducted with varying blast designs and charging patterns, and their impacts on rock fragmentation were documented. A high-speed camera was deployed to record the detonation sequences of the blasts. The efficiency of the loading machines was also correlated with the mean fragment size obtained from the fragmentation analyses.

  15. Harmonic statistics

    Science.gov (United States)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
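
    As a concrete illustration of the central object, the sketch below simulates a Poisson process with harmonic intensity λ(x) = c/x on an interval [a, b] of the positive half-line; the values of a, b and c are arbitrary choices for the example and are not taken from the paper.

```python
# Minimal sketch: simulate a Poisson process with harmonic intensity c/x on [a, b].
# The parameters a, b, c are illustrative choices, not taken from the paper.
import numpy as np

def harmonic_poisson_points(a, b, c, rng):
    """Sample the points of a Poisson process on [a, b] with intensity c/x."""
    mean_count = c * np.log(b / a)      # integral of c/x over [a, b]
    n = rng.poisson(mean_count)         # number of points
    u = rng.uniform(size=n)
    # Inverse-CDF sampling for a density proportional to 1/x on [a, b].
    return a * (b / a) ** u

rng = np.random.default_rng(1)
points = harmonic_poisson_points(a=1.0, b=1000.0, c=5.0, rng=rng)
print(f"sampled {points.size} points (expected {5.0 * np.log(1000.0):.1f})")
```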

  16. Fragmentation and clustering in star matter

    Directory of Open Access Journals (Sweden)

    Gulminelli F.

    2012-07-01

    Full Text Available The specificity of the crust-core phase transition in neutron stars at zero and finite temperature will be discussed. It will be shown that, as a consequence of the presence of long range Coulomb interactions, the equivalence of statistical ensembles is violated and a clusterised phase is expected which is not accessible in the grand canonical ensemble. A specific analytical Nuclear Statistical Equilibrium model will be presented and some new quantitative results relevant for the supernova dynamics will be shown. Finally, the analogies and differences with the phenomenon of nuclear fragmentation will be highlighted.

  17. SCALING AND 4-QUARK FRAGMENTATION

    NARCIS (Netherlands)

    SCHOLTEN, O; BOSVELD, GD

    1991-01-01

    The conditions for a scaling behaviour from the fragmentation process leading to slow protons are discussed. The scaling referred to implies that the fragmentation functions depend on the light-cone momentum fraction only. It is shown that differences in the fragmentation functions for valence- and

  18. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  19. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
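
    Several of the quantities reviewed above can be computed directly from a 2×2 table of test results. The sketch below uses made-up counts (purely for illustration) to compute sensitivity, specificity, accuracy, a positive likelihood ratio, and a simple normal-approximation confidence interval.

```python
# Minimal sketch of the diagnostic-test quantities discussed above, computed from
# a hypothetical 2x2 table (the counts are made up for illustration).
import math

tp, fn, fp, tn = 90, 10, 20, 180   # hypothetical counts: disease+/test+, etc.

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + fn + fp + tn)
positive_lr = sensitivity / (1 - specificity)

def wald_ci(p, n, z=1.96):
    """Simple normal-approximation 95% confidence interval for a proportion."""
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

lo, hi = wald_ci(sensitivity, tp + fn)
print(f"sensitivity = {sensitivity:.2f} (95% CI {lo:.2f}-{hi:.2f})")
print(f"specificity = {specificity:.2f}, accuracy = {accuracy:.2f}, LR+ = {positive_lr:.1f}")
```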

  20. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  1. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  2. Histoplasmosis Statistics

    Science.gov (United States)

    How common is histoplasmosis? In the United States, an estimated 60% to ...

  3. Statistical distributions

    CERN Document Server

    Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.

    2010-01-01

    A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re

  4. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  5. Mamíferos de médio e grande porte em um fragmento de mata atlântica, Minas Gerais, Brasil Medium and large-sized mammal in a forest fragment of atlantic forest, Minas Gerais, Brazil

    Directory of Open Access Journals (Sweden)

    Maressa Rocha do Prado

    2008-08-01

    Full Text Available The degree of threat and the ecological importance of medium and large-sized terrestrial mammals highlight the need to gather information through inventories and environmental diagnostics. The objective of this study was to inventory and evaluate the frequency of occurrence and species richness of medium and large-sized mammals at the Estação de Pesquisa, Treinamento e Educação Ambiental (EPTEA) Mata do Paraíso, in Viçosa - MG, Brazil. The study area was surveyed at random in search of indirect and direct evidence of mammals. Tomahawk traps and camera traps were also used to record and identify the species. To record the frequency of occurrence, twenty 2 x 2 m plots were established along a transect and inspected 29 times between April 2005 and April 2006. From the frequency-of-occurrence data, species richness was estimated with the Jackknife 1 procedure using the EstimateS program. Twenty-three mammal species were recorded, three of which are threatened with extinction: Chrysocyon brachyurus (Illiger, 1815), Leopardus pardalis (Linnaeus, 1758) and Leopardus tigrinus (Schreber, 1775). The wild species recorded most frequently were Cerdocyon thous (Linnaeus, 1766), L. tigrinus and L. pardalis. A richness of 15 wild terrestrial mammal species (confidence interval = 0.95) was estimated for the EPTEA Mata do Paraíso. This work shows that, although small, the study area plays an important role in the conservation of the mammal fauna of the Viçosa - MG region.

  6. Time invariant scaling in discrete fragmentation models

    CERN Document Server

    Giraud, B G; Giraud, B G; Peschanski, R

    1994-01-01

    Linear rate equations are used to describe the cascading decay of an initial heavy cluster into fragments. We consider moments of arbitrary orders of the mass multiplicity spectrum and derive scaling properties pertaining to their time evolution. We suggest that the mass weighted multiplicity is a suitable observable for the discovery of scaling. Numerical tests validate such properties, even for moderate values of the initial mass (nuclei, percolation clusters, jets of particles etc.). Finite size effects can be simply parametrized.
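
    A minimal numerical illustration of such linear rate equations is sketched below; it assumes a uniform binary splitting kernel and unit decay rates (illustrative choices, not the kernel used in the paper), integrates the cascade for an initial cluster of mass 64, and tracks the zeroth and first moments of the mass multiplicity spectrum, the latter serving as a mass-conservation check.

```python
# Minimal sketch of a linear-rate cascading fragmentation model (uniform binary
# splits, unit decay rate), integrating the rate equations and tracking moments
# of the mass multiplicity spectrum. The splitting kernel is an illustrative choice.
import numpy as np
from scipy.integrate import solve_ivp

M = 64                                        # initial cluster mass

def rhs(t, n):
    dn = np.zeros_like(n)
    for mp in range(2, M + 1):                # decaying clusters of mass mp
        rate = n[mp - 1]                      # decay rate a_m = 1 for m >= 2
        dn[mp - 1] -= rate
        dn[: mp - 1] += 2.0 * rate / (mp - 1) # uniform binary split products
    return dn

n0 = np.zeros(M)
n0[M - 1] = 1.0                               # one cluster of mass M at t = 0
sol = solve_ivp(rhs, (0.0, 5.0), n0, t_eval=np.linspace(0, 5, 6))

masses = np.arange(1, M + 1)
for t, n in zip(sol.t, sol.y.T):
    m0, m1 = n.sum(), (masses * n).sum()      # multiplicity and total mass
    print(f"t = {t:.1f}  multiplicity = {m0:7.2f}  total mass = {m1:.2f}")
```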

  7. Picking Up (On Fragments

    Directory of Open Access Journals (Sweden)

    Phil Ellis

    2015-09-01

    Full Text Available This article discusses the implications for archival and media archaeological research and reenactment artwork relating to a recent arts practice project: reenacttv: 30 lines / 60 seconds. It proposes that archival material is unstable but has traces and fragments that are full of creative potential to re-think and re-examine past media historical events through a media archaeological approach to reenactment. The article contains images and links to videos from the final reenactment artworks as well as from rehearsals in Vienna and Bradford.

  8. An Archeology of Fragments

    Directory of Open Access Journals (Sweden)

    Gerald L. Bruns

    2014-10-01

    Full Text Available This is a short (fragmentary) history of fragmentary writing from the German Romantics (F. W. Schlegel, Friedrich Hölderlin) to modern and contemporary concrete or visual poetry. Such writing is (often deliberately) a critique of the logic of subsumption that tries to assimilate whatever is singular and irreducible into totalities of various categorical or systematic sorts. Arguably, the fragment (parataxis) is the distinctive feature of literary Modernism, which is a rejection, not of what precedes it, but of what Max Weber called "the rationalization of the world" (or Modernity), whose aim is to keep everything, including all that is written, under surveillance and control.

  9. A decadal view of magma fragmentation

    Science.gov (United States)

    Cashman, K. V.; Rust, A.

    2010-12-01

    Although the past decade has seen fundamental advances in studies of explosive volcanism, the disruption to air traffic caused by the 2010 eruption of Eyjafjallajökull, Iceland, highlights the need for improved understanding of magmatic fragmentation in general, and of fine ash generation in particular. To develop a theoretical basis for predicting the fine ash content of eruptive plumes, we need to understand not only fragmentation mechanisms but also the dependence of those mechanisms on conditions of magma ascent and degassing. Experimental and analytical approaches to this problem include experimental studies of vesiculation and permeability development in silicic melts, quantitative textural studies of pyroclasts to constrain conditions that reduce fragmentation efficiency (that is, allow vesicular clasts to be preserved), direct experiments on fragmentation in both natural and analog materials, and determination of total grain size distributions (TGSDs) of pyroclastic deposits. Experiments on silicic melts have demonstrated that very high supersaturations (overpressures ΔP) may be achieved in silicic melts prior to homogeneous bubble nucleation, and that the high bubble number densities of silicic pumice require not only homogeneous nucleation but also nucleation of a mixed H2O-CO2 gas phase. In most pumice and scoria clasts, resulting vesicle populations form power law size distributions; power law exponents >3 in silicic tephras indicate that small vesicles comprise most of the vesicle volume (consistent with rapid late-stage vesiculation at high ΔP), while exponents 60-70%) and show no dependence on either melt composition or mass eruption rate; this suggests that melt porosity is more important than either decompression rate or magma rheology for clast preservation. These pyroclasts also have uniformly high permeabilities, high pore connectivity, and simple porous pathways, all of which suggest that ease of gas escape also contributed to clast

  10. Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals.

    Science.gov (United States)

    Crooks, Kevin R; Burdett, Christopher L; Theobald, David M; King, Sarah R B; Di Marco, Moreno; Rondinini, Carlo; Boitani, Luigi

    2017-07-18

    Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world's terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world's terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation.
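
    The authors' high-resolution fragmentation models are not reproduced here; as a toy illustration of how fragmentation of a habitat map can be quantified, the sketch below labels connected habitat patches in a synthetic binary raster with SciPy and reports patch counts and sizes.

```python
# Toy illustration (not the authors' model): quantify fragmentation of a binary
# habitat map by counting patches and measuring their sizes with scipy.ndimage.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
habitat = rng.random((200, 200)) < 0.35        # synthetic binary habitat map

labels, n_patches = ndimage.label(habitat)     # 4-connected patches by default
patch_sizes = np.bincount(labels.ravel())[1:]  # drop the background label 0

print(f"habitat cells: {habitat.sum()}")
print(f"patches: {n_patches}, mean patch size: {patch_sizes.mean():.1f} cells")
print(f"largest patch fraction: {patch_sizes.max() / habitat.sum():.2f}")
```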

  11. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  12. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  13. Practical Statistics

    CERN Document Server

    Lyons, L

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  14. Methods for Measurement and Statistical Analysis of the Frangibility of Strengthened Glass

    Directory of Open Access Journals (Sweden)

    Zhongzhi eTang

    2015-06-01

    Full Text Available Chemically strengthened glass features a surface compression and a balancing central tension (CT in the interior of the glass. A greater CT is usually associated with a higher level of stored elastic energy in the glass. During a fracture event, release of a greater amount of stored energy can lead to frangibility, i.e., shorter crack branching distances, smaller fragment size, and ejection of small fragments from the glass. In this paper, the frangibility and fragmentation behaviors of a series of chemically strengthened glass samples are studied using two different manual testing methods and an automated tester. Both immediate and delayed fracture events were observed. A statistical method is proposed to determine the probability of frangible fracture for glasses ion exchanged under a specific set of conditions, and analysis is performed to understand the dependence of frangibility probability on sample thickness, CT, and testing method. We also propose a more rigorous set of criteria for qualifying frangibility.
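
    The paper's statistical method is not spelled out in the abstract; one minimal way to express a frangibility probability from repeated fracture tests is as a binomial proportion with a Wilson confidence interval, sketched below with hypothetical counts.

```python
# Minimal sketch (assumed approach, not the paper's exact method): estimate the
# probability of frangible fracture from k frangible outcomes in n fracture tests,
# with a Wilson 95% confidence interval.
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

k, n = 7, 30  # hypothetical: 7 frangible fractures out of 30 tested samples
lo, hi = wilson_interval(k, n)
print(f"frangibility probability ~ {k/n:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```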

  15. Bootstrap embedding: An internally consistent fragment-based method

    Science.gov (United States)

    Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy

    2016-08-01

    Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments "embedded" in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed "Bootstrap Embedding," a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.

  16. Informal Statistics Help Desk

    Science.gov (United States)

    Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.

    2017-01-01

    Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

  17. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2005-01-01

    In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writin

  18. Introductory statistics

    CERN Document Server

    Ross, Sheldon M

    2010-01-01

    In this 3rd edition revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this

  19. Statistical physics

    CERN Document Server

    Wannier, Gregory H

    2010-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  20. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  1. Effects of fragmentation on the spatial ecology of the California Kingsnake (Lampropeltis californiae)

    Science.gov (United States)

    Anguiano, Michael P.; Diffendorfer, James E.

    2015-01-01

    We investigated the spatial ecology of the California Kingsnake (Lampropeltis californiae) in unfragmented and fragmented habitat with varying patch sizes and degrees of exposure to urban edges. We radiotracked 34 Kingsnakes for up to 3 yr across four site types: interior areas of unfragmented ecological reserves, the urbanized edge of these reserves, large habitat fragments, and small habitat fragments. There was no relationship between California Kingsnake movements and the degree of exposure to urban edges and fragmentation. Home range size and movement patterns of Kingsnakes on edges and fragments resembled those in unfragmented sites. Average home-range size on each site type was smaller than the smallest fragment in which snakes were tracked. The persistence of California Kingsnakes in fragmented landscapes may be related directly to their small spatial movement patterns, home-range overlap, and ability to use urban edge habitat.

  2. 城市景观林中幼龄期红锥个体大小之统计分布模型 Statistical distribution models of body sizes of young Castanopsis hystrix in an urban landscape forest

    Institute of Scientific and Technical Information of China (English)

    殷祚云; 曾令海; 何波祥; 连辉明; 张谦; 蔡燕灵; 陈一群; 蓝燕群

    2013-01-01

    Studies that simultaneously confront several competing statistical distribution models with data from several body size indicators of a tree species have been rare, especially for native evergreen broad-leaved trees such as Castanopsis hystrix A. DC., an important timber and landscaping tree in the southern subtropics of China. We carefully surveyed 4 body size indicators (canopy diameter, diameter at breast height, ground diameter and tree height) of 2-year-old C. hystrix in an urban landscape forest of Guangzhou, South China. We assembled a relatively complete set of statistical distributions, consisting of 12 major continuous distributions with rather different function types, to model the observed frequency distributions of the 4 length indicators and their 6 derived area and volume indicators, using both maximum likelihood and least squares methods. We found that: (1) the 10 body sizes each have their own best-fitted distribution models and also share several ones, with the gamma distribution best, followed by the logistic and the Weibull; (2) the shapes of the distribution curves vary with body size dimension, i.e., from 1-dimension (diameter and height) and 2-dimension (area) to 3-dimension (volume) they become shorter, more positively skewed and even change from unimodal to hollow shape; (3) the goodness-of-fit of expected probability distributions to observed frequency distributions is related to indicator scale, and is generally greater on a log scale than on a linear scale, e.g., the logCauchy distribution has a better fit than the Cauchy distribution to data from 8 out of 10 indicators, the exceptions being the diameter and area at breast height; and (4) a new statistic CAIC×KS/R2, integrating the consistent Akaike information criterion (CAIC), the Kolmogorov-Smirnov test statistic (KS) and the determination coefficient of regression (R2), can serve as a comprehensive criterion of goodness-of-fit. Our study can provide insight into some other research areas including tree silviculture and breeding, and population ecology
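
    The composite criterion CAIC×KS/R2 can be sketched as follows. The data are synthetic stand-ins for the tree measurements, the fits use SciPy maximum likelihood, and the R2 here is computed between observed and expected bin frequencies, which is an interpretation of the abstract rather than the authors' exact regression.

```python
# Sketch of comparing candidate distributions with the composite criterion
# CAIC * KS / R^2 described above. Data are synthetic stand-ins for tree sizes,
# and the R^2 definition (observed vs. expected bin frequencies) is an assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
sizes = rng.gamma(shape=4.0, scale=0.5, size=300)   # e.g. tree heights in m

def composite_criterion(data, dist, n_bins=15):
    params = dist.fit(data, floc=0)                 # ML fit, location fixed at 0
    loglik = np.sum(dist.logpdf(data, *params))
    k, n = len(params) - 1, len(data)               # free parameters (loc is fixed)
    caic = -2 * loglik + k * (np.log(n) + 1)        # consistent AIC
    ks = stats.kstest(data, dist.name, args=params).statistic
    # R^2 between observed and expected bin frequencies.
    obs, edges = np.histogram(data, bins=n_bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    exp = dist.pdf(centers, *params)
    r2 = 1 - np.sum((obs - exp) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return caic * ks / r2

for dist in (stats.gamma, stats.weibull_min, stats.lognorm):
    print(f"{dist.name:12s} CAIC*KS/R^2 = {composite_criterion(sizes, dist):.1f}")
```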

  3. Stone fragmentation by ultrasound

    Indian Academy of Sciences (India)

    S K Shrivastava; Kailash

    2004-08-01

    The presence of stones in the kidney causes discomfort to patients. Hence, removal of such stones is important; these days it is commonly done non-destructively, without surgery, using lithotripters. Commercially, lithotripters such as the extra-corporeal shock wave lithotripters (ESWL) made by Siemens and others are in routine use. These methods are cumbersome and expensive. Treatment of the patients also takes comparatively more time because a larger number of sittings is required. Some delicate nerves and fibres in the areas surrounding the stones in the kidney are also damaged by the high ultrasonic intensity used in such systems. In the present work, enhancement of kidney stone fragmentation using ultrasound is studied. The cavitation bubbles are found to implode faster, increasing the disintegration efficiency of the lithotripter and giving better treatment to the patients.

  4. SEER Statistics

    Science.gov (United States)

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  5. Cancer Statistics

    Science.gov (United States)


  6. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  7. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  8. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  9. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  10. Release and characteristics of fungal fragments in various conditions

    Energy Technology Data Exchange (ETDEWEB)

    Mensah-Attipoe, Jacob [Department of Environmental Science, University of Eastern Finland, Yliopistonranta 1D, P. O. Box 1627, FI-70211 Kuopio (Finland); Saari, Sampo [Department of Physics, Tampere University of Technology, Korkeakoulunkatu 3, 33720 Tampere (Finland); Veijalainen, Anna-Maria; Pasanen, Pertti [Department of Environmental Science, University of Eastern Finland, Yliopistonranta 1D, P. O. Box 1627, FI-70211 Kuopio (Finland); Keskinen, Jorma [Department of Physics, Tampere University of Technology, Korkeakoulunkatu 3, 33720 Tampere (Finland); Leskinen, Jari T.T. [SIB Labs, University of Eastern Finland, Yliopistonranta 1E, P. O. Box 1627, FI-70211, Kuopio (Finland); Reponen, Tiina, E-mail: reponeta@ucmail.uc.edu [Department of Environmental Science, University of Eastern Finland, Yliopistonranta 1D, P. O. Box 1627, FI-70211 Kuopio (Finland); Department of Environmental Health, University of Cincinnati, P.O. Box 670056, Cincinnati, OH 45267-0056 (United States)

    2016-03-15

    Intact spores and submicrometer size fragments are released from moldy building materials during growth and sporulation. It is unclear whether all fragments originate from fungal growth or if small pieces of building materials are also aerosolized as a result of microbial decomposition. In addition, particles may be formed through nucleation from secondary metabolites of fungi, such as microbial volatile organic compounds (MVOCs). In this study, we used the elemental composition of particles to characterize the origin of submicrometer fragments released from materials contaminated by fungi. Particles from three fungal species (Aspergillus versicolor, Cladosporium cladosporioides and Penicillium brevicompactum), grown on agar, wood and gypsum board, were aerosolized using the Fungal Spore Source Strength Tester (FSSST) at three air velocities (5, 16 and 27 m/s). Released spores (optical size, d_p ≥ 0.8 μm) and fragments (d_p ≤ 0.8 μm) were counted using direct-reading optical aerosol instruments. Particles were also collected on filters, and their morphology and elemental composition analyzed using scanning electron microscopes (SEMs) coupled with Energy-Dispersive X-ray spectroscopy (EDX). Among the studied factors, air velocity resulted in the most consistent trends in the release of fungal particles. Total concentrations of both fragments and spores increased with an increase in air velocity for all species, whereas fragment–spore (F/S) ratios decreased. EDX analysis showed common elements, such as C, O, Mg and Ca, for blank material samples and fungal growth. However, N and P were exclusive to the fungal growth, and therefore were used to differentiate biological fragments from non-biological ones. Our results indicated that the majority of fragments contained N and P. Because we observed increased release of fragments with increased air velocities, nucleation of MVOCs was likely not a relevant process in the formation of fungal fragments. Based

  11. Fragment-based activity space: smaller is better.

    Science.gov (United States)

    Hesterkamp, Thomas; Whittaker, Mark

    2008-06-01

    Fragment-based drug discovery has the potential to supersede traditional high throughput screening based drug discovery for molecular targets amenable to structure determination. This is because the chemical diversity coverage is better accomplished by a fragment collection of reasonable size than by larger HTS collections. Furthermore, fragments have the potential to be efficient target binders with higher probability than more elaborated drug-like compounds. The selection of the fragment screening technique is driven by sensitivity and throughput considerations, and we advocate in the present article the use of high concentration bioassays in conjunction with NMR-based hit confirmation. Subsequent ligand X-ray structure determination of the fragment ligand in complex with the target protein by co-crystallisation or crystal soaking can focus on confirmed binders.

  12. Meta-analysis of the effects of forest fragmentation on interspecific interactions.

    Science.gov (United States)

    Magrach, Ainhoa; Laurance, William F; Larrinaga, Asier R; Santamaria, Luis

    2014-10-01

    Forest fragmentation dramatically alters species persistence and distribution and affects many ecological interactions among species. Recent studies suggest that mutualisms, such as pollination and seed dispersal, are more sensitive to the negative effects of forest fragmentation than antagonisms, such as predation or herbivory. We applied meta-analytical techniques to evaluate this hypothesis and quantified the relative contributions of different components of the fragmentation process (decreases in fragment size, edge effects, increased isolation, and habitat degradation) to the overall effect. The effects of fragmentation on mutualisms were primarily driven by habitat degradation, edge effects, and fragment isolation, and, as predicted, they were consistently more negative on mutualisms than on antagonisms. For the most studied interaction type, seed dispersal, only certain components of fragmentation had significant (edge effects) or marginally significant (fragment size) effects. Seed size modulated the effect of fragmentation: species with large seeds showed stronger negative impacts of fragmentation via reduced dispersal rates. Our results reveal that different components of the habitat fragmentation process have varying impacts on key mutualisms. We also conclude that antagonistic interactions have been understudied in fragmented landscapes, most of the research has concentrated on particular types of mutualistic interactions such as seed dispersal, and that available studies of interspecific interactions have a strong geographical bias (arising mostly from studies carried out in Brazil, Chile, and the United States). © 2014 Society for Conservation Biology.

  13. Observations of Titan 3C-4 Transtage Fragmentation Debris

    Science.gov (United States)

    Cowardin, Heather; Seitzer, P.; Abercromby, K.; Barker, E.; Cardona, T.; Krisko, P.; Lederer, S.

    2013-01-01

    The fragmentation of a Titan 3C-4 Transtage (1968-081) on 21 February 1992 is one of only two known break-ups in or near geosynchronous orbit. The original rocket body and 24 pieces of debris are currently being tracked by the US Space Surveillance Network (SSN). The rocket body (SSN# 3432) and several of the original fragments (SSN# 25000, 25001, 30000, and 33511) were observed in survey mode during 2004-2010 using the 0.6-m Michigan Orbital DEbris Survey Telescope (MODEST) in Chile using a broad R filter. This paper will present a size distribution for all calibrated magnitude data acquired on MODEST. Size distribution plots will also be shown using historical models for small fragmentation debris (down to 10 cm) believed to be associated with the Titan break-up. In November 2010, visible broadband photometry (Johnson/Kron-Cousins BVRI) was acquired with the 0.9-m Small and Moderate Aperture Research Telescope System (SMARTS) at the Cerro Tololo Inter-American Observatory (CTIO) in Chile on several Titan fragments (SSN# 25001, 33509, 33510) and the parent rocket body. Color index data will be used to determine the fragment brightness distribution and how the data compares to spacecraft materials measured in the laboratory using similar photometric measurement techniques. In 2012, the SSN added 16 additional fragments to the catalogue. MODEST acquired magnitude data on ten Titan fragments in late 2012 and early 2013. The magnitude distribution of all the observed fragments is analyzed as a function of time. In order to better characterize the breakup fragments, spectral measurements were acquired on the original rocket body and five Titan fragments using the 6.5-m Magellan telescopes at Las Campanas Observatory in Chile. The telescopic spectra are compared with laboratory-acquired spectra of materials (e.g., aluminum and various paints) and categorized based on known absorption features for spacecraft materials.

  14. The Spectrum of Satellite Breakup and Fragmentation

    Science.gov (United States)

    Finkleman, D.

    The objective of this paper is to expose the spectrum of satellite breakup physics and its implications for debris production and observables. Work on satellite response to the debris environment generally emphasizes small-scale hypervelocity impact or the interaction of intense, coherent radiation with satellite surfaces or internals. There are empirical correlations of fragment size distributions based on arena tests and extremely rare observations of breakups in space. Klinkrad describes well the research on material response to hypervelocity impact, such as the ballistic limit for various materials and shielding walls. Smirnov et al. report well the phenomenology of breakups under the influence of nonuniform internal loading of monolithic bodies, such as pressurized tanks. They set forth the transformation of elastic energy into fragment kinetic energy. They establish a sound physical framework for bounding the number of fragments. We took advantage of these works in our previous papers. There is not much research into the response of nonuniform structures to hypervelocity collisions with similarly massive and complex objects. Such work generally employs complex hydrodynamic and finite element computation that is not well suited to real-time, operational assessment of the consequences of such encounters. We hope to diminish the void between the extremes of microscopic impact and complex hydrocodes. Our previous reports employed the framework established by Chobotov and Spencer, a fundamentally equilibrium, Newtonian approach. We now explore the spectrum of interactions and debris evolutions possible with realistic combinations of these theories. The spectrum ranges from Newtonian, semi-elastic energy and momentum transfer to little or no momentum exchange, and from virtually all of the colliders' mass being involved to only fractional mass involvement. We observe that the more Newtonian outcomes do not agree well with sparse observations of the few collisions that

  15. Subduction controls the distribution and fragmentation of Earth’s tectonic plates

    Science.gov (United States)

    Mallard, Claire; Coltice, Nicolas; Seton, Maria; Müller, R. Dietmar; Tackley, Paul J.

    2016-07-01

    The theory of plate tectonics describes how the surface of Earth is split into an organized jigsaw of seven large plates of similar sizes and a population of smaller plates whose areas follow a fractal distribution. The reconstruction of global tectonics during the past 200 million years suggests that this layout is probably a long-term feature of Earth, but the forces governing it are unknown. Previous studies, primarily based on the statistical properties of plate distributions, were unable to resolve how the size of the plates is determined by the properties of the lithosphere and the underlying mantle convection. Here we demonstrate that the plate layout of Earth is produced by a dynamic feedback between mantle convection and the strength of the lithosphere. Using three-dimensional spherical models of mantle convection that self-consistently produce the plate size-frequency distribution observed for Earth, we show that subduction geometry drives the tectonic fragmentation that generates plates. The spacing between the slabs controls the layout of large plates, and the stresses caused by the bending of trenches break plates into smaller fragments. Our results explain why the fast evolution in small back-arc plates reflects the marked changes in plate motions during times of major reorganizations. Our study opens the way to using convection simulations with plate-like behaviour to unravel how global tectonics and mantle convection are dynamically connected.

  16. Lead Fragment Ingestion by Birds: Shooting Down Another Myth

    Science.gov (United States)

    2010-06-17

    A brief background to the (perceived) “problem”: • Birds display grit-ingesting behavior. • Lead particles in the environment (e.g., spent shot, bullet fragments) approximate the size of

  17. Effects of fragmentation on plant adaptation to urban environments.

    Science.gov (United States)

    Dubois, Jonathan; Cheptou, Pierre-Olivier

    2017-01-19

    Urban ecosystems are relatively recent and heavily human-altered terrestrial ecosystems with a surprisingly high diversity of animals, plants and other organisms. Urban habitats are also strongly fragmented and subject to higher temperatures, providing a compelling model for studying adaptation to global change. Crepis sancta (Asteraceae), an annual Mediterranean wasteland weed, occupies fragmented urban environments as well as certain unfragmented landscapes in southern France. We tested for shifts in dispersal, reproductive traits and size across a rural-urban gradient to learn whether and how selection may be driving changes in life history in urban and fragmented habitats. We specifically compared the structure of quantitative genetic variation and of neutral markers (microsatellites) between urban and rural and between fragmented and unfragmented habitats. We showed that fragmentation provides a better descriptor of trait variation than urbanization per se for dispersal traits. Fragmentation also affected reproductive traits and plant size, though one rural population did conform to this scheme. Our study shows the role of fragmentation in shifting dispersal traits in urban environments and a more complex pattern for other traits. We discuss the role of pollinator scarcity and an inhospitable matrix as drivers of adaptation. This article is part of the themed issue 'Human influences on evolution, and the ecological and societal consequences'.

  18. Amplified-fragment length polymorphism fingerprinting of Mycoplasma species

    DEFF Research Database (Denmark)

    Kokotovic, Branko; Friis, N.F.; Jensen, J.S.

    1999-01-01

    Amplified-fragment length polymorphism (AFLP) is a whole-genome fingerprinting method based on selective amplification of restriction fragments. The potential of the method for the characterization of mycoplasmas was investigated in a total of 50 strains of human and animal origin, including......I restriction endonucleases and subsequent ligation of corresponding site-specific adapters. The amplification of AFLP templates with a single set of nonselective primers resulted in reproducible fingerprints of approximately 60 to 80 fragments in the size range of 50 to 500 bp. The method was able

  19. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomena (thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  20. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  1. STATISTICAL METHODS IN HISTORY

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2016-01-01

    Full Text Available We give a critical analysis of statistical models and methods for processing text information in historical records in order to establish the times when certain events occurred, i.e., to build a science-based chronology. There are three main kinds of sources of knowledge of ancient history: ancient texts, the remains of material culture, and traditions. In most cases, the specific date of the objects excavated by archaeologists cannot be determined. The group of Academician A.T. Fomenko has developed and applied new statistical methods for the analysis of historical texts (chronicles), based on the intensive use of computer technology. Two major scientific results were: that the majority of historical records known to us now are duplicated (in particular, the chronicles describing the so-called "Ancient Rome" and the "Middle Ages" talk about the same events); and that the known historical chronicles tell us about real events separated from the present time by not more than 1000 years. It was found that the chronicles describing the history of "ancient times" and the "Middle Ages", the chronicles of Chinese history, and the histories of various European countries do not talk about different events but about the same ones. An attempt is made at a new dating of historical events and at restoring the true history of human society on the basis of new data. From the standpoint of statistical methods, historical records and images of their fragments are special cases of non-numerical objects. Therefore, the computer-statistical methods developed by the group of A.T. Fomenko are part of non-numerical statistics. We consider some methods of statistical analysis of chronicles applied by the group of A.T. Fomenko: the correlation method of maximums; the dynasties method; the method of attenuation frequency; and the questionnaire-codes method. The new chronology allows us to understand much of the battle of ideas in modern science and mass consciousness. It becomes clear the root cause of cautious

  2. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  3. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition E. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  4. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  5. Statistical methods

    CERN Document Server

    Freund, Rudolf J; Wilson, William J

    2010-01-01

    Statistical Methods, 3e provides students with a working introduction to statistical methods offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach emphasizing concepts and techniques for working out problems and intepreting results. The book includes research projects, real-world case studies, numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a

  6. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  7. Thermodynamical string fragmentation

    Science.gov (United States)

    Fischer, Nadine; Sjöstrand, Torbjörn

    2017-01-01

    The observation of heavy-ion-like behaviour in pp collisions at the LHC suggests that more physics mechanisms are at play than traditionally assumed. The introduction e.g. of quark-gluon plasma or colour rope formation can describe several of the observations, but as of yet there is no established paradigm. In this article we study a few possible modifications to the Pythia event generator, which describes a wealth of data but fails for a number of recent observations. Firstly, we present a new model for generating the transverse momentum of hadrons during the string fragmentation process, inspired by thermodynamics, where heavier hadrons naturally are suppressed in rate but obtain a higher average transverse momentum. Secondly, close-packing of strings is taken into account by making the temperature or string tension environment-dependent. Thirdly, a simple model for hadron rescattering is added. The effect of these modifications is studied, individually and taken together, and compared with data mainly from the LHC. While some improvements can be noted, it turns out to be nontrivial to obtain effects as big as required, and further work is called for.

  8. Thermodynamical String Fragmentation

    CERN Document Server

    Fischer, Nadine

    2016-01-01

    The observation of heavy-ion-like behaviour in pp collisions at the LHC suggests that more physics mechanisms are at play than traditionally assumed. The introduction e.g. of quark-gluon plasma or colour rope formation can describe several of the observations, but as of yet there is no established paradigm. In this article we study a few possible modifications to the Pythia event generator, which describes a wealth of data but fails for a number of recent observations. Firstly, we present a new model for generating the transverse momentum of hadrons during the string fragmentation process, inspired by thermodynamics, where heavier hadrons naturally are suppressed in rate but obtain a higher average transverse momentum. Secondly, close-packing of strings is taken into account by making the temperature or string tension environment-dependent. Thirdly, a simple model for hadron rescattering is added. The effect of these modifications is studied, individually and taken together, and compared with data mainly from...

  9. Fragmentation Considered Poisonous

    CERN Document Server

    Herzberg, Amir

    2012-01-01

    We present practical poisoning and name-server blocking attacks on standard DNS resolvers, by off-path, spoofing adversaries. Our attacks exploit large DNS responses that cause IP fragmentation; such long responses are increasingly common, mainly due to the use of DNSSEC. In common scenarios, where DNSSEC is partially or incorrectly deployed, our poisoning attacks allow 'complete' domain hijacking. When DNSSEC is fully deployed, attacker can force use of fake name server; we show exploits of this allowing off-path traffic analysis and covert channel. When using NSEC3 opt-out, attacker can also create fake subdomains, circumventing same origin restrictions. Our attacks circumvent resolver-side defenses, e.g., port randomisation, IP randomisation and query randomisation. The (new) name server (NS) blocking attacks force resolver to use specific name server. This attack allows Degradation of Service, traffic-analysis and covert channel, and also facilitates DNS poisoning. We validated the attac...

  10. Dung beetle (Coleoptera, Scarabaeidae assemblage of a highly fragmented landscape of Atlantic forest: from small to the largest fragments of northeastern Brazilian region

    Directory of Open Access Journals (Sweden)

    Renato P. Salomão

    2015-06-01

    Full Text Available Human activities in tropical forests are the main causes of forest fragmentation. Owing to the historical course of deforestation, forest remnants exhibit different sizes and shapes. The aim of the present study was to evaluate the dung beetle assemblage in fragments of different sizes. Sampling was performed during the rainy and dry seasons of 2010 in six fragments of Atlantic forest, using pitfall traps baited with excrement and carrion. In addition, two larger fragments were used as controls. We used General Linear Models to determine whether the fragments differed in dung beetle abundance and richness. Analysis of Similarities and Non-Metric Multidimensional Scaling were used to determine whether the dung beetle assemblages grouped according to species composition. A total of 3352 individuals were collected and 19 species were identified in the six fragments sampled. Dung beetle abundance shifted with fragment size; however, richness did not change among the fragments evaluated. The sampled fragments and the two controls also exhibited distinct species composition. The difference in dung beetle abundance among fragments may be related to the different amount of resources available in each one. Richness probably did not differ among fragments because of the even distribution of mammal communities in these patches, and the consequent similar dung diversity. We conclude that larger fragments harbour a higher abundance of dung beetles and distinct species. However, for a clearer understanding of the effects of fragmentation on dung beetles in the Atlantic forest, studies evaluating narrower size variations among larger fragments should be conducted.

  11. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    For the year 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually includes also historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review`s back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  12. Statistics; Tilastot

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    For the year 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually includes also historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review`s back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO{sub 2}-emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products

  13. Statistical Mechanics

    CERN Document Server

    Gallavotti, Giovanni

    2011-01-01

    C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.

  14. An Algebra for Program Fragments

    DEFF Research Database (Denmark)

    Kristensen, Bent Bruun; Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    1985-01-01

    Program fragments are described either by strings in the concrete syntax or by constructor applications in the abstract syntax. By defining conversions between these forms, both may be intermixed. Program fragments are constructed by terminal and nonterminal symbols from the grammar and by variab...

  15. Complete axiomatizations for XPath fragments

    NARCIS (Netherlands)

    ten Cate, B.; Litak, T.; Marx, M.

    2010-01-01

    We provide complete axiomatizations for several fragments of Core XPath, the navigational core of XPath 1.0 introduced by Gottlob, Koch and Pichler. A complete axiomatization for a given fragment is a set of equivalences from which every other valid equivalence is derivable; equivalences can be thou

  16. Analytical Solution of Smoluchowski Equations in Aggregation–Fragmentation Processes

    Science.gov (United States)

    Sekiyama, Makoto; Ohtsuki, Toshiya; Yamamoto, Hiroshi

    2017-10-01

    The z-transform technique is used to analyze Smoluchowski equations of aggregation-fragmentation processes where the selection of aggregation clusters, a decomposed cluster and a generated cluster is entirely random and independent of cluster size. An analytic form of asymptotic behavior for a cluster size distribution function is derived on the basis of approximation where lower-order terms in the average cluster size are neglected. The obtained results agree well with numerical ones.
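
    The following is a minimal numerical sketch of the kind of model discussed above: mean-field Smoluchowski equations in which both the aggregation rate and the binary-fragmentation rate are independent of cluster size. It is not the authors' z-transform solution; the rate constants, the size cutoff and the monomer-only initial condition are illustrative assumptions.

```python
# Mean-field Smoluchowski equations with size-independent aggregation
# (rate a) and random binary fragmentation (rate f).  Not the paper's
# z-transform solution: a, f, the cutoff K and the monomer initial
# condition are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

K = 200            # largest cluster size retained (truncation)
a, f = 1.0, 0.1    # aggregation / fragmentation rates, assumed constants

def rhs(t, c):
    sizes = np.arange(1, K + 1)
    conv = np.convolve(c, c)                           # sum_{i+j=k} c_i c_j
    gain_agg = np.concatenate(([0.0], 0.5 * a * conv[:K - 1]))
    loss_agg = a * c * c.sum()
    # an m-cluster (m >= 2) splits into (j, m-j) with j uniform on 1..m-1
    w = np.where(sizes >= 2, 2.0 * f * c / np.maximum(sizes - 1, 1), 0.0)
    suffix = np.cumsum(w[::-1])[::-1]                  # sum of w_m over m >= k
    gain_frag = np.append(suffix[1:], 0.0)
    loss_frag = np.where(sizes >= 2, f * c, 0.0)
    return gain_agg - loss_agg + gain_frag - loss_frag

c0 = np.zeros(K)
c0[0] = 1.0                                            # monomers only at t = 0
sol = solve_ivp(rhs, (0.0, 50.0), c0, method="LSODA")
c_final = sol.y[:, -1]
print("mass retained below the cutoff:", round(np.arange(1, K + 1) @ c_final, 3))
```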

  17. SDI: Statistical dynamic interactions

    Energy Technology Data Exchange (ETDEWEB)

    Blann, M.; Mustafa, M.G. (Lawrence Livermore National Lab., CA (USA)); Peilert, G.; Stoecker, H.; Greiner, W. (Frankfurt Univ. (Germany, F.R.). Inst. fuer Theoretische Physik)

    1991-04-01

    We focus on the combined statistical and dynamical aspects of heavy ion induced reactions. The overall picture is illustrated by considering the reaction {sup 36}Ar + {sup 238}U at a projectile energy of 35 MeV/nucleon. We illustrate the time dependent bound excitation energy due to the fusion/relaxation dynamics as calculated with the Boltzmann master equation. An estimate of the mass, charge and excitation of an equilibrated nucleus surviving the fast (dynamic) fusion-relaxation process is used as input into an evaporation calculation which includes 20 heavy fragment exit channels. The distribution of excitations between residue and clusters is explicitly calculated, as is the further deexcitation of clusters to bound nuclei. These results are compared with the exclusive cluster multiplicity measurements of Kim et al., and are found to give excellent agreement. We consider also an equilibrated residue system at 25% lower initial excitation, which gives an unsatisfactory exclusive multiplicity distribution. This illustrates that exclusive fragment multiplicity may provide a thermometer for system excitation. This analysis of data involves successive binary decay with no compressional effects nor phase transitions. Several examples of primary versus final (stable) cluster decay probabilities for an A = 100 nucleus at excitations of 100 to 800 MeV are presented. From these results a large change in multifragmentation patterns may be understood as a simple phase space consequence, invoking neither phase transitions, nor equation of state information. These results are used to illustrate physical quantities which are ambiguous to deduce from experimental fragment measurements. 14 refs., 4 figs.

  18. Species-specific responses to landscape fragmentation: implications for management strategies.

    Science.gov (United States)

    Blanchet, Simon; Rey, Olivier; Etienne, Roselyne; Lek, Sovan; Loot, Géraldine

    2010-05-01

    Habitat fragmentation affects the integrity of many species, but little is known about species-specific sensitivity to fragmentation. Here, we compared the genetic structure of four freshwater fish species differing in their body size (Leuciscus cephalus; Leuciscus leuciscus; Gobio gobio and Phoxinus phoxinus) between a fragmented and a continuous landscape. We tested if, overall, fragmentation affected the genetic structure of these fish species, and if these species differed in their sensitivity to fragmentation. Fragmentation negatively affected the genetic structure of these species. Indeed, irrespective of the species identity, allelic richness and heterozygosity were lower, and population divergence was higher in the fragmented than in the continuous landscape. This response to fragmentation was highly species-specific, with the smallest fish species (P. phoxinus) being slightly affected by fragmentation. On the contrary, fish species of intermediate body size (L. leuciscus and G. gobio) were highly affected, whereas the largest fish species (L. cephalus) was intermediately affected by fragmentation. We discuss the relative role of dispersal ability and effective population size on the responses to fragmentation we report here. The weirs studied here are of considerable historical importance. We therefore conclude that restoration programmes will need to consider both this societal context and the biological characteristics of the species sharing this ecosystem.

  19. Fermi breakup and the statistical multifragmentation model

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, B.V., E-mail: brett@ita.br [Departamento de Fisica, Instituto Tecnologico de Aeronautica - CTA, 12228-900 Sao Jose dos Campos (Brazil); Donangelo, R. [Instituto de Fisica, Universidade Federal do Rio de Janeiro, Cidade Universitaria, CP 68528, 21941-972, Rio de Janeiro (Brazil); Instituto de Fisica, Facultad de Ingenieria, Universidad de la Republica, Julio Herrera y Reissig 565, 11.300 Montevideo (Uruguay); Souza, S.R. [Instituto de Fisica, Universidade Federal do Rio de Janeiro, Cidade Universitaria, CP 68528, 21941-972, Rio de Janeiro (Brazil); Instituto de Fisica, Universidade Federal do Rio Grande do Sul, Av. Bento Goncalves 9500, CP 15051, 91501-970, Porto Alegre (Brazil); Lynch, W.G.; Steiner, A.W.; Tsang, M.B. [Joint Institute for Nuclear Astrophysics, National Superconducting Cyclotron Laboratory and the Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States)

    2012-02-15

    We demonstrate the equivalence of a generalized Fermi breakup model, in which densities of excited states are taken into account, to the microcanonical statistical multifragmentation model used to describe the disintegration of highly excited fragments of nuclear reactions. We argue that such a model better fulfills the hypothesis of statistical equilibrium than the Fermi breakup model generally used to describe statistical disintegration of light mass nuclei.

  20. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    2005-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  1. Vapor film collapse triggered by external pressure pulse and the fragmentation of melt droplet in FCIs

    Institute of Scientific and Technical Information of China (English)

    LIN Qian; TONG Lili; CAO Xuewu; KRIVENTSEV Vladimir

    2008-01-01

    The fragmentation of a high-temperature molten drop is a key factor determining the fraction of heat converted into mechanical work in FCIs, which in turn governs the possible degree of damage during a hypothetical severe accident in a nuclear reactor. In this paper, the fragmentation of a melt droplet in FCIs is investigated by theoretical analysis. The fragmentation mechanism is studied for the case of an external pressure pulse applied to a melt droplet surrounded by a vapor film. The vapor film collapse that induces fragmentation of the melt droplet is analyzed and modeled, and the pressure generated by the collapse is calculated. The vapor film collapse model is then introduced into a fragmentation correlation, and the predicted fragment size is compared with experimental data. The results show that the developed model can predict the diameter of the fragments and can be used to approximate the fragmentation process.

  2. Effects of Habitat Structure and Fragmentation on Diversity and Abundance of Primates in Tropical Deciduous Forests in Bolivia.

    Science.gov (United States)

    Pyritz, Lennart W; Büntge, Anna B S; Herzog, Sebastian K; Kessler, Michael

    2010-10-01

    Habitat structure and anthropogenic disturbance are known to affect primate diversity and abundance. However, researchers have focused on lowland rain forests, whereas endangered deciduous forests have been neglected. We aimed to investigate the relationships between primate diversity and abundance and habitat parameters in 10 deciduous forest fragments southeast of Santa Cruz, Bolivia. We obtained primate data via line-transect surveys and visual and acoustic observations. In addition, we assessed the vegetation structure (canopy height, understory density), size, isolation time, and surrounding forest area of the fragments. We interpreted our results in the context of the historical distribution data for primates in the area before fragmentation and interviews with local people. We detected 5 of the 8 historically observed primate species: Alouatta caraya, Aotus azarae boliviensis, Callithrix melanura, Callicebus donacophilus, and Cebus libidinosus juruanus. Total species number and detection rates decreased with understory density. Detection rates also negatively correlated with forest areas in the surroundings of a fragment, which may be due to variables not assessed, i.e., fragment shape, distance to nearest town. Observations for Alouatta and Aotus were too few to conduct further statistics. Cebus and Callicebus were present in 90% and 70% of the sites, respectively, and their density did not correlate with any of the habitat variables assessed, signaling high ecological plasticity and adaptability to anthropogenic impact in these species. Detections of Callithrix were higher in areas with low forest strata. Our study provides baseline data for future fragmentation studies in Neotropical dry deciduous forests and sets a base for specific conservation measures.

  3. Multiscale patterns of movement in fragmented landscapes and consequences on demography of the snail kite in Florida

    Science.gov (United States)

    Martin, J.; Nichols, J.D.; Kitchens, W.M.; Hines, J.E.

    2006-01-01

    1. Habitat loss and fragmentation are major factors affecting vertebrate populations. A major effect of these habitat alterations is that they reduce the movement of organisms. Despite the accepted importance of movement in driving the dynamics of many natural populations, movements of vertebrates in fragmented landscapes have seldom been estimated with robust statistical methods. 2. We estimated movement probabilities of snail kites Rostrhamus sociabilis within the remaining wetlands in Florida. Using both radio-telemetry and banding information, we used a multistate modelling approach to estimate transition probabilities at two temporal scales (month; year) and multiple spatial scales. We examined kite movement among wetlands altered by three different levels of fragmentation: among wetlands separated by small physical barriers (e.g. a road); among wetlands separated by a moderate amount of matrix; and among wetlands separated by a large amount of matrix (> 15 km). 3. Kites moved extensively among contiguous wetlands (movement probability 0.29 per month), but significantly less among isolated wetlands (movement probability 0.10 per month). 4. Kites showed high levels of annual site fidelity to most isolated wetlands (probability ranged from 0.72 to 0.95 per year). 5. We tested the effects of patch size and interpatch distance on movement. Our modelling indicated an effect of both distance and patch size on juvenile (but not adult) movement among fragments. 6. Only a small proportion of kites escaped a regional drought by moving to refugia (wetlands less affected by drought). Many individuals died after the drought. During the drought, adult survival dropped by 16% while juvenile survival dropped by 86% (possibly because juveniles were less likely to reach refugia). 7. We hypothesize that fragmentation may decrease the kites' resistance to drought by restricting exploratory behaviour.
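
    To illustrate what monthly movement probabilities of this size imply, the sketch below projects the two reported point estimates (0.29 per month among contiguous wetlands, 0.10 per month among isolated wetlands) as simple two-state Markov chains over a year. This is only a projection of the published estimates; the absorbing "moved" state and the 12-month horizon are assumptions, and the study itself estimates these probabilities with multistate capture-recapture models.

```python
# Projection of the reported monthly movement probabilities as a
# two-state Markov chain (stayed in the starting wetland vs. moved away,
# with "moved" treated as absorbing for simplicity).  Illustrative only;
# the estimates themselves come from multistate capture-recapture models.
import numpy as np

def transition_matrix(p_move):
    # state 0: still in the starting wetland; state 1: elsewhere
    return np.array([[1.0 - p_move, p_move],
                     [0.0, 1.0]])

for label, p in [("contiguous wetlands", 0.29), ("isolated wetlands", 0.10)]:
    P = np.linalg.matrix_power(transition_matrix(p), 12)   # 12 months
    print(f"{label}: P(left the starting wetland within a year) = {P[0, 1]:.2f}")
```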

  4. Forest Fragmentation and Driving Forces in Yingkou, Northeastern China

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2017-03-01

    Full Text Available Forest fragmentation, the process of changing originally large and intact forest patches into smaller and isolated areas, significantly influences the balance of the surface physical environment, biodiversity, and species richness. Sufficient knowledge of forest fragmentation is necessary to maintain ecological balance and promote sustainable resource utilization. This study combines remote sensing, geographical information systems, and landscape metrics to assess forest fragmentation at landscape and pixel levels during different time periods (2000–2005, 2005–2010, and 2010–2015) in the Yingkou region. Spatial statistical analysis is also used to analyze the relationship between forest landscape fragmentation and its determinants (e.g., natural factors, socioeconomic factors, and proximity factors). Results show that forest patches became smaller, subdivided, and isolated during 2010–2015 at the total landscape level. Local changes occurred in the southwest of the study region or around the development area. Our data also indicate that shrinkage and subdivision were the main forest fragmentation processes across the three periods, and that attrition became the main forest fragmentation process from 2010 to 2015. These changes were significantly influenced by natural factors (e.g., elevation and slope), proximity factors (e.g., distance to the city and distance to provincial roads), and socioeconomic factors (e.g., gross domestic product). The results presented in this study provide valuable insights into the patterns and processes of forest fragmentation and have direct implications for the protection and reasonable utilization of forest resources.
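
    As a hedged illustration of the landscape-metric side of such an analysis, the sketch below computes a few basic fragmentation descriptors (patch count, mean patch size, largest-patch fraction) from a binary forest raster using connected-component labelling. The random raster and the 8-neighbour connectivity rule are assumptions; the study itself works from classified satellite imagery and a fuller metric set.

```python
# Basic patch metrics on a toy binary forest raster (1 = forest).
# The raster and 8-neighbour connectivity are assumptions for illustration.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
forest = (rng.random((200, 200)) < 0.45).astype(int)

labels, n_patches = ndimage.label(forest, structure=np.ones((3, 3), dtype=int))
patch_sizes = np.bincount(labels.ravel())[1:]   # drop the background label 0

print("number of patches       :", n_patches)
print("mean patch size (pixels):", round(patch_sizes.mean(), 1))
print("largest-patch fraction  :", round(patch_sizes.max() / forest.sum(), 3))
```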

  5. Driven fragmentation of granular gases.

    Science.gov (United States)

    Cruz Hidalgo, Raúl; Pagonabarraga, Ignacio

    2008-06-01

    The dynamics of homogeneously heated granular gases which fragment due to particle collisions is analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges at a finite time, leading to a shattering singularity. If the fragmentation probability vanishes, then the number of grains grows monotonously as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the long velocity tail is essentially exponential independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) ≈ exp(-c^n), with n ≈ 1.2, regardless of the fragmentation mechanism.
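
    The generalized exponential tail quoted above can be checked numerically along the following lines. The sketch draws a synthetic scaled-speed sample from a Weibull distribution, whose tail has exactly the form exp(-c^n), and recovers the exponent by maximum likelihood; the sample is synthetic, standing in for the direct simulation Monte Carlo data of the paper.

```python
# Recovering the exponent n of a tail f(c) ~ exp(-c**n) by maximum
# likelihood, using a synthetic Weibull sample (its tail has exactly
# this form).  The sample replaces the DSMC data of the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_true = 1.2
speeds = rng.weibull(n_true, size=50_000)   # density n*c**(n-1)*exp(-c**n)

shape, _, _ = stats.weibull_min.fit(speeds, floc=0)
print(f"fitted tail exponent n = {shape:.2f} (generated with n = {n_true})")
```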

  6. Signatures of statistical decay

    CERN Document Server

    Horn, D; Bowman, D R; Galindo-Uribarri, A; Hagberg, E; Laforest, R; Pouliot, J; Walker, R B; Horn, D; Bowman, D R; Galindo-Uribarri, A; Hagberg, E; Laforest, R; Pouliot, J; Walker, R B

    1995-01-01

    The partition of decay energy between the kinetic energy of reaction products and their Q-value of formation is obtained in a statistical derivation appropriate to highly excited nuclei, and is shown to be in a constant ratio. We measure the kinetic energy fraction, R = \\Sigma E_{kin}/(\\Sigma E_{kin} + \\Sigma Q_0), over a wide range of excitation energy for well-defined systems formed in the Cl + C reaction at 35A MeV. Relationships between excitation energy, charged-particle multiplicity, and intermediate-mass-fragment multiplicity, observed in this work and in recent experiments by a number of other groups, follow from the derivation of the average kinetic energies and Q-values.
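
    The kinetic-energy fraction defined above is straightforward to evaluate once the fragment kinetic energies and Q-values of formation are known; the snippet below shows the arithmetic on made-up numbers (the values are not measured data).

```python
# Worked illustration of R = sum(E_kin) / (sum(E_kin) + sum(Q_0)).
# The per-fragment kinetic energies and Q-values below are made up.
e_kin = [34.0, 21.5, 12.0, 8.5]   # fragment kinetic energies (MeV), assumed
q_0 = [41.0, 27.0, 15.5]          # Q-values of formation (MeV), assumed

r = sum(e_kin) / (sum(e_kin) + sum(q_0))
print(f"kinetic energy fraction R = {r:.2f}")
```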

  7. Statistical concepts a second course

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes

  8. The spectroscopy of fission fragments

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, W.R. [Department of Physics and Astronomy, University of Manchester, Manchester, M13 9PL (United Kingdom); Collaboration: La Direction des Sciences de la Matiere du CEA (FR); Le Fonds National de la Recherche Scientifique de Belgique (BE)

    1998-12-31

    High-resolution measurements on {gamma} rays from fission fragments have provided a rich source of information, unobtainable at the moment in any other way, on the spectroscopy of neutron-rich nuclei. In recent years important data have been obtained on the yrast- and near yrast-structure of neutron-rich fission fragments. We discuss the scope of measurements which can be made on prompt gamma rays from secondary fission fragments, the techniques used in the experiments and some results recently obtained. (author) 24 refs., 8 figs., 1 tab.

  9. Antibody Fragments as Probe in Biosensor Development

    Directory of Open Access Journals (Sweden)

    Serge Muyldermans

    2008-08-01

    Full Text Available Today’s proteomic analyses are generating increasing numbers of biomarkers, making it essential to possess highly specific probes able to recognize those targets. Antibodies are considered to be the first choice as molecular recognition units due to their target specificity and affinity, which make them excellent probes in biosensor development. However several problems such as difficult directional immobilization, unstable behavior, loss of specificity and steric hindrance, may arise from using these large molecules. Luckily, protein engineering techniques offer designed antibody formats suitable for biomarker analysis. Minimization strategies of antibodies into Fab fragments, scFv or even single-domain antibody fragments like VH, VL or VHHs are reviewed. Not only the size of the probe but also other issues like choice of immobilization tag, type of solid support and probe stability are of critical importance in assay development for biosensing. In this respect, multiple approaches to specifically orient and couple antibody fragments in a generic one-step procedure directly on a biosensor substrate are discussed.

  10. Space-time ambiguity of two- and three-fragment reduced velocity correlation functions

    Energy Technology Data Exchange (ETDEWEB)

    Glasmacher, T.; Phair, L.; Bowman, D.R.; Gelbke, C.K.; Gong, W.G.; Kim, Y.D.; Lisa, M.A.; Lynch, W.G.; Peaslee, G.F.; de Souza, R.T.; Tsang, M.B.; Zhu, F. [National Superconducting Cyclotron Laboratory and Department of Physics Astronomy, Michigan State University, East Lansing, Michigan 48824 (United States)

    1995-06-01

    Reduced-velocity correlation functions between two and three intermediate mass fragments are compared for central {sup 36}Ar+{sup 197}Au collisions at {ital E}/{ital A}=50 MeV. Previously published {ital N}-body Coulomb-trajectory calculations, capable of reproducing the measured two-fragment reduced velocity-correlation function, describe the measured three-fragment correlation function equally well. Moreover, ambiguities between source size and lifetime observed in the analysis of two-fragment correlations remain unresolved in the three-fragment correlation function.

  11. Depth statistics

    OpenAIRE

    2012-01-01

    In 1975 John Tukey proposed a multivariate median which is the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data that are separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...

  12. Statistical mechanics

    CERN Document Server

    Sheffield, Scott

    2009-01-01

    In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

  13. HIERARCHICAL FRAGMENTATION OF THE ORION MOLECULAR FILAMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Satoko; Ho, Paul T. P.; Su, Yu-Nung [Academia Sinica Institute of Astronomy and Astrophysics, P.O. Box 23-141, Taipei 10617, Taiwan (China); Teixeira, Paula S. [Institut fuer Astrophysik, Universitaet Wien, Tuerkenschanzstrasse 17, A-1180, Wien (Austria); Zapata, Luis A., E-mail: satoko_t@asiaa.sinica.edu.tw [Centro de Radioastronomia y Astrofisica, Universidad Nacional Autonoma de Mexico, Morelia, Michoacan 58090 (Mexico)

    2013-01-20

    We present a high angular resolution map of the 850 {mu}m continuum emission of the Orion Molecular Cloud-3 (OMC 3) obtained with the Submillimeter Array (SMA); the map is a mosaic of 85 pointings covering an approximate area of 6.'5 × 2.'0 (0.88 × 0.27 pc). We detect 12 spatially resolved continuum sources, each with an H{sub 2} mass between 0.3-5.7 M {sub Sun} and a projected source size between 1400-8200 AU. All the detected sources are on the filamentary main ridge (n{sub H{sub 2}}{>=}10{sup 6} cm{sup -3}), and analysis based on the Jeans theorem suggests that they are most likely gravitationally unstable. Comparison of multi-wavelength data sets indicates that of the continuum sources, 6/12 (50%) are associated with molecular outflows, 8/12 (67%) are associated with infrared sources, and 3/12 (25%) are associated with ionized jets. The evolutionary status of these sources ranges from prestellar cores to protostar phase, confirming that OMC-3 is an active region with ongoing embedded star formation. We detect quasi-periodical separations between the OMC-3 sources of ≈17''/0.035 pc. This spatial distribution is part of a large hierarchical structure that also includes fragmentation scales of giant molecular cloud (≈35 pc), large-scale clumps (≈1.3 pc), and small-scale clumps (≈0.3 pc), suggesting that hierarchical fragmentation operates within the Orion A molecular cloud. The fragmentation spacings are roughly consistent with the thermal fragmentation length in large-scale clumps, while for small-scale cores it is smaller than the local fragmentation length. These smaller spacings observed with the SMA can be explained by either a helical magnetic field, cloud rotation, or/and global filament collapse. Finally, possible evidence for sequential fragmentation is suggested in the northern part of the OMC-3 filament.
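
    A back-of-the-envelope Jeans estimate shows how a thermal fragmentation length of the quoted order arises. The temperature (20 K), the mean molecular weight, and the main-ridge density (n_H2 = 10^6 cm^-3) used below are typical assumed values, not numbers taken from the paper.

```python
# Jeans-length estimate under assumed core conditions (T = 20 K,
# n_H2 = 1e6 cm^-3); these are typical values, not taken from the paper.
import math

k_B, m_H, G = 1.380649e-23, 1.6726e-27, 6.674e-11   # SI units
pc, M_sun = 3.086e16, 1.989e30

T = 20.0                       # gas temperature (K), assumed
n_H2 = 1.0e6 * 1.0e6           # H2 number density (m^-3), assumed
rho = 2.8 * m_H * n_H2         # mass density incl. helium (assumed factor)
c_s = math.sqrt(k_B * T / (2.33 * m_H))   # isothermal sound speed

lambda_J = c_s * math.sqrt(math.pi / (G * rho))
M_J = rho * (4.0 / 3.0) * math.pi * (lambda_J / 2.0) ** 3

print(f"Jeans length ~ {lambda_J / pc:.3f} pc")   # of order the ~0.035 pc spacing
print(f"Jeans mass   ~ {M_J / M_sun:.2f} Msun")
```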

  14. DebriSat Fragment Characterization System and Processing Status

    Science.gov (United States)

    Rivero, M.; Shiotani, B.; M. Carrasquilla; Fitz-Coy, N.; Liou, J. C.; Sorge, M.; Huynh, T.; Opiela, J.; Krisko, P.; Cowardin, H.

    2016-01-01

    The DebriSat project is a continuing effort sponsored by NASA and DoD to update existing break-up models using data obtained from hypervelocity impact tests performed to simulate on-orbit collisions. After the impact tests, a team at the University of Florida has been working to characterize the fragments in terms of their mass, size, shape, color and material content. The focus of the post-impact effort has been the collection of 2 mm and larger fragments resulting from the hypervelocity impact test. To date, in excess of 125K fragments have been recovered which is approximately 40K more than the 85K fragments predicted by the existing models. While the fragment collection activities continue, there has been a transition to the characterization of the recovered fragments. Since the start of the characterization effort, the focus has been on the use of automation to (i) expedite the fragment characterization process and (ii) minimize the effects of human subjectivity on the results; e.g., automated data entry processes were developed and implemented to minimize errors during transcription of the measurement data. At all steps of the process, however, there is human oversight to ensure the integrity of the data. Additionally, repeatability and reproducibility tests have been developed and implemented to ensure that the instrumentations used in the characterization process are accurate and properly calibrated.

  15. Prolonged incubation of processed human spermatozoa will increase DNA fragmentation.

    Science.gov (United States)

    Nabi, A; Khalili, M A; Halvaei, I; Roodbari, F

    2014-05-01

    One of the causes of failure in ART is sperm DNA fragmentation which may be associated with long period of spermatozoa incubation at 37 °C. The objective was to evaluate the rate of sperm DNA fragmentation using the sperm chromatin dispersion (SCD) test after swim-up at different time intervals prior to use. In this prospective study, 21 normozoospermic specimens were analysed. The samples were incubated at 37 °C after preparation by direct swim-up. DNA fragmentation was assessed at different time intervals (0, 1, 2 and 3 h) using SCD test. Spermatozoa with no DNA fragmentation showed large- or medium-sized halos, and sperm cells with DNA fragmentation showed either a small halo or no halo. The rates of normal morphology and progressive motility after sperm processing were 72.33 ± 2.53% and 90 ± 1.02%, respectively. The rate of sperm DNA fragmentation was significantly higher after 2 h (8.81 ± 0.93%, P = 0.004) and 3 h (10.76 ± 0.89%, P fragmentation. Therefore, sperm samples intended for ART procedures should be used within 2 h of incubation at 37 °C. © 2013 Blackwell Verlag GmbH.

  16. Global patterns of fragmentation and connectivity of mammalian carnivore habitat.

    Science.gov (United States)

    Crooks, Kevin R; Burdett, Christopher L; Theobald, David M; Rondinini, Carlo; Boitani, Luigi

    2011-09-27

    Although mammalian carnivores are vulnerable to habitat fragmentation and require landscape connectivity, their global patterns of fragmentation and connectivity have not been examined. We use recently developed high-resolution habitat suitability models to conduct comparative analyses and to identify global hotspots of fragmentation and connectivity for the world's terrestrial carnivores. Species with less fragmentation (i.e. more interior high-quality habitat) had larger geographical ranges, a greater proportion of habitat within their range, greater habitat connectivity and a lower risk of extinction. Species with higher connectivity (i.e. less habitat isolation) also had a greater proportion of high-quality habitat, but had smaller, not larger, ranges, probably reflecting shorter distances between habitat patches for species with restricted distributions; such species were also more threatened, as would be expected given the negative relationship between range size and extinction risk. Fragmentation and connectivity did not differ among Carnivora families, and body mass was associated with connectivity but not fragmentation. On average, only 54.3 per cent of a species' geographical range comprised high-quality habitat, and more troubling, only 5.2 per cent of the range comprised such habitat within protected areas. Identification of global hotspots of fragmentation and connectivity will help guide strategic priorities for carnivore conservation.

  17. Mechanics of fragmentation of crocodile skin and other thin films.

    Science.gov (United States)

    Qin, Zhao; Pugno, Nicola M; Buehler, Markus J

    2014-05-27

    Fragmentation of thin layers of materials is mediated by a network of cracks on its surface. It is commonly seen in dehydrated paintings or asphalt pavements and even in graphene or other two-dimensional materials, but is also observed in the characteristic polygonal pattern on a crocodile's head. Here, we build a simple mechanical model of a thin film and investigate the generation and development of fragmentation patterns as the material is exposed to various modes of deformation. We find that the characteristic size of fragmentation, defined by the mean diameter of polygons, is strictly governed by mechanical properties of the film material. Our result demonstrates that skin fragmentation on the head of crocodiles is dominated by the fact that the skin features a small ratio between the fracture energy and Young's modulus, and the patterns agree well with experimental observations. Understanding this mechanics-driven process could be applied to improve the lifetime and reliability of thin film coatings by mimicking crocodile skin.

  18. Mechanics of fragmentation of crocodile skin and other thin films

    Science.gov (United States)

    Qin, Zhao; Pugno, Nicola M.; Buehler, Markus J.

    2014-05-01

    Fragmentation of thin layers of materials is mediated by a network of cracks on its surface. It is commonly seen in dehydrated paintings or asphalt pavements and even in graphene or other two-dimensional materials, but is also observed in the characteristic polygonal pattern on a crocodile's head. Here, we build a simple mechanical model of a thin film and investigate the generation and development of fragmentation patterns as the material is exposed to various modes of deformation. We find that the characteristic size of fragmentation, defined by the mean diameter of polygons, is strictly governed by mechanical properties of the film material. Our result demonstrates that skin fragmentation on the head of crocodiles is dominated by the fact that the skin features a small ratio between the fracture energy and Young's modulus, and the patterns agree well with experimental observations. Understanding this mechanics-driven process could be applied to improve the lifetime and reliability of thin film coatings by mimicking crocodile skin.

  19. Subduction controls the distribution and fragmentation of Earth’s tectonic plates.

    Science.gov (United States)

    Mallard, Claire; Coltice, Nicolas; Seton, Maria; Müller, R Dietmar; Tackley, Paul J

    2016-07-07

    The theory of plate tectonics describes how the surface of Earth is split into an organized jigsaw of seven large plates of similar sizes and a population of smaller plates whose areas follow a fractal distribution. The reconstruction of global tectonics during the past 200 million years suggests that this layout is probably a long-term feature of Earth, but the forces governing it are unknown. Previous studies, primarily based on the statistical properties of plate distributions, were unable to resolve how the size of the plates is determined by the properties of the lithosphere and the underlying mantle convection. Here we demonstrate that the plate layout of Earth is produced by a dynamic feedback between mantle convection and the strength of the lithosphere. Using three-dimensional spherical models of mantle convection that self-consistently produce the plate size–frequency distribution observed for Earth, we show that subduction geometry drives the tectonic fragmentation that generates plates. The spacing between the slabs controls the layout of large plates, and the stresses caused by the bending of trenches break plates into smaller fragments. Our results explain why the fast evolution in small back-arc plates reflects the marked changes in plate motions during times of major reorganizations. Our study opens the way to using convection simulations with plate-like behaviour to unravel how global tectonics and mantle convection are dynamically connected.

  20. Effect of disorder on shrinkage-induced fragmentation of a thin brittle layer

    Science.gov (United States)

    Halász, Zoltán; Nakahara, Akio; Kitsunezaki, So; Kun, Ferenc

    2017-09-01

    We investigate the effect of the amount of disorder on the shrinkage-induced cracking of a thin brittle layer attached to a substrate. Based on a discrete element model, we study how the dynamics of cracking and the size of fragments evolve when the amount of disorder is varied. In the model a thin layer is discretized on a random lattice of Voronoi polygons attached to a substrate. Two sources of disorder are considered: structural disorder captured by the local variation of the stiffness and strength disorder represented by the random strength of cohesive elements between polygons. Increasing the amount of strength disorder, our calculations reveal a transition from a cellular crack pattern, generated by the sequential branching and merging of cracks, to a disordered ensemble of cracks where the merging of randomly nucleated microcracks dominates. In the limit of low disorder, the statistics of fragment size is described by a log-normal distribution; however, in the limit of high disorder, a power-law distribution is obtained.
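
    The two limiting size distributions mentioned above can be discriminated in practice by likelihood comparison. The sketch below fits a log-normal and a power-law (Pareto) model to a fragment-size sample and compares their log-likelihoods; the sample is synthetic and stands in for measured fragment sizes.

```python
# Log-normal versus power-law (Pareto) fit to a fragment-size sample,
# compared by log-likelihood.  The sample is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sizes = rng.lognormal(mean=0.0, sigma=0.8, size=5_000)   # toy low-disorder data

ln_shape, _, ln_scale = stats.lognorm.fit(sizes, floc=0)
ll_lognormal = stats.lognorm.logpdf(sizes, ln_shape, 0, ln_scale).sum()

xmin = sizes.min()                                        # power law above xmin
pl_shape, _, _ = stats.pareto.fit(sizes, floc=0, fscale=xmin)
ll_powerlaw = stats.pareto.logpdf(sizes, pl_shape, 0, xmin).sum()

print(f"log-likelihood, log-normal: {ll_lognormal:.1f}")
print(f"log-likelihood, power law : {ll_powerlaw:.1f}")
```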

  1. Fragmentation and Coverage Variation in Viral Metagenome Assemblies, and Their Effect in Diversity Calculations.

    Science.gov (United States)

    García-López, Rodrigo; Vázquez-Castellanos, Jorge Francisco; Moya, Andrés

    2015-01-01

    , calculations using contigs as different OTUs ultimately overestimate diversity when compared to diversity calculated from species coverage. In order to compare the effect of coverage and fragmentation, we generated three sets of simulated Illumina paired-end reads with different sequencing depths. We compared different assemblies performed with RayMeta, CLC Assembly Cell, MEGAHIT, SPAdes, Meta-IDBA, SOAPdenovo, Velvet, Metavelvet, and MIRA with the best attainable assemblies for each dataset (formed by arranging data using known genome coordinates) by calculating different assembly statistics. A new fragmentation score was included to estimate the degree of genome fragmentation of each taxon and adjust the coverage accordingly. The abundance in the metagenome was compared by bootstrapping the assembly data and hierarchically clustering them with the best possible assembly. Additionally, richness and diversity indexes were calculated for all the resulting assemblies and were assessed under two distributions: contigs as independent OTUs and sequences classified by species. Finally, we search for the strongest correlations between the diversity indexes and the different assembly statistics. Although fragmentation was dependent of genome coverage, it was not as heavily influenced by the assembler. The sequencing depth was the predominant attractor that influenced the success of the assemblies. The coverage increased notoriously in larger datasets, whereas fragmentation values remained lower and unsaturated. While still far from obtaining the ideal assemblies, the RayMeta, SPAdes, and the CLC assemblers managed to build the most accurate contigs with larger datasets while Meta-IDBA showed a good performance with the medium-sized dataset, even after the adjusted coverage was calculated. Their resulting assemblies showed the highest coverage scores and the lowest fragmentation values. Alpha diversity calculated from contigs as OTUs resulted in significantly higher values for all
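
    The overestimation of diversity when contigs are treated as OTUs can be seen in a toy calculation: richness and Shannon diversity computed per contig versus after pooling contigs by the species they derive from. The read counts and species assignments below are invented for the example.

```python
# Richness and Shannon diversity with contigs as OTUs versus contigs
# pooled by species.  Read counts and the species map are invented.
import math
from collections import defaultdict

contig_reads = {"c1": 500, "c2": 300, "c3": 200,   # three contigs of species A
                "c4": 600, "c5": 400}               # two contigs of species B
species_of = {"c1": "A", "c2": "A", "c3": "A", "c4": "B", "c5": "B"}

def shannon(counts):
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c > 0)

species_reads = defaultdict(int)
for contig, reads in contig_reads.items():
    species_reads[species_of[contig]] += reads

print("contigs as OTUs :", len(contig_reads), "OTUs,",
      f"Shannon H = {shannon(contig_reads.values()):.3f}")
print("pooled by taxon :", len(species_reads), "taxa,",
      f"Shannon H = {shannon(species_reads.values()):.3f}")
```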

  2. Characterization of hypervelocity metal fragments for explosive initiation

    Science.gov (United States)

    Yeager, John D.; Bowden, Patrick R.; Guildenbecher, Daniel R.; Olles, Joseph D.

    2017-07-01

    The fragment impact response of two plastic-bonded explosive (PBX) formulations was studied using explosively driven aluminum fragments. A generic aluminum-capped detonator generated sub-mm aluminum particles moving at hypersonic velocities. The ability of these fragments to initiate reaction or otherwise damage two PBX materials was assessed using go/no-go experiments at standoff distances of up to 160 mm. Lower density PBX 9407 (RDX-based) was initiable at up to 115 mm, while higher density PBX 9501 (HMX-based) was only initiable at up to 6 mm. Several techniques were used to characterize the size, distribution, and velocity of the particles. Witness plate materials, including copper and polycarbonate, and backlit high speed video were used to characterize the distribution of particles, finding that the aluminum cap did not fragment homogeneously but rather with larger particles in a ring surrounding finer particles. Finally, precise digital holography experiments were conducted to measure the three-dimensional shape and size of the fastest-moving fragments, which ranged between 100 and 700 μm and traveled between 2.2 and 3.2 km/s. Crucially, these experiments showed variability in the fragmentation in terms of the number of fragments at the leading edge of the fragment field, indicating that both single and multiple shock impacts could be imparted to the target material. These types of data are critical for safety experiments and hydrocode simulations to quantify shock-to-detonation transition mechanisms and the associated risk-margins for these materials.

  3. Foraminifer Shell Weight and Fragmentation: A Quantitative Study of the Influence of Temperature, [CO32-] and Dissolution on Proxies of the Marine Carbonate System

    Science.gov (United States)

    Mekik, F.; Pourmand, A.; Ward, B. M.

    2015-12-01

    Quantifying the various components of the marine carbonate system is important for understanding anthropogenic ocean acidification, and the rates and magnitudes of ocean acidification/alkalization events in Earth's past. We performed multiple statistical analyses (factor analysis, partial correlations, multiple regression analysis and independent-samples t-tests) on core top data using the Globorotalia menardii fragmentation index (MFI) in 89 core tops from across the tropical Pacific, Atlantic and Indian Oceans, the fragmentation trend of four species of foraminifers (Globorotalia truncatulinoides, G. menardii, Neogloboquadrina dutertrei and Pulleniatina obliquiloculata) in the EEP, tropical Atlantic and tropical Indian Ocean core tops, and Globorotalia menardii shell weight in a suite of 25 core tops from the EEP in order to isolate the effects of surface ocean parameters such as temperature and [CO32-] from dissolution in sediments. Surface ocean parameters showed no significant effect on the G. menardii fragmentation index. We found no statistically significant influence of habitat water temperature or [CO32-] on foraminifer fragmentation in any of the four species. While we found a strong influence of habitat water [CO32-] on the size normalized shell weight proxy in N. dutertrei and Pulleniatina obliquiloculata in our previous work, we found a much reduced influence of [CO32-] on the shell weight of G. menardii, which is most influenced by shell dissolution.
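
    One of the statistical checks named above, a multiple regression of the fragmentation index on surface-ocean parameters, can be sketched as follows. The data are synthetic (a response dominated by noise, mimicking the reported absence of a surface-ocean effect); the variable names and sample size are assumptions.

```python
# Multiple regression of a (synthetic) fragmentation index on habitat
# temperature and carbonate-ion concentration.  All values are invented;
# the point is only to show the form of the check.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 89
temperature = rng.normal(25.0, 2.0, n)      # degrees C, assumed
carbonate = rng.normal(220.0, 30.0, n)      # [CO3^2-] in umol/kg, assumed
mfi = 0.5 + rng.normal(0.0, 0.1, n)         # response dominated by noise

X = sm.add_constant(np.column_stack([temperature, carbonate]))
fit = sm.OLS(mfi, X).fit()
print(fit.params)    # intercept and the two slope estimates
print(fit.pvalues)   # large p-values: no detectable surface-ocean effect
```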

  4. Statistical Neurodynamics.

    Science.gov (United States)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  5. An improved model for fragment-based lead generation at AstraZeneca.

    Science.gov (United States)

    Fuller, Nathan; Spadola, Loredana; Cowen, Scott; Patel, Joe; Schönherr, Heike; Cao, Qing; McKenzie, Andrew; Edfeldt, Fredrik; Rabow, Al; Goodnow, Robert

    2016-08-01

    Modest success rates in fragment-based lead generation (FBLG) projects at AstraZeneca (AZ) prompted operational changes to improve performance. In this review, we summarize these changes, emphasizing the construction and composition of the AZ fragment library, screening practices and working model. We describe the profiles of the screening method for specific fragment subsets and statistically assess our ability to follow up on fragment hits through near-neighbor selection. Performance analysis of our second-generation fragment library (FL2) in screening campaigns illustrates the complementary nature of flat and 3D fragments in exploring protein-binding pockets and highlights our ability to deliver fragment hits using multiple screening techniques for various target classes. The new model has had profound impact on the successful delivery of lead series to drug discovery projects.

  6. Life history strategy influences parasite responses to habitat fragmentation.

    Science.gov (United States)

    Froeschke, Götz; van der Mescht, Luther; McGeoch, Melodie; Matthee, Sonja

    2013-12-01

    Anthropogenic habitat use is a major threat to biodiversity and is known to increase the abundance of generalist host species such as rodents, which are regarded as potential disease carriers. Parasites have an intimate relationship with their host and the surrounding environment and it is expected that habitat fragmentation will affect parasite infestation levels. We investigated the effect of habitat fragmentation on the ecto- and endoparasitic burdens of a broad niche small mammal, Rhabdomys pumilio, in the Western Cape Province, South Africa. Our aim was to look at the effects of fragmentation on different parasite species with diverse life history characteristics and to determine whether general patterns can be found. Sampling took place within pristine lowland (Fynbos/Renosterveld) areas and at fragmented sites surrounded and isolated by agricultural activities. All arthropod ectoparasites and available gastrointestinal endoparasites were identified. We used conditional autoregressive models to investigate the effects of habitat fragmentation on parasite species richness and abundance of all recovered parasites. Host density and body size were larger in the fragments. Combined ecto- as well as combined endoparasite taxa showed higher parasite species richness in fragmented sites. Parasite abundance was generally higher in the case of R. pumilio individuals in fragmented habitats but it appears that parasites that are more permanently associated with the host's body and those that are host-specific show the opposite trend. Parasite life history is an important factor that needs to be considered when predicting the effects of habitat fragmentation on parasite and pathogen transmission. Copyright © 2013 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.

  7. Kinetics of a Migration-Driven Aggregation-Fragmentation Process

    Institute of Scientific and Technical Information of China (English)

    ZHUANG You-Yi; LIN Zhen-Quan; KE Jian-Hong

    2003-01-01

    We propose a reversible model of the migration-driven aggregation-fragmentation process with the symmetric migration rate kernels K(k; j) = K′(k; j) = λkj^v and the constant aggregation rates I1, I2 and fragmentation rates J1, J2. Based on the mean-field theory, we investigate the evolution behavior of the aggregate size distributions in several cases with different values of index v. We find that the fragmentation reaction plays a more important role in the kinetic behaviors of the system than the aggregation and migration. When J1 = 0 and J2 = 0, the aggregate size distributions a_k(t) and b_k(t) obey the conventional scaling law, while when J1 > 0 and J2 > 0, they obey the modified scaling law with an exponential scaling function. The total mass of either species remains conserved.

  8. Rotationally induced fragmentation in the prestellar core L1544

    Energy Technology Data Exchange (ETDEWEB)

    Klapp, Jaime; Zavala, Miguel [Departamento de Física, Instituto Nacional de Investigaciones Nucleares (ININ), Km. 36.5, Carretera México-Toluca, La Marquesa 52750, Estado de México (Mexico); Sigalotti, Leonardo Di G.; Peña-Polo, Franklin; Troconis, Jorge [Centro de Física, Instituto Venezolano de Investigaciones Científicas (IVIC), Apartado Postal 20632, Caracas 1020A (Venezuela, Bolivarian Republic of)

    2014-01-10

    Recent observations indicate that there is no correlation between the level of turbulence and fragmentation in detected protostellar cores, suggesting that turbulence works mainly before gravitationally bound prestellar cores form and that their inner parts are likely to be velocity coherent. Based on this evidence, we simulate the collapse and fragmentation of an isolated, initially centrally condensed, uniformly rotating core of total mass M = 5.4 M {sub ☉}, using the smoothed particle hydrodynamics code GADGET-2 modified with the inclusion of sink particles, in order to compare the statistical properties of the resulting stellar ensembles with previous gravoturbulent fragmentation models. The initial conditions are intended to fit the observed properties of the prestellar core L1544. We find that for ratios of the rotational to the gravitational energy β ≥ 0.05, a massive disk is formed at the core center from which a central primary condenses after ∼50 kyr. Soon thereafter the disk fragments into secondary protostars, consistent with an intermediate mode of star formation in which groups of 10-100 stars form from a single core. The models predict peak accretion rates between ∼10{sup –5} and 10{sup –4} M {sub ☉} yr{sup –1} for all stars and reproduce many of the statistical properties predicted from gravoturbulent fragmentation, suggesting that on the small scales of low-mass, dense cores these are independent of whether the contracting gas is turbulent or purely rotating.
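
    A quick check of the control parameter quoted above: for a uniformly rotating, uniform-density sphere the rotational-to-gravitational energy ratio is β = Ω²R³/(3GM). The radius and angular velocity below are order-of-magnitude assumptions, not the paper's initial conditions; only M = 5.4 M_sun is taken from the abstract.

    ```python
    # beta = E_rot / |E_grav| for a uniform-density, uniformly rotating sphere.
    # R and Omega are assumed placeholder values; M is the core mass quoted above.
    G = 6.674e-11                 # m^3 kg^-1 s^-2
    M_sun = 1.989e30              # kg
    pc = 3.086e16                 # m

    M = 5.4 * M_sun
    R = 0.05 * pc                 # assumed core radius
    Omega = 2.0e-13               # assumed angular velocity, rad/s

    E_rot = 0.2 * M * R**2 * Omega**2        # (1/2) I Omega^2 with I = (2/5) M R^2
    E_grav = 0.6 * G * M**2 / R              # (3/5) G M^2 / R
    beta = E_rot / E_grav
    print(f"beta = {beta:.3f}  (disk formation and fragmentation reported for beta >= 0.05)")
    ```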

  9. Hard photons in heavy-ion collisions: Direct or statistical\\?

    Science.gov (United States)

    Herrmann, N.; Bock, R.; Emling, H.; Freifelder, R.; Gobbi, A.; Grosse, E.; Hildenbrand, K. D.; Kulessa, R.; Matulewicz, T.; Rami, F.; Simon, R. S.; Stelzer, H.; Wessels, J.; Maurenzig, P. R.; Olmi, A.; Stefanini, A. A.; Kühn, W.; Metag, V.; Novotny, R.; Gnirs, M.; Pelte, D.; Braun-Munzinger, P.; Moretto, L. G.

    1988-04-01

    Photons with energies from 2 to 60 MeV have been measured in coincidence with binary fragments in the reaction 92Mo+92Mo at an incident energy of 19.5A MeV. The rapid change of the γ-ray spectrum and multiplicity with the fragment total kinetic energy in the exit channel indicates that the γ rays are emitted statistically by the highly excited fragments. Temperatures as high as 6 MeV are inferred.

  10. BIOFRAG - a new database for analyzing BIOdiversity responses to forest FRAGmentation

    Science.gov (United States)

    M. Pfeifer; Tamara Heartsill Scalley

    2014-01-01

    Habitat fragmentation studies have produced complex results that are challenging to synthesize. Inconsistencies among studies may result from variation in the choice of landscape metrics and response variables, which is often compounded by a lack of key statistical or methodological information. Collating primary datasets on biodiversity responses to fragmentation in a...

  11. Shape Distribution of Fragments from Microsatellite Impact Tests

    Science.gov (United States)

    Liou, J.C.; Hanada, T.

    2009-01-01

    Fragment shape is an important factor for conducting reliable orbital debris damage assessments for critical space assets, such as the International Space Station. To date, seven microsatellite impact tests have been completed as part of an ongoing collaboration between Kyushu University and the NASA Orbital Debris Program Office. The target satellites ranged in size from 15 cm × 15 cm × 15 cm to 20 cm × 20 cm × 20 cm. Each target satellite was equipped with fully functional electronics, including circuits, battery, and transmitter. Solar panels and multi-layer insulation (MLI) were added to the target satellites of the last two tests. The impact tests were carried out with projectiles of different sizes and impact speeds. All fragments down to about 2 mm in size were collected and analyzed based on their three orthogonal dimensions, x, y, and z, where x is the longest dimension, y is the longest dimension in the plane perpendicular to x, and z is the longest dimension perpendicular to both x and y. Each fragment was also photographed and classified by shape and material composition. This data set serves as the basis of our effort to develop a fragment shape distribution. Two distinct groups can be observed in the x/y versus y/z distribution of the fragments. Objects in the first group typically have large x/y values. Many of them are needle-like objects originating from the fragmentation of carbon fiber reinforced plastic materials used to construct the satellites. Objects in the second group tend to have small x/y values, and many of them are box-like or plate-like objects, depending on their y/z values. Each group forms the corresponding peak in the x/y distribution. However, only one peak can be observed in the y/z distribution. These distributions and how they vary with size, material type, and impact parameters will be described in detail within the paper.
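
    A sketch of the aspect-ratio classification described above, using the three orthogonal dimensions x ≥ y ≥ z. The numeric thresholds and example fragments are assumptions for illustration, not the values used in the Kyushu/NASA analysis.

    ```python
    # Classify a fragment from its orthogonal dimensions (x >= y >= z, in mm).
    # The threshold of 5 on the aspect ratios is an illustrative assumption.
    def classify(x, y, z):
        assert x >= y >= z > 0
        if x / y > 5.0:          # very elongated: needle-like (e.g. CFRP strands)
            return "needle"
        if y / z > 5.0:          # thin in only one direction: plate-like
            return "plate"
        return "box"             # comparable in all three dimensions

    fragments_mm = [(40.0, 3.0, 2.5), (12.0, 10.0, 1.2), (6.0, 5.0, 4.0)]
    for dims in fragments_mm:
        print(dims, "->", classify(*dims))
    ```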

  12. Drifting to oblivion? Rapid genetic differentiation in an endangered lizard following habitat fragmentation and drought

    Science.gov (United States)

    Vandergast, Amy; Wood, Dustin A.; Thompson, Andrew R.; Fisher, Mark; Barrows, Cameron W.; Grant, Tyler J.

    2016-01-01

    Aim The frequency and severity of habitat alterations and disturbance are predicted to increase in upcoming decades, and understanding how disturbance affects population integrity is paramount for adaptive management. Although rarely is population genetic sampling conducted at multiple time points, pre- and post-disturbance comparisons may provide one of the clearest methods to measure these impacts. We examined how genetic properties of the federally threatened Coachella Valley fringe-toed lizard (Uma inornata) responded to severe drought and habitat fragmentation across its range. Location Coachella Valley, California, USA. Methods We used 11 microsatellites to examine population genetic structure and diversity in 1996 and 2008, before and after a historic drought. We used Bayesian assignment methods and F-statistics to estimate genetic structure. We compared allelic richness across years to measure loss of genetic diversity and employed approximate Bayesian computing methods and heterozygote excess tests to explore the recent demographic history of populations. Finally, we compared effective population size across years and to abundance estimates to determine whether diversity remained low despite post-drought recovery. Results Genetic structure increased between sampling periods, likely as a result of population declines during the historic drought of the late 1990s–early 2000s, and habitat loss and fragmentation that precluded post-drought genetic rescue. Simulations supported recent demographic declines in 3 of 4 main preserves, and in one preserve, we detected significant loss of allelic richness. Effective population sizes were generally low across the range, with estimates ≤100 in most sites. Main conclusions Fragmentation and drought appear to have acted synergistically to induce genetic change over a short time frame. Progressive deterioration of connectivity, low Ne and measurable loss of genetic diversity suggest that conservation efforts have

  13. Cryopreservation increases DNA fragmentation in spermatozoa of smokers.

    Science.gov (United States)

    Aydin, Mehmet Serif; Senturk, Gozde Erkanli; Ercan, Feriha

    2013-05-01

    Smoking causes subfertility due to deterioration of spermatozoa including decreased concentration and abnormal morphology. Although evidence on the deleterious effects of smoking on spermatozoa parameters is well known, its interference with cryopreservation is not clear. This study aimed to investigate the effects of cryopreservation on sperm parameters and DNA fragmentation in non-smokers and smokers. Semen samples were obtained from 40 normospermic male volunteers of whom 20 were non-smokers and 20 smokers. Samples were analyzed in terms of motility, concentration, morphology, and DNA fragmentation before freezing and 1 and 3 months after freezing and thawing. Ultrastructural alterations were investigated by transmission electron microscopy. Sperm morphology seemed to be more affected after cryopreservation in samples obtained from smokers. Ultrastructural examination showed alterations in the integrity of the membranes and increased subacrosomal swelling. Before freezing, the increase in DNA fragmentation rate in smokers was not statistically significant compared to that of non-smokers. However, after thawing, the DNA fragmentation rates were significantly high in both non-smokers and smokers compared to their respective rates before freezing. The extent of the increase in DNA fragmentation rate was significantly higher in smokers after thawing compared to that of non-smokers. In conclusion, cryopreservation causes alterations in membrane integrity and increases DNA fragmentation, thus triggering relatively negative effects on the sperm samples of smokers compared to that of non-smokers. Copyright © 2012 Elsevier GmbH. All rights reserved.

  14. Size dependent pore size distribution of shales by gas physisorption

    Science.gov (United States)

    Roshan, Hamid; Andersen, Martin S.; Yu, Lu; Masoumi, Hossein; Arandian, Hamid

    2017-04-01

    Gas physisorption, in particular nitrogen adsorption-desorption, is a traditional technique for characterization of geomaterials including organic-rich shales. Low-pressure nitrogen is used together with adsorption-desorption physical models to study the pore size distribution (PSD) and porosity of the porous samples. The samples are usually crushed to a certain fragment size to measure these properties; however, no consistent standard crushing size has yet been proposed. Crushing significantly increases the surface area of the fragments; this created surface area is differentiated from that of the pores using the BET technique. In this study, we show that smaller fragment sizes lead to higher cumulative pore volume and smaller pore diameters. It is also shown that some of the micro-pores are left unaccounted for because of the correction for the external surface area. To illustrate this, nitrogen physisorption is first conducted on identical organic-rich shale samples crushed to different sizes: 20-25, 45-50 and 63-71 µm. We then show that such effects are not only a function of pore-structure changes induced by crushing, but are also linked to the inability of the physical models to differentiate between the external surface area (BET) and micro-pores for different crushing sizes at relatively low nitrogen pressure. We also discuss models currently used in nanotechnology, such as the t-method, to address this issue and their advantages and shortcomings for shale rock characterization.
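
    For readers unfamiliar with the BET step mentioned above, the sketch below applies the standard linearized BET equation to a synthetic nitrogen isotherm to obtain a monolayer capacity and specific surface area. The isotherm points are invented; real data would come from the physisorption instrument.

    ```python
    import numpy as np

    # Synthetic isotherm: relative pressure p/p0 and adsorbed volume (cm^3 STP / g)
    p_rel = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
    v_ads = np.array([3.1, 3.8, 4.3, 4.8, 5.3, 5.9])

    # Linearized BET: 1/[v((p0/p)-1)] = (c-1)/(v_m c) * (p/p0) + 1/(v_m c)
    y = 1.0 / (v_ads * (1.0 / p_rel - 1.0))
    slope, intercept = np.polyfit(p_rel, y, 1)
    v_m = 1.0 / (slope + intercept)          # monolayer capacity, cm^3(STP)/g

    N_A = 6.022e23                           # molecules per mole
    sigma_N2 = 0.162e-18                     # m^2, cross-section of an N2 molecule
    V_molar = 22414.0                        # cm^3(STP) per mole
    S_BET = v_m / V_molar * N_A * sigma_N2   # specific surface area, m^2/g
    print(f"v_m = {v_m:.2f} cm3(STP)/g, S_BET = {S_BET:.1f} m2/g")
    ```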

  15. 地面雨滴谱观测技术及特征研究进展%Advances in Measurement Techniques and Statistics Features of Surface Raindrop Size Distribution

    Institute of Scientific and Technical Information of China (English)

    朱亚乔; 刘元波

    2013-01-01

    Raindrop Size Distribution (DSD) is one of the key parameters governing the micro-physical processes and macro-dynamical structure of precipitation. It provides useful information for understanding the mechanisms of precipitation formation and development. Conventional measurement techniques include the momentum method, flour method, filter paper, raindrop camera and immersion method. In general, these techniques have large measurement errors, heavy workload, and low efficiency. The disdrometer represents a remarkable advance in DSD observation. To date, the major techniques are classified into impacting, optical and acoustic disdrometers, which are automated and more convenient and accurate. The impacting disdrometer transforms the momentum of raindrops into an electric impulse; it is easy to operate and quality-assured but has large errors for extremely large or small raindrops. The optical disdrometer measures raindrop diameter and velocity at the same time, but cannot distinguish particles passing through the sampling area simultaneously. The acoustic disdrometer determines DSD from raindrop impacts on a water body with high temporal resolution but is easily affected by wind. In addition, Doppler radar with polarimetric techniques can provide DSD over large areas, although it is affected by updrafts, downdrafts and horizontal winds. DSD has meteorological features which can be described with the Marshall-Palmer (M-P), Gamma, lognormal or normalized models. The M-P model is suitable for steady rainfall and is usually used for weak and moderate rainfall. The Gamma model is proposed for DSD at high rain rates. The lognormal model is widely applied for cloud droplet analysis, but is not appropriate for DSD with a broad spectrum. The normalized model is free of assumptions about the shape of the DSD. For practical application, statistical comparison is necessary to select the most suitable model. Meteorologically, convective rain has a relatively narrow and smooth DSD spectrum usually described by the M
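
    As a concrete illustration of the Marshall-Palmer model named above, the sketch below evaluates the classic exponential DSD N(D) = N0 exp(-ΛD) with the textbook constants N0 = 8000 m^-3 mm^-1 and Λ = 4.1 R^-0.21 mm^-1, and integrates it to a liquid water content. The rain rates chosen are arbitrary examples.

    ```python
    import numpy as np

    def marshall_palmer(D_mm, R_mmh):
        """Classic M-P exponential DSD, drops m^-3 mm^-1."""
        N0 = 8000.0                        # m^-3 mm^-1
        lam = 4.1 * R_mmh ** (-0.21)       # mm^-1
        return N0 * np.exp(-lam * D_mm)

    D = np.linspace(0.1, 6.0, 100)         # drop diameters, mm
    for R in (1.0, 10.0):                  # example rain rates, mm/h
        N = marshall_palmer(D, R)
        # liquid water content (g/m^3): (pi/6) * rho_w * integral of D^3 N(D) dD,
        # with rho_w = 1 g/cm^3 = 1e-3 g/mm^3
        lwc = np.pi / 6.0 * 1e-3 * np.trapz(D**3 * N, D)
        print(f"R = {R:4.1f} mm/h: N(1 mm) = {marshall_palmer(1.0, R):7.1f} m^-3 mm^-1, LWC ~ {lwc:.2f} g/m^3")
    ```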

  16. Forest habitat loss, fragmentation, and red-cockaded woodpecker populations

    Science.gov (United States)

    Richard N. Conner; D. Craig Rudolph

    1991-01-01

    Loss of mature forest habitat was measured around Red-cockaded Woodpecker (Picoides borealis) cavity tree clusters (colonies) in three National Forests in eastern Texas. Forest removal results in a loss of foraging habitat and causes habitat fragmentation of the remaining mature forest. Habitat loss was negatively associated with woodpecker group size in small...

  17. Analysis methods for Kevlar shield response to rotor fragments

    Science.gov (United States)

    Gerstle, J. H.

    1977-01-01

    Several empirical and analytical approaches to rotor burst shield sizing are compared and principal differences in metal and fabric dynamic behavior are discussed. The application of transient structural response computer programs to predict Kevlar containment limits is described. For preliminary shield sizing, present analytical methods are useful if insufficient test data for empirical modeling are available. To provide other information useful for engineering design, analytical methods require further developments in material characterization, failure criteria, loads definition, and post-impact fragment trajectory prediction.

  18. Observation of anisotropic fragmentation in methane subjected to femtosecond radiation

    CERN Document Server

    Strohaber, J; Kolomenskii, A A; Schuessler, H A

    2013-01-01

    We present experimental results on the ionization/dissociation of methane in femtosecond pulses of radiation. Angular and intensity dependent yields of singly and doubly charged species were measured using an imaging mass spectrometer. The measured data show that all fragment yields exhibit some degree of anisotropy, the fragments being preferentially ejected parallel to the polarization direction. Additionally, an anomalous perpendicular fragmentation pattern is found for CH$_2^{2+}$. We find evidence of multiple dissociation mechanisms including statistical decay, field-assisted dissociation and Coulomb explosion.

  19. Binary and Ternary Fission Within the Statistical Model

    Science.gov (United States)

    Adamian, Gurgen G.; Andreev, Alexander V.; Antonenko, Nikolai V.; Scheid, Werner

    The binary and ternary nuclear fission are treated within the statistical model. At the scission point we calculate the potentials as functions of the deformations of the fragments in the dinuclear model. The potentials give the mass and charge distributions of the fission fragments. The ternary fission is assumed to occur during the binary fission.

  20. Hot spot analysis for driving the development of hits into leads in fragment based drug discovery

    OpenAIRE

    Hall, David R.; Ngan, Chi Ho; Zerbe, Brandon S.; Kozakov, Dima; Vajda, Sandor

    2011-01-01

    Fragment based drug design (FBDD) starts with finding fragment-sized compounds that are highly ligand efficient and can serve as a core moiety for developing high affinity leads. Although the core-bound structure of a protein facilitates the construction of leads, effective design is far from straightforward. We show that protein mapping, a computational method developed to find binding hot spots and implemented as the FTMap server, provides information that complements the fragment screening...

  1. Small size sampling

    Directory of Open Access Journals (Sweden)

    Rakesh R. Pathak

    2012-02-01

    Full Text Available Based on the law of large numbers derived from probability theory, we tend to increase the sample size to the maximum. The central limit theorem, another inference from the same probability theory, favours the largest possible number as the sample size for better validity in measuring central tendencies such as the mean and median. Sometimes an increase in sample size yields only negligible improvement, or no gain at all in statistical relevance, owing to strong dependence or systematic error. If we can afford a slightly larger sample, a statistical power of 0.90 is taken as acceptable with a medium Cohen's d (0.5), and for that we can take a sample size of 175 very safely; considering the problem of attrition, 200 samples would suffice. [Int J Basic Clin Pharmacol 2012; 1(1): 43-44]
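
    The figures quoted above follow from the usual normal-approximation sample-size formula for a two-sample comparison, n per group ≈ 2(z_{1-α/2} + z_{1-β})²/d². A sketch is given below; it returns about 84 per group (≈168 in total), close to the 175 cited in the note, which presumably includes a small-sample correction.

    ```python
    from scipy.stats import norm

    # Normal-approximation sample size per group for a two-sided two-sample test
    # with Cohen's d effect size, significance alpha and power 1 - beta.
    def n_per_group(d, alpha=0.05, power=0.90):
        z_a = norm.ppf(1 - alpha / 2)
        z_b = norm.ppf(power)
        return 2 * (z_a + z_b) ** 2 / d ** 2

    n = n_per_group(d=0.5)   # medium effect size
    print(f"~{n:.0f} per group, ~{2 * n:.0f} total before allowing for attrition")
    ```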

  2. Hands as markers of fragmentation

    Directory of Open Access Journals (Sweden)

    A. Barnard

    2005-07-01

    Full Text Available Margaret Atwood is an internationally read, translated, and critiqued writer whose novels have established her as one of the most esteemed authors in English (McCombs & Palmer, 1991:1). Critical studies of her work deal mainly with notions of identity from psychoanalytical perspectives. This study has identified a gap in current critical studies on Atwood’s works, namely the challenging of textual unity which is paralleled in the challenging of the traditional (single) narrative voice. The challenging of textual unity and the single narrative voice brings about the fragmentation of both. This article will focus on the role that hands play as markers of fragmentation in “The Blind Assassin” (2000). In the novel, the writing hand destabilises the narrative voice, since it is not connected to the voice of a single author. If the author of the text – the final signified – is eliminated, the text becomes fragmentary and open, inviting the reader to contribute to the creation of meaning. Hands play a significant role in foregrounding the narrator’s fragmented identity, and consequently, the fragmentation of the text. We will investigate this concept in the light of Roland Barthes’ notion of the scriptor, whose hand is metaphorically severed from his or her “voice”. Instead of the text being a unified entity, it becomes unstable and it displays the absence of hierarchical textual levels. Based mainly on Barthes’ writings, this article concludes that hands foreground the narrator’s fragmented identity, which is paralleled in the fragmented text.

  3. Population of bound excited states in intermediate-energy fragmentation reactions

    CERN Document Server

    Obertelli, A; Bazin, D; Campbell, C M; Cook, J M; Cottle, P D; Davies, A D; Dinca, D C; Glasmacher, T; Hansen, P G; Hoagland, T; Kemper, K W; Lecouey, J L; Müller, W F; Reynolds, R R; Roeder, B T; Terry, J R; Tostevin, J A; Yoneda, K; Zwahlen, H

    2006-01-01

    Fragmentation reactions with intermediate-energy heavy-ion beams exhibit a wide range of reaction mechanisms, ranging from direct reactions to statistical processes. We examine this transition by measuring the relative population of excited states in several sd-shell nuclei produced by fragmentation with the number of removed nucleons ranging from two to sixteen. The two-nucleon removal is consistent with a non-dissipative process whereas the removal of more than five nucleons appears to be mainly statistical.

  4. Energetics of glass fragmentation: Experiments on synthetic and natural glasses

    Science.gov (United States)

    Kolzenburg, S.; Russell, J. K.; Kennedy, L. A.

    2013-11-01

    Natural silicate glasses are an essential component of many volcanic rock types including coherent and pyroclastic rocks; they span a wide range of compositions, occur in diverse environments, and form under a variety of pressure-temperature conditions. In subsurface volcanic environments (e.g., conduits and feeders), melts intersect the thermodynamically defined glass transition temperature to form glasses at elevated confining pressures and under differential stresses. We present a series of room temperature experiments designed to explore the fundamental mechanical and fragmentation behavior of natural (obsidian) and synthetic glasses (Pyrex™) under confining pressures of 0.1-100 MPa. In each experiment, glass cores are driven to brittle failure under compressive triaxial stress. Analysis of the load-displacement response curves is used to quantify the storage of energy in samples prior to failure, the (brittle) release of elastic energy at failure, and the residual energy stored in the post-failure material. We then establish a relationship between the energy density within the sample at failure and the grain-size distributions (D-values) of the experimental products. The relationship between D-values and energy density for compressive fragmentation is significantly different from relationships established by previous workers for decompressive fragmentation. Compressive fragmentation is found to have lower fragmentation efficiency than fragmentation through decompression (i.e., a smaller change in D-value with increasing energy density). We further show that the stress storage capacity of natural glasses can be enhanced (approaching synthetic glasses) through heat treatment.
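
    A sketch of how a fragmentation D-value of the kind discussed above can be estimated: fit the slope of the cumulative size distribution N(>r) ∝ r^-D on log-log axes. The fragment sizes below are synthetic (drawn from a known power law so the recovery can be checked); real input would be sieve or image-analysis data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    D_true = 2.3
    r_min = 0.1                                          # mm
    # Pareto-distributed fragment sizes with exponent D_true and minimum size r_min
    sizes = r_min * (rng.pareto(D_true, 5000) + 1.0)

    r_bins = np.logspace(np.log10(r_min), np.log10(sizes.max()), 20)
    N_gt = np.array([(sizes > r).sum() for r in r_bins])
    keep = N_gt > 10                                     # drop the poorly sampled tail
    slope, _ = np.polyfit(np.log10(r_bins[keep]), np.log10(N_gt[keep]), 1)
    print(f"estimated D-value ~ {-slope:.2f} (data generated with D = {D_true})")
    ```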

  5. High Efficiency Hydrodynamic DNA Fragmentation in a Bubbling System

    Science.gov (United States)

    Li, Lanhui; Jin, Mingliang; Sun, Chenglong; Wang, Xiaoxue; Xie, Shuting; Zhou, Guofu; van den Berg, Albert; Eijkel, Jan C. T.; Shui, Lingling

    2017-01-01

    DNA fragmentation down to a precise fragment size is important for biomedical applications, disease determination, gene therapy and shotgun sequencing. In this work, a cheap, easy to operate and high efficiency DNA fragmentation method is demonstrated based on hydrodynamic shearing in a bubbling system. We expect that hydrodynamic forces generated during the bubbling process shear the DNA molecules, extending and breaking them at the points where shearing forces are larger than the strength of the phosphate backbone. Factors of applied pressure, bubbling time and temperature have been investigated. Genomic DNA could be fragmented down to controllable 1-10 Kbp fragment lengths with a yield of 75.30-91.60%. We demonstrate that the ends of the genomic DNAs generated from hydrodynamic shearing can be ligated by T4 ligase and the fragmented DNAs can be used as templates for polymerase chain reaction. Therefore, in the bubbling system, DNAs could be hydrodynamically sheared to achieve smaller pieces in dsDNAs available for further processes. It could potentially serve as a DNA sample pretreatment technique in the future.

  6. Mammal assemblages in forest fragments and landscapes occupied by black howler monkeys.

    Science.gov (United States)

    Rangel-Negrín, Ariadna; Coyohua-Fuentes, Alejandro; Canales-Espinosa, Domingo; Dias, Pedro Américo D

    2014-07-01

    Species assemblages in disturbed habitats vary as a function of the interaction between species requirements and the spatial configuration of the habitat. There are many reports accounting for the presence of howler monkeys in fragments where other mammals are absent, suggesting that they are more resilient. In the present study we explored this idea and predicted that if howler monkeys were more resilient to habitat loss and fragmentation than other mammals, mammal assemblages in fragments occupied by howler monkeys should include fewer species with decreasing amount of habitat (smaller fragment size and less habitat in the landscape) and increasing number of forest fragments. We explored these relationships by additionally considering the feeding and life habits of mammal species, as well as the isolation and proximity of each fragment to human settlements and roads. We sampled the presence of mammals in five fragments occupied by black howler monkeys (Alouatta pigra) in the Mexican state of Campeche. Through direct sightings performed during 240 h in each fragment, we observed 23 species. At the landscape scale, higher fragmentation was associated with a decrease in herbivores, omnivores and total number of species. At the fragment scale, semiarboreal, omnivore, and total number of species increased with increasing fragment size. This study supports the idea that howler monkeys are more resilient to forest loss and fragmentation than other native mammals, and our exploratory analyses suggest that the specific mammal assemblages that are found in fragments are related to both landscape and fragment scale spatial attributes, as well as to species-specific characteristics.

  7. High fragmentation characterizes tumour-derived circulating DNA.

    Directory of Open Access Journals (Sweden)

    Florent Mouliere

    Full Text Available BACKGROUND: Circulating DNA (ctDNA) is acknowledged as a potential diagnostic tool for various cancers including colorectal cancer, especially when considering the detection of mutations. Certainly due to lack of normalization of the experimental conditions, previous reports present many discrepancies and contradictory data on the analysis of the concentration of total ctDNA and on the proportion of tumour-derived ctDNA fragments. METHODOLOGY: In order to rigorously analyse ctDNA, we thoroughly investigated ctDNA size distribution. We used a highly specific Q-PCR assay and athymic nude mice xenografted with SW620 or HT29 human colon cancer cells, and we correlated our results by examining plasma from metastatic CRC patients. CONCLUSION/SIGNIFICANCE: Fragmentation and concentration of tumour-derived ctDNA is positively correlated with tumour weight. CtDNA quantification by Q-PCR depends on the amplified target length and is optimal for 60-100 bp fragments. Q-PCR analysis of plasma samples from xenografted mice and cancer patients showed that tumour-derived ctDNA exhibits a specific amount profile based on ctDNA size and significantly higher ctDNA fragmentation. Metastatic colorectal patients (n = 12) showed nearly 5-fold higher mean ctDNA fragmentation than healthy individuals (n = 16).
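
    One common way to express the fragmentation effect described above is a DNA "integrity index": the ratio of template quantified with a long amplicon to that quantified with a short amplicon of the same locus. The sketch below assumes 100% PCR efficiency (quantity proportional to 2^-Cq) and uses made-up Cq values purely for illustration; it is not a reproduction of the study's assay.

    ```python
    # Integrity index = (quantity from long amplicon) / (quantity from short amplicon).
    # Lower values indicate more fragmented ctDNA. Assumes equal, ideal PCR efficiency.
    def integrity_index(cq_short, cq_long):
        return 2.0 ** (cq_short - cq_long)

    samples = {"healthy_plasma": (27.1, 28.0), "tumour_bearing": (26.8, 30.3)}  # made-up Cq pairs
    for name, (cq_s, cq_l) in samples.items():
        print(f"{name}: integrity index = {integrity_index(cq_s, cq_l):.2f}")
    ```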

  8. Assortative mating and fragmentation within dog breeds

    Directory of Open Access Journals (Sweden)

    Hailer Frank

    2008-01-01

    Full Text Available Abstract Background There are around 400 internationally recognized dog breeds in the world today, with a remarkable diversity in size, shape, color and behavior. Breeds are considered to be uniform groups with similar physical characteristics, shaped by selection rooted in human preferences. This has led to a large genetic difference between breeds and a large extent of linkage disequilibrium within breeds. These characteristics are important for association mapping of candidate genes for diseases and therefore make dogs ideal models for gene mapping of human disorders. However, genetic uniformity within breeds may not always be the case. We studied patterns of genetic diversity within 164 poodles and compared it to 133 dogs from eight other breeds. Results Our analyses revealed strong population structure within poodles, with differences among some poodle groups as pronounced as those among other well-recognized breeds. Pedigree analysis going three generations back in time confirmed that subgroups within poodles result from assortative mating imposed by breed standards as well as breeder preferences. Matings have not taken place at random or within traditionally identified size classes in poodles. Instead, a novel set of five poodle groups was identified, defined by combinations of size and color, which is not officially recognized by the kennel clubs. Patterns of genetic diversity in other breeds suggest that assortative mating leading to fragmentation may be a common feature within many dog breeds. Conclusion The genetic structure observed in poodles is the result of local mating patterns, implying that breed fragmentation may be different in different countries. Such pronounced structuring within dog breeds can increase the power of association mapping studies, but also represents a serious problem if ignored. In dog breeding, individuals are selected on the basis of morphology, behaviour, working or show purposes, as well as geographic

  9. Assortative mating and fragmentation within dog breeds.

    Science.gov (United States)

    Björnerfeldt, Susanne; Hailer, Frank; Nord, Maria; Vilà, Carles

    2008-01-28

    There are around 400 internationally recognized dog breeds in the world today, with a remarkable diversity in size, shape, color and behavior. Breeds are considered to be uniform groups with similar physical characteristics, shaped by selection rooted in human preferences. This has led to a large genetic difference between breeds and a large extent of linkage disequilibrium within breeds. These characteristics are important for association mapping of candidate genes for diseases and therefore make dogs ideal models for gene mapping of human disorders. However, genetic uniformity within breeds may not always be the case. We studied patterns of genetic diversity within 164 poodles and compared it to 133 dogs from eight other breeds. Our analyses revealed strong population structure within poodles, with differences among some poodle groups as pronounced as those among other well-recognized breeds. Pedigree analysis going three generations back in time confirmed that subgroups within poodles result from assortative mating imposed by breed standards as well as breeder preferences. Matings have not taken place at random or within traditionally identified size classes in poodles. Instead, a novel set of five poodle groups was identified, defined by combinations of size and color, which is not officially recognized by the kennel clubs. Patterns of genetic diversity in other breeds suggest that assortative mating leading to fragmentation may be a common feature within many dog breeds. The genetic structure observed in poodles is the result of local mating patterns, implying that breed fragmentation may be different in different countries. Such pronounced structuring within dog breeds can increase the power of association mapping studies, but also represents a serious problem if ignored. In dog breeding, individuals are selected on the basis of morphology, behaviour, working or show purposes, as well as geographic population structure. The same processes which have

  10. Phenomenology of Dihadron Fragmentation Function

    CERN Document Server

    Courtoy, A

    2016-01-01

    We report on phenomenological results obtained from processes involving Dihadron Fragmentation Functions. In 2015, an update of the fitting techniques for the Dihadron Fragmentation Functions led to an improved extraction of the transversity PDF and, as a consequence, of the nucleon tensor charge. We discuss the impact of the determination of the latter on the search for physics Beyond the Standard Model, focusing on the error treatment. We also comment on the future extraction of the subleading-twist PDF $e(x)$ from soon-to-be-released JLab Beam Spin Asymmetry data.

  11. Transversity and dihadron fragmentation functions

    CERN Document Server

    Bacchetta, A; Bacchetta, Alessandro; Radici, Marco

    2005-01-01

    The observation of the quark transversity distribution requires another soft object sensitive to the quark's transverse spin. Dihadron fragmentation functions represent a convenient tool to analyze partonic spin, which can influence the angular distribution of the two hadrons. In particular, the so-called interference fragmentation functions can be used to probe transversity both in semi-inclusive deep inelastic scattering and in proton-proton collisions. We discuss two single-spin asymmetries sensitive to transversity in these two processes, at leading twist and leading order in alpha_S.

  12. RIA Fragmentation Line Beam Dumps

    Energy Technology Data Exchange (ETDEWEB)

    Stein, W

    2003-08-08

    The Rare Isotope Accelerator project involves generating heavy-element ion beams for use in a fragmentation target line to produce beams for physics research. The main beam, after passing through the fragmentation target, may be dumped into a beam dump located in the vacuum cavity of the first dipole magnet. For a dump beam power of 100 kW, cooling is required to avoid excessive high temperatures. The proposed dump design involves rotating cylinders to spread out the energy deposition and turbulent subcooled water flow through internal water cooling passages to obtain high, nonboiling, cooling rates.

  13. Effects of prairie fragmentation on the nest success of breeding birds in the midcontinental United States

    Science.gov (United States)

    Herkert, J.R.; Reinking, D.L.; Wiedenfeld, D.A.; Winter, M.; Zimmerman, J.L.; Jensen, W.E.; Finck, E.J.; Koford, Rolf R.; Wolfe, D.H.; Sherrod, S.K.; Jenkins, M.A.; Faaborg, J.; Robinson, S.K.

    2003-01-01

    Grassland fragmentation and habitat loss are hypothesized to be contributing to widespread grassland bird declines in North America due to the adverse effects of fragmentation on breeding bird abundance and reproductive success. To assess the effects of fragmentation on the reproductive success of grassland birds, we measured rates of nest predation and brood parasitism for four species of birds (Grasshopper Sparrow [Ammodramus savannarum], Henslow's Sparrow [Ammodramus henslowii], Eastern Meadowlark [Sturnella magna], and Dickcissel [Spiza americana]) in 39 prairie fragments ranging from 24 to >40,000 ha in size in five states in the mid-continental United States. Throughout the region, nest-predation rates were significantly influenced by habitat fragmentation. Nest predation was highest in small (<1,000 ha) prairie fragments. Rates of brood parasitism by Brown-headed Cowbirds (Molothrus ater), however, were not consistently related to fragment size and instead were more strongly related to regional cowbird abundance, being significantly higher in regions with high cowbird abundance. Differences in nest-predation rates between large fragments (54-68% of all nests lost to predators) and small fragments (78-84% lost to predators) suggest that fragmentation of prairie habitats may be contributing to regional declines of grassland birds. Maintaining grassland bird populations, therefore, may require protection and restoration of large prairie areas.

  14. Characterizing the forest fragmentation of Canada's national parks.

    Science.gov (United States)

    Soverel, Nicholas O; Coops, Nicholas C; White, Joanne C; Wulder, Michael A

    2010-05-01

    Characterizing the amount and configuration of forests can provide insights into habitat quality, biodiversity, and land use. The establishment of protected areas can be a mechanism for maintaining large, contiguous areas of forests, and the loss and fragmentation of forest habitat is a potential threat to Canada's national park system. Using the Earth Observation for Sustainable Development of Forests (EOSD) land cover product (EOSD LC 2000), we characterize the circa 2000 forest patterns in 26 of Canada's national parks and compare these to forest patterns in the ecological units surrounding these parks, referred to as the greater park ecosystem (GPE). Five landscape pattern metrics were analyzed: number of forest patches, mean forest patch size (hectare), standard deviation of forest patch size (hectare), mean forest patch perimeter-to-area ratio (meters per hectare), and edge density of forest patches (meters per hectare). An assumption is often made that forests within park boundaries are less fragmented than the surrounding GPE, as indicated by fewer forest patches, a larger mean forest patch size, less variability in forest patch size, a lower perimeter-to-area ratio, and lower forest edge density. Of the 26 national parks we analyzed, 58% had significantly fewer patches, 46% had a significantly larger mean forest patch size (23% were not significantly different), and 46% had a significantly smaller standard deviation of forest patch size (31% were not significantly different), relative to their GPEs. For forest patch perimeter-to-area ratio and forest edge density, equal proportions of parks had values that were significantly larger or smaller than their respective GPEs and no clear trend emerged. In summary, all the national parks we analyzed, with the exception of the Georgian Bay Islands, were found to be significantly different from their corresponding GPE for at least one of the five metrics assessed, and 50% of the 26 parks were significantly
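
    The patch metrics listed above are straightforward to compute from a binary forest raster; the sketch below does so for a tiny toy map using scipy's connected-component labelling. The 1 km cell size and the map itself are assumptions for illustration; an EOSD-style product would be read from file instead.

    ```python
    import numpy as np
    from scipy import ndimage

    forest = np.array([[1, 1, 0, 0, 1],
                       [1, 1, 0, 1, 1],
                       [0, 0, 0, 1, 1],
                       [1, 0, 0, 0, 0]], dtype=int)   # toy forest (1) / non-forest (0) map
    cell_m = 1000.0                                    # assumed cell size, metres
    cell_ha = cell_m * cell_m / 10000.0                # hectares per cell

    labels, n_patches = ndimage.label(forest)          # 4-connectivity by default
    patch_ha = np.bincount(labels.ravel())[1:] * cell_ha

    # forest/non-forest edge length from unlike horizontal + vertical neighbours
    # (map-boundary edges are ignored in this simple version)
    edges = np.count_nonzero(np.diff(forest, axis=0)) + np.count_nonzero(np.diff(forest, axis=1))
    edge_density = edges * cell_m / (forest.size * cell_ha)   # metres of edge per hectare

    print(f"patches: {n_patches}, mean patch size: {patch_ha.mean():.0f} ha, "
          f"SD: {patch_ha.std():.0f} ha, edge density: {edge_density:.1f} m/ha")
    ```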

  15. Characterization of fragment emission in $^{20}$Ne (7 - 10 MeV/nucleon) + $^{12}$C reactions

    CERN Document Server

    Dey, Aparajita; Bhattacharya, S; Kundu, S; Banerjee, K; Mukhopadhyay, S; Gupta, D; Bhattacharjee, T; Banerjee, S R; Bhattacharya, S; Rana, T K; Basu, S K; Saha, R; Krishan, K; Mukherjee, A; Bandyopadhyay, D; Beck, C

    2007-01-01

    The inclusive energy distributions of the complex fragments (3 $\leq$ Z $\leq$ 7) emitted from the bombardment of $^{12}$C by $^{20}$Ne beams with incident energies between 145 and 200 MeV have been measured in the angular range 10$^{o} \leq \theta_{lab} \leq$ 50$^{o}$. Damped fragment yields in all the cases have been found to be characteristic of emission from fully energy-equilibrated composites. The binary fragment yields are compared with the standard statistical model predictions. Enhanced yields of entrance channel fragments (5 $\leq$ Z $\leq$ 7) indicate the survival of an orbiting-like process in the $^{20}$Ne + $^{12}$C system at these energies.

  16. Subcascade formation and defect cluster size scaling in high-energy collision events in metals

    Science.gov (United States)

    De Backer, A.; Sand, A. E.; Nordlund, K.; Luneville, L.; Simeone, D.; Dudarev, S. L.

    2016-07-01

    It has been recently established that the size of the defects created under ion irradiation follows a scaling law (Sand A. E. et al., EPL, 103 (2013) 46003; Yi X. et al., EPL, 110 (2015) 36001). A critical constraint associated with its application to phenomena occurring over a broad range of irradiation conditions is the limitation on the energy of incident particles. Incident neutrons or ions, with energies exceeding a certain energy threshold, produce a complex hierarchy of collision subcascade events, which impedes the use of the defect cluster size scaling law derived for an individual low-energy cascade. By analyzing the statistics of subcascade sizes and energies, we show that defect clustering above threshold energies can be described by a product of two scaling laws, one for the sizes of subcascades and the other for the sizes of defect clusters formed in subcascades. The statistics of subcascade sizes exhibits a transition at a threshold energy, where the subcascade morphology changes from a single domain below the energy threshold, to several or many sub-domains above the threshold. The number of sub-domains then increases in proportion to the primary knock-on atom energy. The model has been validated against direct molecular-dynamics simulations and applied to W, Fe, Be, Zr and sixteen other metals, enabling the prediction of full statistics of defect cluster sizes with no limitation on the energy of cascade events. We find that populations of defect clusters produced by the fragmented high-energy cascades are dominated by individual Frenkel pairs and relatively small defect clusters, whereas the lower-energy non-fragmented cascades produce a greater proportion of large defect clusters.
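
    A Monte Carlo sketch of the "product of two scaling laws" idea summarized above: above a threshold energy the event is split into subcascades (their number taken proportional to the PKA energy), and each subcascade contributes defect clusters whose sizes follow a power law. The threshold, exponent and cluster count per subcascade are placeholders, not the fitted values of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    E_THRESHOLD_KEV = 150.0      # assumed subcascade-splitting threshold
    S_EXPONENT = 1.8             # assumed cluster-size power-law exponent

    def sample_event(E_pka_keV, clusters_per_subcascade=20):
        n_sub = max(1, int(round(E_pka_keV / E_THRESHOLD_KEV)))
        u = rng.random((n_sub, clusters_per_subcascade))
        # inverse-transform sampling of P(size) ~ size^-S for size >= 1
        sizes = np.floor(u ** (-1.0 / (S_EXPONENT - 1.0))).astype(int)
        return n_sub, sizes.ravel()

    for E in (50.0, 300.0, 900.0):
        n_sub, sizes = sample_event(E)
        print(f"E_PKA = {E:5.0f} keV: {n_sub} subcascade(s), "
              f"largest cluster {sizes.max()}, median {np.median(sizes):.0f}")
    ```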

  17. Physical map of polyoma viral DNA fragments produced by cleavage with a restriction enzyme from Haemophilus aegyptius, endonuclease R-HaeIII.

    Science.gov (United States)

    Summers, J

    1975-04-01

    Digestion of polyoma viral DNA with a restriction enzyme from Haemophilus aegyptius generates at least 22 unique fragments. The fragments have been characterized with respect to size and physical order on the polyoma genome, and the 5' to 3' orientation of the (+) and (-) strands has been determined. A method for specific radiolabeling of adjacent fragments was employed to establish the fragment order. This technique may be useful for ordering the fragments produced by digestion of complex DNAs.

  18. Metastable fragmentation of silver bromide clusters

    Energy Technology Data Exchange (ETDEWEB)

    L' Hermite, J.M.; Rabilloud, F.; Marcou, L.; Labastie, P. [Lab. CAR/IRSAMC, Univ. Paul Sabatier, Toulouse (France)

    2001-06-01

    The abundance spectra and the fragmentation channels of silver bromide clusters have been measured and analyzed. The most abundant species are Ag{sub n}Br{sub n-1}{sup +} and Ag{sub n}Br{sub n+1}{sup -}, and Ag{sub 14}Br{sub 13}{sup +} is a magic number, revealing their ionic nature. However, some features depart from what is generally observed for alkali-halide ionic clusters. From a certain size, Ag{sub n}Br{sub n-1}{sup +} is no longer the main series, and the Ag{sub n}Br{sub n-2,3}{sup +} series become almost as important. The fast fragmentation induced by a UV laser makes the cations lose more bromine than silver ions and leads to more silver-rich clusters. Negative-ion mass spectra also contain species with more silver atoms than required by stoichiometry. We have investigated the metastable fragmentation of the cations using a new experimental method. The large majority of the cations release mainly a neutral Ag{sub 3}Br{sub 3} cluster. These decay channels are in full agreement with our recent ab initio DFT calculations, which show that Ag{sup +}-Ag{sup +} repulsion is reduced due to a globally attractive interaction of their d orbitals. This effect leads to a particularly stable trimer (AgBr){sub 3} and to quasi-planar cyclic structures of (AgBr){sub n} clusters up to n = 6. We have shown that these two features may be extended to other silver halides, to silver hydroxides (AgOH){sub n}, and to cuprous halide compounds. (orig.)

  19. DNA Studies Using Atomic Force Microscopy: Capabilities for Measurement of Short DNA Fragments

    Directory of Open Access Journals (Sweden)

    Dalong Pang

    2015-01-01

    Full Text Available Short DNA fragments, resulting from ionizing radiation-induced DNA double strand breaks (DSBs), or released from cells as a result of physiological processes and circulating in the blood stream, may play important roles in cellular function and potentially in disease diagnosis and early intervention. The size distribution of DNA fragments contributes to knowledge of the underlying biological processes. Traditional techniques used in radiation biology for DNA fragment size measurements lack the resolution to quantify short DNA fragments. For the measurement of cell-free circulating DNA (ccfDNA), real-time quantitative Polymerase Chain Reaction (q-PCR) provides quantification of DNA fragment sizes, concentration and specific gene mutations. A complementary approach, the imaging-based technique using Atomic Force Microscopy (AFM), provides direct visualization and measurement of individual DNA fragments. In this review, we summarize and discuss the application of AFM-based measurements of DNA fragment sizes. Imaging of broken plasmid DNA, as a result of exposure to ionizing radiation, as well as ccfDNA in clinical specimens, offers an innovative approach for studies of short DNA fragments and their biological functions.

  20. The effect of habitat fragmentation and abiotic factors on fen plant occurrence

    NARCIS (Netherlands)

    Soomers, H.; Karssenberg, D.J.; Verhoeven, J.T.A.; Verweij, P.A.; Wassen, M.J.

    2013-01-01

    Human landscape modification has led to habitat fragmentation for many species. Habitat fragmentation, leading to isolation, decrease in patch size and increased edge effect, is observed in fen ecosystems that comprise many endangered plant species. However, until now it has remained unclear whether

  1. The effect of habitat fragmentation and abiotic factors on fen plant occurrence

    NARCIS (Netherlands)

    Soomers, H.; Karssenberg, D.J.; Verhoeven, J.T.A.; Verweij, P.A.; Wassen, M.J.

    2013-01-01

    Human landscape modification has led to habitat fragmentation for many species. Habitat fragmentation, leading to isolation, decrease in patch size and increased edge effect, is observed in fen ecosystems that comprise many endangered plant species. However, until now it has remained unclear

  2. The Fragmentation of Literary Theory

    Science.gov (United States)

    Howard, Jennifer

    2005-01-01

    Syllabi from some 20 colleges and universities with prominent English and literature departments were reviewed, and a discussion was held with a number of professors who teach literary theory. It is suggested that the devolution and fragmentation of theory might be a survival strategy, an adaptation to the new realities of academic institutions.

  3. Fragmented nature : consequences for biodiversity

    NARCIS (Netherlands)

    Olff, Han; Ritchie, Mark E.

    2002-01-01

    We discuss how fragmentation of resources and habitat operate differently on species diversity across spatial scales, ranging from positive effects on local species coexistence to negative effect on intermediate spatial scales, to again positive effects on large spatial and temporal scales. Species

  4. Fragmented nature: consequences for biodiversity

    NARCIS (Netherlands)

    Olff, H.; Ritchie, M.E.

    2002-01-01

    We discuss how fragmentation of resources and habitat operate differently on species diversity across spatial scales, ranging from positive effects on local species coexistence to negative effect on intermediate spatial scales, to again positive effects on large spatial and temporal scales. Species

  5. Fragmented nature : consequences for biodiversity

    NARCIS (Netherlands)

    Olff, Han; Ritchie, Mark E.

    2002-01-01

    We discuss how fragmentation of resources and habitat operate differently on species diversity across spatial scales, ranging from positive effects on local species coexistence to negative effect on intermediate spatial scales, to again positive effects on large spatial and temporal scales. Species

  6. Fragmented nature: consequences for biodiversity

    NARCIS (Netherlands)

    Olff, H.; Ritchie, M.E.

    2002-01-01

    We discuss how fragmentation of resources and habitat operate differently on species diversity across spatial scales, ranging from positive effects on local species coexistence to negative effect on intermediate spatial scales, to again positive effects on large spatial and temporal scales. Species

  7. Modified Classical Graph Algorithms for the DNA Fragment Assembly Problem

    Directory of Open Access Journals (Sweden)

    Guillermo M. Mallén-Fullerton

    2015-09-01

    Full Text Available DNA fragment assembly represents an important challenge to the development of efficient and practical algorithms due to the large number of elements to be assembled. In this study, we present some graph theoretical linear time algorithms to solve the problem. To achieve linear time complexity, a heap with constant time operations was developed, for the special case where the edge weights are integers and do not depend on the problem size. The experiments presented show that modified classical graph theoretical algorithms can solve the DNA fragment assembly problem efficiently.
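
    One standard way to obtain the constant-time priority-queue operations mentioned above, when keys are small non-negative integers that do not grow with problem size, is a bucket queue (Dial's structure). The sketch below illustrates the idea; it is not necessarily the exact data structure developed by the authors.

    ```python
    from collections import deque

    class BucketQueue:
        """Priority queue with O(1) push and O(1) pop when max_key is a fixed constant."""
        def __init__(self, max_key):
            self.buckets = [deque() for _ in range(max_key + 1)]
            self.cursor = 0                    # smallest bucket that may be non-empty
            self.count = 0

        def push(self, key, item):
            self.buckets[key].append(item)
            self.cursor = min(self.cursor, key)
            self.count += 1

        def pop_min(self):
            while not self.buckets[self.cursor]:
                self.cursor += 1               # scan is bounded by max_key, a constant here
            self.count -= 1
            return self.cursor, self.buckets[self.cursor].popleft()

    q = BucketQueue(max_key=10)
    for weight, read in [(7, "read_A"), (2, "read_B"), (5, "read_C")]:
        q.push(weight, read)
    while q.count:
        print(q.pop_min())
    ```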

  8. Fast Rise of "Neptune-Size" Planets (4-8 R_Earth) from P~10 to ~250 days -- Statistics of Kepler Planet Candidates Up to ~0.75 AU

    CERN Document Server

    Dong, Subo

    2013-01-01

    We infer the period (P) and size (R_p) distribution of Kepler transiting planet candidates with R_p > 1 R_Earth and P < 250 days. For P > 10 days, the planet frequency dN_p/dlogP for "Neptune-size" planets (R_p = 4-8 R_Earth) increases with period as \propto P^{0.7\pm0.1}. In contrast, dN_p/dlogP for Super-Earth-size (2-4 R_Earth) as well as Earth-size (1-2 R_Earth) planets is consistent with a nearly flat distribution as a function of period (\propto P^{0.11\pm0.05} and \propto P^{-0.10\pm0.12}, respectively), and the normalizations are remarkably similar (within a factor of ~1.5). The shape of the distribution function is found to be insensitive to changes in the selection criteria of the sample. The implied nearly flat or rising planet frequency at long period appears to be in tension with the sharp decline at ~100 days in planet frequency for low-mass planets (planet mass m_p < 30 M_Earth) recently suggested by the HARPS survey.
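
    To make the quoted exponents concrete, the sketch below integrates a frequency law dN/dlogP proportional to P^alpha between 10 and 250 days for the three size classes. The normalization is an arbitrary assumption; only the exponents are taken from the abstract, so the outputs should be read as relative numbers.

    ```python
    import numpy as np

    def cumulative_frequency(alpha, P_lo=10.0, P_hi=250.0, norm=0.01):
        """Integrate dN/dlogP = norm * P^alpha over log10(P) between P_lo and P_hi (days)."""
        logP = np.linspace(np.log10(P_lo), np.log10(P_hi), 1000)
        dN_dlogP = norm * (10.0 ** logP) ** alpha
        return np.trapz(dN_dlogP, logP)

    for label, alpha in [("Neptune-size (4-8 R_Earth)", 0.7),
                         ("super-Earth-size (2-4 R_Earth)", 0.11),
                         ("Earth-size (1-2 R_Earth)", -0.10)]:
        print(f"{label}: relative cumulative frequency ~ {cumulative_frequency(alpha):.2f}")
    ```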

  9. The Importance of Maize Management on Dung Beetle Communities in Atlantic Forest Fragments.

    Directory of Open Access Journals (Sweden)

    Renata Calixto Campos

    Full Text Available Dung beetle community structure changes due to the effects of destruction, fragmentation, isolation and decrease in tropical forest area, and dung beetles are therefore considered ecological indicators. To assess the influence of the type of maize cultivated and the associated maize management on dung beetle communities, 40 Atlantic Forest fragments of different sizes were evaluated, 20 surrounded by GM maize and 20 surrounded by conventional maize, in February 2013 and 2014 in Southern Brazil. After applying a sampling protocol in each fragment (10 pitfall traps baited with human feces or carrion, exposed for 48 h), a total of 3454 individuals from 44 species were captured: 1142 individuals from 38 species in fragments surrounded by GM maize, and 2312 from 42 species in fragments surrounded by conventional maize. Differences in dung beetle communities were found between GM and conventional maize conditions. As expected for fragmented areas, the covariance analysis showed greater species richness in larger fragments under both conditions; however, species richness was greater in fragments surrounded by conventional maize. Dung beetle community structure in the forest fragments was explained by environmental variables, fragment area, spatial distance and also by the type of maize (transgenic or conventional) and the associated maize management techniques. In Southern Brazil's scenario, the use of GM maize combined with the associated agricultural management may be accelerating the loss of diversity in Atlantic Forest areas, and consequently, important ecosystem services provided by dung beetles may be lost.

  10. Key World Energy Statistics

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    The IEA produced its first handy, pocket-sized summary of key energy data in 1997. This new edition responds to the enormously positive reaction to the book since then. Key World Energy Statistics produced by the IEA contains timely, clearly-presented data on supply, transformation and consumption of all major energy sources. The interested businessman, journalist or student will have at his or her fingertips the annual Canadian production of coal, the electricity consumption in Thailand, the price of diesel oil in Spain and thousands of other useful energy facts. It exists in different formats to suit our readers' requirements.

  11. Subcellular Size

    Science.gov (United States)

    Marshall, Wallace F.

    2015-01-01

    All of the same conceptual questions about size in organisms apply equally at the level of single cells. What determines the size, not only of the whole cell, but of all of its parts? What ensures that subcellular components are properly proportioned relative to the whole cell? How does alteration in organelle size affect biochemical function? Answering such fundamental questions requires us to understand how the size of individual organelles and other cellular structures is determined. Knowledge of organelle biogenesis and dynamics has advanced rapidly in recent years. Does this knowledge give us enough information to formulate reasonable models for organelle size control, or are we still missing something? PMID:25957302

  12. Preliminary insights into a model for mafic magma fragmentation

    Science.gov (United States)

    Edwards, Matt; Pioli, Laura; Andronico, Daniele; Cristaldi, Antonio; Scollo, Simona

    2017-04-01

    Fragmentation of mafic magmas remains a poorly understood process despite the common occurrence of low-viscosity explosive eruptions. In fact, it has commonly been overlooked based on the assumption that low-viscosity magmas have very limited explosivity and low potential to undergo brittle fragmentation. However, it is now known that highly explosive, ash-forming eruptions can be relatively frequent at several mafic volcanoes. Three questions arise from this: What is the specific fragmentation mechanism occurring in these eruptions? What are the primary factors controlling fragmentation efficiency? Can a link between eruption style and fragmentation efficiency be quantified? We addressed these questions by coupling theoretical observations and field analysis of the recent May 2016 eruption at Mount Etna volcano. Within this complex 10-day event, three paroxysmal episodes of pulsating basaltic lava jets alternating with small lava flows were recorded from a vent within the Voragine crater. The associated plumes deposited tephra along narrow dispersal axes to the east and southeast. Sampling was done on the deposits associated with the first two plumes and the third one. We briefly characterise the May 2016 eruption by assessing plume height, eruption phases, total erupted masses and fallout boundaries, and by comparing them to previous eruptions. We also analyse the total grain-size distribution (TGSD) of the scoria particles formed in the jets. Conventional methods for obtaining grain-size and total distributions of an eruption, however, are based on mass and provide limited information on fragmentation. For this reason, the TGSD was determined by coupling particle-analyser data and conventional sieving data to obtain both mass-based and number-based particle size distributions with better precision. This allowed for more accurate testing of several existing models describing the shape of the TGSD. Coupled further with observations on eruption dynamics and eruption

  13. The VERDI fission fragment spectrometer

    Science.gov (United States)

    Frégeau, M. O.; Bryś, T.; Gamboni, Th.; Geerts, W.; Oberstedt, S.; Oberstedt, A.; Borcea, R.

    2013-12-01

    The VERDI time-of-flight spectrometer is dedicated to measurements of fission product yields and of prompt neutron emission data. Pre-neutron fission-fragment masses will be determined by the double time-of-flight (TOF) technique. For this purpose an excellent time resolution is required. The time of flight of the fragments will be measured by electrostatic mirrors located near the target and the time signal coming from silicon detectors located at 50 cm on both sides of the target. This configuration, where the stop detector will provide us simultaneously with the kinetic energy of the fragment and timing information, significantly limits energy straggling in comparison to a legacy experimental setup where a thin foil was usually used as a stop detector. In order to improve timing resolution, neutron transmutation doped silicon will be used. The high resistivity homogeneity of this material should significantly improve resolution in comparison to standard silicon detectors. Post-neutron fission fragment masses are obtained from the time-of-flight and the energy signal in the silicon detector. As an intermediary step a diamond detector will also be used as a start detector located very close to the target. Previous tests have shown that poly-crystalline chemical vapour deposition (pCVD) diamonds provide a coincidence time resolution of 150 ps, not allowing complete separation between very low-energy fission fragments, alpha particles and noise. New results from using artificial single-crystal diamonds (sCVD) show a similar time resolution to that of pCVD diamonds but also sufficiently good energy resolution.
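
    The double time-of-flight mass determination described above reduces, non-relativistically, to m = 2E(t/L)² for a fragment of kinetic energy E reaching a detector at distance L after time t. The sketch below evaluates this for an illustrative fragment on the 50 cm flight path; the numbers are examples, not VERDI data.

    ```python
    MEV_TO_J = 1.602176634e-13    # joules per MeV
    AMU_KG = 1.66053906660e-27    # kilograms per atomic mass unit

    def fragment_mass_amu(E_MeV, t_ns, L_m=0.5):
        """Non-relativistic mass from kinetic energy and time of flight."""
        v = L_m / (t_ns * 1e-9)                 # m/s
        return 2.0 * E_MeV * MEV_TO_J / v**2 / AMU_KG

    # e.g. a light fission fragment with ~100 MeV kinetic energy over 50 cm
    print(f"m ~ {fragment_mass_amu(E_MeV=100.0, t_ns=36.0):.1f} u")
    ```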

  14. The VERDI fission fragment spectrometer

    Directory of Open Access Journals (Sweden)

    Frégeau M.O.

    2013-12-01

    Full Text Available The VERDI time-of-flight spectrometer is dedicated to measurements of fission product yields and of prompt neutron emission data. Pre-neutron fission-fragment masses will be determined by the double time-of-flight (TOF) technique. For this purpose an excellent time resolution is required. The time of flight of the fragments will be measured between electrostatic mirrors located near the target and silicon detectors located 50 cm from the target on both sides. This configuration, in which the stop detector simultaneously provides the kinetic energy of the fragment and the timing information, significantly limits energy straggling in comparison to legacy experimental setups, where a thin foil was usually used as a stop detector. In order to improve the timing resolution, neutron-transmutation-doped silicon will be used. The high resistivity homogeneity of this material should significantly improve the resolution in comparison to standard silicon detectors. Post-neutron fission-fragment masses are obtained from the time of flight and the energy signal in the silicon detector. As an intermediate step, a diamond detector located very close to the target will also be used as a start detector. Previous tests have shown that poly-crystalline chemical vapour deposition (pCVD) diamonds provide a coincidence time resolution of 150 ps, which does not allow complete separation between very low-energy fission fragments, alpha particles and noise. New results obtained with artificial single-crystal diamonds (sCVD) show a time resolution similar to that of pCVD diamonds but also a sufficiently good energy resolution.

  15. Equilibrium and non-equilibrium emission of complex fragments

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, D.R.

    1989-08-01

    Complex fragment emission (Z > 2) has been studied in the reactions of 50, 80, and 100 MeV/u 139La + 12C, and 80 MeV/u 139La + 27Al, natCu, and 197Au. Charge, angle, and energy distributions were measured inclusively and in coincidence with other complex fragments, and were used to extract the source rapidities, velocity distributions, and cross sections. The experimental emission velocity distributions, charge loss distributions, and cross sections have been compared with calculations based on statistical compound nucleus decay. The binary signature of the coincidence events and the sharpness of the velocity distributions illustrate the primarily 2-body nature of the 139La + 12C reaction mechanism between 50 and 100 MeV/u. The emission velocities, angular distributions, and absolute cross sections of fragments with 20 ≤ Z ≤ 35 at 50 MeV/u, 19 ≤ Z ≤ 28 at 80 MeV/u, and 17 ≤ Z ≤ 21 at 100 MeV/u indicate that these fragments arise solely from the binary decay of compound nuclei formed in incomplete fusion reactions in which the 139La projectile picks up about one-half of the 12C target. In the 80 MeV/u 139La + 27Al, natCu, and 197Au reactions, the disappearance of the binary signature in the total charge and velocity distributions suggests an increase in the complex fragment and light charged particle multiplicity with increasing target mass. As in the 80 and 100 MeV/u 139La + 12C reactions, the lighter complex fragments exhibit anisotropic angular distributions and cross sections that are too large to be explained exclusively by statistical emission. 143 refs., 67 figs.

  16. Genetics of recent habitat contraction and reduction in population size: does isolation by distance matter?

    Science.gov (United States)

    Leblois, Raphael; Estoup, Arnaud; Streiff, Rejane

    2006-10-01

    Fragmentation and loss of natural habitats are recognized as major threats to contemporary flora and fauna. Detecting past or current reductions in population size is therefore a major aim in conservation genetics. Statistical methods developed for this purpose have tended to ignore the effects of spatial population structure. However, in many species, individual dispersal is restricted in space and fine-scale spatial structure such as isolation by distance (IBD) is commonly observed in continuous populations. Using a simulation-based approach, we investigated how comparative and single-point methods, traditionally used in a Wright-Fisher (WF) population context for detecting population size reduction, behave for IBD populations. We found that a complex 'quartet' of factors was acting that includes restricted dispersal, population size (i.e. habitat size), demographic history, and sampling scale. After habitat reduction, IBD populations were characterized by a stronger inertia in the loss of genetic diversity than WF populations. This inertia increases with the strength of IBD, and decreases when the sampling scale increases. Depending on the method used to detect a population size reduction, a local sampling can be more informative than a sample scaled to habitat size or vice versa. However, IBD structure led in numerous cases to incorrect inferences on population demographic history. The reanalysis of a real microsatellite data set of skink populations from fragmented and intact rainforest habitats confirmed most of our simulation results.

  17. Investigation of Nuclear Fragmentation in Relativistic Heavy Ion Collisions Using Plastic Nuclear Track Detectors

    CERN Multimedia

    2002-01-01

    In this experiment, CR39 plastic nuclear track detectors will be used, which are sensitive to relativistic nuclear fragments with charges Z > 5. They will be analyzed using an automatic track measuring system which was developed at the University of Siegen. This allows large quantities of tracks in these passive detectors to be measured and high-statistics experiments to be performed. We intend to measure cross sections for the production of nuclear fragments from heavy ion beams at the SPS. The energy independence of the cross sections predicted by the idea of limiting fragmentation will be tested at high energies. In exposures with different targets we plan to analyze the factorization of the fragmentation cross sections into a target-dependent factor and a factor depending on the beam particle and the fragment. The cross sections for one proton remov... Coulomb dissociation. We plan to investigate Coulomb dissociation for different targets and different energies. Fragment and projectile charges ...

  18. Fragmentation

    Science.gov (United States)

    K.H. Riitters

    2009-01-01

    Effective resource management takes into account the administrative and biophysical settings within which natural resources occur. A setting may be described in many ways; for example, by forest land ownership, by reserved and roadless designation, or by the distribution of human populations in relation to forest (chapter 3). The physical arrangement of forest in a...

  19. Numerical simulation of fragment generation from satellite breakup

    Institute of Scientific and Technical Information of China (English)

    张晓天; 贾光辉

    2014-01-01

    A numerical simulation method for fragment generation from satellite breakup is proposed, and a satellite-model breakup experiment is studied numerically. The finite element reconstruction method is a combination of the finite element method and the SPH method and can provide characteristic data for isolated fragments. By reconstructing finite elements from the SPH simulation results, confidence and non-confidence isolated fragments in the debris cloud can be distinguished effectively; combined with graph-theory methods, the element composition of each isolated fragment, as well as its size, velocity vector and mass, can be obtained. Fragment distribution information is then derived by statistics over the individual fragments. The simulated breakup-fragment data agree well with the experimental data, demonstrating the effectiveness of the method.%A numerical method for simulating fragment generation from satellite breakup is proposed and the impact case corresponding to the test is simulated. Currently, HVI numerical simulation techniques are mainly used in spacecraft protective structure analysis, and the most widely used method is SPH. In this paper, complete disintegration of the spacecraft does not occur because of the small size of the impactor. Protective-structure HVI simulation focuses on the penetration limit of the shield, while the characteristics of the individual fragments in the secondary debris cloud, such as the number of fragments and the size and mass of each fragment, are of less concern. In contrast, the purpose of a spacecraft breakup model is to provide the characteristics of individual fragments, which are also supposed to be the output of a breakup dynamics simulation. The finite element reconstruction method is a hybrid of the finite element method and the smoothed particle hydrodynamics method. The characteristics of the individual fragments can be obtained from the simulation. Confidence individual fragments can be identified by reconstructing finite elements from the smoothed particles. The size, velocity vector, and mass can be computed with the fragment statistics method based on graph theory. The fragment distribution can be obtained from the individual fragment data. The good agreement between the simulated and the experimental fragment data demonstrates the effectiveness of the method.
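
    The graph-theory step can be sketched as follows: particles (or reconstructed elements) closer together than a bonding radius are linked, the connected components of the resulting graph are labelled as individual fragments, and per-fragment mass and velocity are accumulated. The particle data, bonding radius and SciPy-based implementation below are illustrative assumptions; the paper itself reconstructs finite elements from the SPH particles before this step.

    # Minimal sketch (Python): group particles into fragments via connected components
    # of a neighbour graph and compute per-fragment mass and mass-weighted velocity.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import connected_components

    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 1.0, size=(200, 3))   # particle positions [m] (hypothetical)
    vel = rng.normal(0.0, 50.0, size=(200, 3))   # particle velocities [m/s]
    mass = np.full(200, 1.0e-4)                  # particle masses [kg]
    r_bond = 0.15                                # bonding (linking) radius [m]

    pairs = cKDTree(pos).query_pairs(r_bond, output_type="ndarray")
    adj = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(200, 200))
    n_frag, label = connected_components(adj, directed=False)

    sizes = np.bincount(label)
    for k in np.argsort(sizes)[::-1][:3]:        # report the three largest fragments
        idx = np.flatnonzero(label == k)
        m_k = mass[idx].sum()
        v_k = (mass[idx, None] * vel[idx]).sum(axis=0) / m_k   # mass-weighted velocity
        print(f"fragment {k}: {sizes[k]} particles, mass {m_k:.2e} kg, "
              f"|v| = {np.linalg.norm(v_k):.1f} m/s")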

  20. Genetic population structure of the wind-pollinated, dioecious shrub Juniperus communis in fragmented Dutch heathlands

    NARCIS (Netherlands)

    Oostermeijer, J.G.B.; de Knegt, B

    2004-01-01

    The wind-pollinated, dioecious shrub Juniperus communis L. is declining in Dutch heathlands, mainly because recruitment is scarce. Aside from ecological factors, inbreeding associated with reduced population size and isolation in the currently fragmented landscape might explain this decline.

  1. Tropical Forest Fragmentation Affects Floral Visitors but Not the Structure of Individual-Based Palm-Pollinator Networks

    Science.gov (United States)

    Dáttilo, Wesley; Aguirre, Armando; Quesada, Mauricio; Dirzo, Rodolfo

    2015-01-01

    Despite increasing knowledge about the effects of habitat loss on pollinators in natural landscapes, information is very limited regarding the underlying mechanisms of forest fragmentation affecting plant-pollinator interactions in such landscapes. Here, we used a network approach to describe the effects of forest fragmentation on the patterns of interactions involving the understory dominant palm Astrocaryum mexicanum (Arecaceae) and its floral visitors (including both effective and non-effective pollinators) at the individual level in a Mexican tropical rainforest landscape. Specifically, we asked: (i) Does fragment size affect the structure of individual-based plant-pollinator networks? (ii) Does the core of highly interacting visitor species change along the fragmentation size gradient? (iii) Does forest fragment size influence the abundance of effective pollinators of A. mexicanum? We found that fragment size did not affect the topological structure of the individual-based palm-pollinator network. Furthermore, while the composition of peripheral non-effective pollinators changed depending on fragment size, effective core generalist species of pollinators remained stable. We also observed that both abundance and variance of effective pollinators of male and female flowers of A. mexicanum increased with forest fragment size. These findings indicate that the presence of effective pollinators in the core of all forest fragments could keep the network structure stable along the gradient of forest fragmentation. In addition, pollination of A. mexicanum could be more effective in larger fragments, since the greater abundance of pollinators in these fragments may increase the amount of pollen and diversity of pollen donors between flowers of individual plants. Given the prevalence of fragmentation in tropical ecosystems, our results indicate that the current patterns of land use will have consequences on the underlying mechanisms of pollination in remnant forests

  2. Tropical forest fragmentation affects floral visitors but not the structure of individual-based palm-pollinator networks.

    Science.gov (United States)

    Dáttilo, Wesley; Aguirre, Armando; Quesada, Mauricio; Dirzo, Rodolfo

    2015-01-01

    Despite increasing knowledge about the effects of habitat loss on pollinators in natural landscapes, information is very limited regarding the underlying mechanisms of forest fragmentation affecting plant-pollinator interactions in such landscapes. Here, we used a network approach to describe the effects of forest fragmentation on the patterns of interactions involving the understory dominant palm Astrocaryum mexicanum (Arecaceae) and its floral visitors (including both effective and non-effective pollinators) at the individual level in a Mexican tropical rainforest landscape. Specifically, we asked: (i) Does fragment size affect the structure of individual-based plant-pollinator networks? (ii) Does the core of highly interacting visitor species change along the fragmentation size gradient? (iii) Does forest fragment size influence the abundance of effective pollinators of A. mexicanum? We found that fragment size did not affect the topological structure of the individual-based palm-pollinator network. Furthermore, while the composition of peripheral non-effective pollinators changed depending on fragment size, effective core generalist species of pollinators remained stable. We also observed that both abundance and variance of effective pollinators of male and female flowers of A. mexicanum increased with forest fragment size. These findings indicate that the presence of effective pollinators in the core of all forest fragments could keep the network structure stable along the gradient of forest fragmentation. In addition, pollination of A. mexicanum could be more effective in larger fragments, since the greater abundance of pollinators in these fragments may increase the amount of pollen and diversity of pollen donors between flowers of individual plants. Given the prevalence of fragmentation in tropical ecosystems, our results indicate that the current patterns of land use will have consequences on the underlying mechanisms of pollination in remnant forests.

  3. Tropical forest fragmentation affects floral visitors but not the structure of individual-based palm-pollinator networks.

    Directory of Open Access Journals (Sweden)

    Wesley Dáttilo

    Full Text Available Despite increasing knowledge about the effects of habitat loss on pollinators in natural landscapes, information is very limited regarding the underlying mechanisms of forest fragmentation affecting plant-pollinator interactions in such landscapes. Here, we used a network approach to describe the effects of forest fragmentation on the patterns of interactions involving the understory dominant palm Astrocaryum mexicanum (Arecaceae) and its floral visitors (including both effective and non-effective pollinators) at the individual level in a Mexican tropical rainforest landscape. Specifically, we asked: (i) Does fragment size affect the structure of individual-based plant-pollinator networks? (ii) Does the core of highly interacting visitor species change along the fragmentation size gradient? (iii) Does forest fragment size influence the abundance of effective pollinators of A. mexicanum? We found that fragment size did not affect the topological structure of the individual-based palm-pollinator network. Furthermore, while the composition of peripheral non-effective pollinators changed depending on fragment size, effective core generalist species of pollinators remained stable. We also observed that both abundance and variance of effective pollinators of male and female flowers of A. mexicanum increased with forest fragment size. These findings indicate that the presence of effective pollinators in the core of all forest fragments could keep the network structure stable along the gradient of forest fragmentation. In addition, pollination of A. mexicanum could be more effective in larger fragments, since the greater abundance of pollinators in these fragments may increase the amount of pollen and diversity of pollen donors between flowers of individual plants. Given the prevalence of fragmentation in tropical ecosystems, our results indicate that the current patterns of land use will have consequences on the underlying mechanisms of pollination in

  4. Sperm DNA fragmentation, recurrent implantation failure and recurrent miscarriage

    Directory of Open Access Journals (Sweden)

    Carol Coughlan

    2015-01-01

    Full Text Available Evidence is increasing that the integrity of sperm DNA may also be related to implantation failure and recurrent miscarriage (RM). To investigate this, the sperm DNA fragmentation in partners of 35 women with recurrent implantation failure (RIF) following in vitro fertilization, 16 women diagnosed with RM and seven recent fathers (control) were examined. Sperm were examined pre- and post-density centrifugation by the sperm chromatin dispersion (SCD) test and the terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) assay. There were no significant differences in the age of either partner or sperm concentration, motility or morphology between three groups. Moreover, there were no obvious differences in sperm DNA fragmentation measured by either test. However, whilst on average sperm DNA fragmentation in all groups was statistically lower in prepared sperm when measured by the SCD test, this was not seen with the results from the TUNEL assay. These results do not support the hypothesis that sperm DNA fragmentation is an important cause of RIF or RM, or that sperm DNA integrity testing has value in such patients. It also highlights significant differences between test methodologies and sperm preparation methods in interpreting the data from sperm DNA fragmentation tests.

  5. Sperm DNA fragmentation, recurrent implantation failure and recurrent miscarriage.

    Science.gov (United States)

    Coughlan, Carol; Clarke, Helen; Cutting, Rachel; Saxton, Jane; Waite, Sarah; Ledger, William; Li, Tinchiu; Pacey, Allan A

    2015-01-01

    Evidence is increasing that the integrity of sperm DNA may also be related to implantation failure and recurrent miscarriage (RM). To investigate this, the sperm DNA fragmentation in partners of 35 women with recurrent implantation failure (RIF) following in vitro fertilization, 16 women diagnosed with RM and seven recent fathers (control) were examined. Sperm were examined pre- and post-density centrifugation by the sperm chromatin dispersion (SCD) test and the terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) assay. There were no significant differences in the age of either partner or sperm concentration, motility or morphology between three groups. Moreover, there were no obvious differences in sperm DNA fragmentation measured by either test. However, whilst on average sperm DNA fragmentation in all groups was statistically lower in prepared sperm when measured by the SCD test, this was not seen with the results from the TUNEL assay. These results do not support the hypothesis that sperm DNA fragmentation is an important cause of RIF or RM, or that sperm DNA integrity testing has value in such patients. It also highlights significant differences between test methodologies and sperm preparation methods in interpreting the data from sperm DNA fragmentation tests.

  6. Efficient and accurate fragmentation methods.

    Science.gov (United States)

    Pruitt, Spencer R; Bertoni, Colleen; Brorsen, Kurt R; Gordon, Mark S

    2014-09-16

    Conspectus Three novel fragmentation methods that are available in the electronic structure program GAMESS (general atomic and molecular electronic structure system) are discussed in this Account. The fragment molecular orbital (FMO) method can be combined with any electronic structure method to perform accurate calculations on large molecular species with no reliance on capping atoms or empirical parameters. The FMO method is highly scalable and can take advantage of massively parallel computer systems. For example, the method has been shown to scale nearly linearly on up to 131 000 processor cores for calculations on large water clusters. There have been many applications of the FMO method to large molecular clusters, to biomolecules (e.g., proteins), and to materials that are used as heterogeneous catalysts. The effective fragment potential (EFP) method is a model potential approach that is fully derived from first principles and has no empirically fitted parameters. Consequently, an EFP can be generated for any molecule by a simple preparatory GAMESS calculation. The EFP method provides accurate descriptions of all types of intermolecular interactions, including Coulombic interactions, polarization/induction, exchange repulsion, dispersion, and charge transfer. The EFP method has been applied successfully to the study of liquid water, π-stacking in substituted benzenes and in DNA base pairs, solvent effects on positive and negative ions, electronic spectra and dynamics, non-adiabatic phenomena in electronic excited states, and nonlinear excited state properties. The effective fragment molecular orbital (EFMO) method is a merger of the FMO and EFP methods, in which interfragment interactions are described by the EFP potential, rather than the less accurate electrostatic potential. The use of EFP in this manner facilitates the use of a smaller value for the distance cut-off (Rcut). Rcut determines the distance at which EFP interactions replace fully quantum

  7. Laboratory Photo-chemistry of PAHs: Ionization versus Fragmentation

    CERN Document Server

    Zhen, Junfeng; Paardekooper, Daniel M; Ligterink, Niels; Linnartz, Harold; Nahon, Laurent; Joblin, Christine; Tielens, Alexander G G M

    2015-01-01

    Interstellar polycyclic aromatic hydrocarbons (PAHs) are expected to be strongly processed by vacuum ultraviolet photons. Here, we report experimental studies on the ionization and fragmentation of coronene (C24H12), ovalene (C32H14) and hexa-peri-hexabenzocoronene (HBC; C42H18) cations by exposure to synchrotron radiation in the range of 8-40 eV. The results show that for small PAH cations such as coronene, fragmentation (H-loss) is more important than ionization. However, as the size increases, ionization becomes more and more important and for the HBC cation, ionization dominates. These results are discussed and it is concluded that, for large PAHs, fragmentation only becomes important when the photon energy has reached the highest ionization potential accessible. This implies that PAHs are even more photo-stable than previously thought. The implications of this experimental study for the photo-chemical evolution of PAHs in the interstellar medium are briefly discussed.

  8. Modelling rock fragmentation of Extremely Energetic Rockfalls

    Science.gov (United States)

    De Blasio, Fabio; Dattola, Giuseppe; Battista Crosta, Giovanni

    2017-04-01

    Extremely energetic rockfalls (EER) are phenomena in which the combination of a large volume (at least some thousands of m3) and a free-fall height of hundreds of metres results in a large released energy. We fix a threshold value of around 1/50 of a kiloton to define such events. Documented examples include several events of different size in the Alps (Dru, 2005, 2011, 265,000 and 59,200 m3; val Fiscalina - Cima Una, 2007, 40,000 m3; Thurwieser, 2004, ca 2 Mm3; Cengalo, 2011, 1.5×10^5 m3 and in 2016, in Switzerland; Civetta, 2013, ca 50,000 m3), in the Apennines (Gran Sasso, 2006, 30,000 m3), the Rocky Mountains (Yosemite, Happy Isles, 38,000 m3), and the Himalaya. EERs may become more frequent on steep and sharp mountain peaks as a consequence of permafrost thawing at higher altitudes. In contrast to low-energy rockfalls, where block disintegration is limited, in EERs the impact after free fall causes an immediate and efficient release of energy, much like an explosion. The severe disintegration of the rock and the corresponding air blast are capable of snapping trees many hundreds of metres ahead of the fall area. Pulverized rock at high speed can abrade tree logs, and the resulting suspension flow may travel much farther than the impact zone, blanketing vast surrounding areas. Using both published accounts of some of these events and data collected directly for some of them, we present some basic models describing the processes involved, based on analogies with explosions and explosive fragmentation. Of the initial energy, one part is used up in the rock disintegration, and the rest is shared between the shock wave and the air blast. The fragmentation energy is calculated by fitting the dust size spectrum with different probabilistic distribution laws, defining a surface energy, and considering the strain rate involved. We find that the fragmentation energy is around one third of the initial boulder energy. Finally, we evaluate the velocity of the
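
    The surface-energy estimate can be illustrated with a simple budget: the new surface area created when a boulder is reduced to fragments following the fitted size distribution is multiplied by an effective specific surface energy and compared with the potential energy of the fall. Every number below (density, surface energy, size bins, fall height) is a hypothetical placeholder rather than a value from the study.

    # Minimal sketch (Python): fragmentation-energy budget from a fragment size spectrum.
    import numpy as np

    rho = 2700.0              # rock density [kg/m^3]
    gamma_s = 1000.0          # effective specific surface energy [J/m^2] (assumed)
    boulder_volume = 1.0e4    # initial boulder volume [m^3]

    # Hypothetical fitted distribution: mass fractions in logarithmic size bins
    d = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0])   # representative fragment diameters [m]
    f = np.array([0.05, 0.15, 0.30, 0.30, 0.20])  # mass fraction per bin (sums to 1)

    # For spheres of diameter d the surface area per unit volume is 6/d
    new_area = boulder_volume * np.sum(f * 6.0 / d)
    e_frag = gamma_s * new_area

    h, g = 500.0, 9.81        # fall height [m], gravity [m/s^2]
    e_fall = rho * boulder_volume * g * h
    print(f"fragmentation energy ~ {e_frag:.2e} J, fall energy ~ {e_fall:.2e} J, "
          f"ratio ~ {e_frag / e_fall:.2f}")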

  9. Disentangling the drivers of reduced long-distance seed dispersal by birds in an experimentally fragmented landscape.

    Science.gov (United States)

    Uriarte, María; Anciães, Marina; da Silva, Mariana T B; Rubim, Paulo; Johnson, Erik; Bruna, Emilio M

    2011-04-01

    Seed dispersal is a crucial component of plant population dynamics. Human landscape modifications, such as habitat destruction and fragmentation, can alter the abundance of fruiting plants and animal dispersers, foraging rates, vector movement, and the composition of the disperser community, all of which can singly or in concert affect seed dispersal. Here, we quantify and tease apart the effects of landscape configuration, namely, fragmentation of primary forest and the composition of the surrounding forest matrix, on individual components of seed dispersal of Heliconia acuminata, an Amazonian understory herb. First we identified the effects of landscape configuration on the abundance of fruiting plants and six bird disperser species. Although highly variable in space and time, densities of fruiting plants were similar in continuous forest and fragments. However, the two largest-bodied avian dispersers were less common or absent in small fragments. Second, we determined whether fragmentation affected foraging rates. Fruit removal rates were similar and very high across the landscape, suggesting that Heliconia fruits are a key resource for small frugivores in this landscape. Third, we used radiotelemetry and statistical models to quantify how landscape configuration influences vector movement patterns. Bird dispersers flew farther and faster, and perched longer in primary relative to secondary forests. One species also altered its movement direction in response to habitat boundaries between primary and secondary forests. Finally, we parameterized a simulation model linking data on fruit density and disperser abundance and behavior with empirical estimates of seed retention times to generate seed dispersal patterns in two hypothetical landscapes. Despite clear changes in bird movement in response to landscape configuration, our simulations demonstrate that these differences had negligible effects on dispersal distances. However, small fragments had reduced densities
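
    The final simulation step can be caricatured with a short Monte Carlo: a seed's net displacement is the sum of flight steps taken between perches until a randomly drawn gut-retention time runs out, so longer perch times (slower movement) shorten dispersal. All rates and distributions below are hypothetical placeholders for the empirical estimates used in the study.

    # Minimal sketch (Python): dispersal distances from perch/flight behaviour and
    # seed-retention times (flights treated as instantaneous, directions uncorrelated).
    import numpy as np

    rng = np.random.default_rng(1)

    def dispersal_distances(mean_retention_min=20.0, mean_perch_min=3.0,
                            mean_flight_m=40.0, n_seeds=10_000):
        out = np.empty(n_seeds)
        for i in range(n_seeds):
            t_left = rng.exponential(mean_retention_min)   # gut-retention time [min]
            x = np.zeros(2)
            while True:
                t_left -= rng.exponential(mean_perch_min)  # time spent perched
                if t_left <= 0.0:                          # seed deposited while perched
                    break
                step = rng.exponential(mean_flight_m)      # flight step length [m]
                theta = rng.uniform(0.0, 2.0 * np.pi)
                x += step * np.array([np.cos(theta), np.sin(theta)])
            out[i] = np.hypot(x[0], x[1])
        return out

    for perch in (2.0, 5.0):   # shorter perches mimic faster movement in primary forest
        dist = dispersal_distances(mean_perch_min=perch)
        print(f"mean perch {perch} min: median {np.median(dist):.0f} m, "
              f"95th percentile {np.percentile(dist, 95):.0f} m")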

  10. Shifts in resident bird communities associated with cloud forest patch size in Central Veracruz, Mexico

    National Research Council Canada - National Science Library

    Rafael Rueda-Hernandez; Ian MacGregor-Fors; Katherine Renton

    2015-01-01

    .... We evaluated species richness, bird community density, community composition, and dominance as indicators of the response to fragment size in a fragmented cloud forest landscape in central Veracruz, Mexico...

  11. The relative influence of habitat loss and fragmentation: do tropical mammals meet the temperate paradigm?

    Science.gov (United States)

    Thornton, Daniel H; Branch, Lyn C; Sunquist, Melvin E

    2011-09-01

    The relative influence of habitat loss vs. habitat fragmentation per se (the breaking apart of habitat) on species distribution and abundance is a topic of debate. Although some theoretical studies predict a strong negative effect of fragmentation, consensus from empirical studies is that habitat fragmentation has weak effects compared with habitat loss and that these effects are as likely to be positive as negative. However, few empirical investigations of this issue have been conducted on tropical or wide-ranging species that may be strongly influenced by changes in patch size and edge that occur with increasing fragmentation. We tested the relative influence of habitat loss and fragmentation by examining occupancy of forest patches by 20 mid- and large-sized Neotropical mammal species in a fragmented landscape of northern Guatemala. We related patch occupancy of mammals to measures of habitat loss and fragmentation and compared the influence of these two factors while controlling for patch-level variables. Species responded strongly to both fragmentation and loss, and response to fragmentation generally was negative. Our findings support previous assumptions that conservation of large mammals in the tropics will require conservation strategies that go beyond prevention of habitat loss to also consider forest cohesion or other aspects of landscape configuration.

  12. Fragment separator momentum compression schemes

    Energy Technology Data Exchange (ETDEWEB)

    Bandura, Laura, E-mail: bandura@anl.gov [Facility for Rare Isotope Beams (FRIB), 1 Cyclotron, East Lansing, MI 48824-1321 (United States); National Superconducting Cyclotron Lab, Michigan State University, 1 Cyclotron, East Lansing, MI 48824-1321 (United States); Erdelyi, Bela [Argonne National Laboratory, Argonne, IL 60439 (United States); Northern Illinois University, DeKalb, IL 60115 (United States); Hausmann, Marc [Facility for Rare Isotope Beams (FRIB), 1 Cyclotron, East Lansing, MI 48824-1321 (United States); Kubo, Toshiyuki [RIKEN Nishina Center, RIKEN, Wako (Japan); Nolen, Jerry [Argonne National Laboratory, Argonne, IL 60439 (United States); Portillo, Mauricio [Facility for Rare Isotope Beams (FRIB), 1 Cyclotron, East Lansing, MI 48824-1321 (United States); Sherrill, Bradley M. [National Superconducting Cyclotron Lab, Michigan State University, 1 Cyclotron, East Lansing, MI 48824-1321 (United States)

    2011-07-21

    We present a scheme to use a fragment separator and profiled energy degraders to transfer longitudinal phase space into transverse phase space while maintaining achromatic beam transport. The first order beam optics theory of the method is presented and the consequent enlargement of the transverse phase space is discussed. An interesting consequence of the technique is that the first order mass resolving power of the system is determined by the first dispersive section up to the energy degrader, independent of whether or not momentum compression is used. The fragment separator at the Facility for Rare Isotope Beams is a specific application of this technique and is described along with simulations by the code COSY INFINITY.

  13. Fragment separator momentum compression schemes.

    Energy Technology Data Exchange (ETDEWEB)

    Bandura, L.; Erdelyi, B.; Hausmann, M.; Kubo, T.; Nolen, J.; Portillo, M.; Sherrill, B.M. (Physics); (MSU); (Northern Illinois Univ.); (RIKEN)

    2011-07-21

    We present a scheme to use a fragment separator and profiled energy degraders to transfer longitudinal phase space into transverse phase space while maintaining achromatic beam transport. The first order beam optics theory of the method is presented and the consequent enlargement of the transverse phase space is discussed. An interesting consequence of the technique is that the first order mass resolving power of the system is determined by the first dispersive section up to the energy degrader, independent of whether or not momentum compression is used. The fragment separator at the Facility for Rare Isotope Beams is a specific application of this technique and is described along with simulations by the code COSY INFINITY.

  14. Fragment correlations from NAUTILUS multidetector

    Energy Technology Data Exchange (ETDEWEB)

    Bizard, G. [Caen Univ., 14 (France). Lab. de Physique Corpusculaire

    1995-12-31

    It is shown with a few examples how heavy fragment correlations, induced either by conservation laws or by Coulomb interaction, can provide physical information on nuclear reactions. All the experimental data discussed have been obtained at GANIL using the NAUTILUS gaseous multidetectors DELF and XYZT, which - due to their good spatial and time resolution and their large solid angle coverage - have proved to be efficient tools for multifragment correlation studies. Different reactions of Ar, Kr, Xe, and Pb beams on Au targets are discussed. It is shown that velocity and angular correlations between fragments provide a powerful clock to scrutinize the details of the decay history of hot nuclei. (K.A.). 18 refs., 6 figs.

  15. Fragmentation in filamentary molecular clouds

    CERN Document Server

    Contreras, Yanett; Rathborne, Jill M; Sanhueza, Patricio

    2015-01-01

    Recent surveys of dust continuum emission at sub-mm wavelengths have shown that filamentary molecular clouds are ubiquitous along the Galactic plane. These structures are inhomogeneous, with over-densities that are sometimes associated with infrared emission and active star formation. Investigating the connection between filaments and star formation requires an understanding of the processes that lead to the fragmentation of filaments and a determination of the physical properties of the over-densities (clumps). In this paper, we present a multi-wavelength study of five filamentary molecular clouds, containing several clumps in different evolutionary stages of star formation. We analyse the fragmentation of the filaments and derive the physical properties of their clumps. We find that the clumps in all filaments have a characteristic spacing consistent with the prediction of the `sausage' instability theory, regardless of the complex morphology of the filaments or their evolutionary stage. We also find t...

  16. Fragmentering og korridorer i landskabet

    DEFF Research Database (Denmark)

    Hammershøj, M.; Madsen, A. B.

    The report contains a literature review based on an analysis of the available national and international literature on fragmentation and corridors in the botanical and zoological fields. A total of 1,063 titles form the basis of the review. The review has shown that fragmentation of habitats results in a reduction and isolation of many plant and animal populations. It has also been shown that corridors function as habitats, which contributes to an area with corridors being able to support more species and individuals than a corresponding area without corridors. However, unambiguous evidence is still lacking that corridors can be of decisive importance for the recolonisation of habitats from which a given species has disappeared. Finally, a list of research needs and a number of recommendations are given.

  17. Asymmetry effects in fragment production

    Science.gov (United States)

    Kaur, Manpreet; Kaur, Varinderjit

    2016-05-01

    The production of different fragments has been studied by taking into account the mass asymmetry of the reaction and employing momentum-dependent interactions. Two different sets of asymmetric reactions have been analyzed while keeping Atotal fixed, using a soft momentum-dependent equation of state. Our results indicate that the impact of momentum-dependent interactions is different in lighter projectile systems as compared to heavier ones. The comparative analysis of IQMD simulations with the experimental data in the case of a heavier projectile and lighter target system, for the reaction 197Au + 27Al (η = 0.7) at E = 600 MeV/nucleon, shows that with the inclusion of MDI we are able, to some extent, to reproduce the experimental universality of the rise and fall of intermediate mass fragments (IMFs).

  18. Beyond the fragmentation threshold hypothesis: regime shifts in biodiversity across fragmented landscapes.

    Directory of Open Access Journals (Sweden)

    Renata Pardini

    Full Text Available Ecological systems are vulnerable to irreversible change when key system properties are pushed over thresholds, resulting in the loss of resilience and the precipitation of a regime shift. Perhaps the most important of such properties in human-modified landscapes is the total amount of remnant native vegetation. In a seminal study, Andrén proposed the existence of a fragmentation threshold in the total amount of remnant vegetation, below which landscape-scale connectivity is eroded and local species richness and abundance become dependent on patch size. Despite the fact that species patch-area effects have been a mainstay of conservation science, there has yet to be a robust empirical evaluation of this hypothesis. Here we present and test a new conceptual model describing the mechanisms and consequences of biodiversity change in fragmented landscapes, identifying the fragmentation threshold as a first step in a positive feedback mechanism that has the capacity to impair ecological resilience, and drive a regime shift in biodiversity. The model considers that local extinction risk is defined by patch size, and immigration rates by landscape vegetation cover, and that the recovery from local species losses depends upon the landscape species pool. Using a unique dataset on the distribution of non-volant small mammals across replicate landscapes in the Atlantic forest of Brazil, we found strong evidence for our model predictions: patch-area effects are evident only at intermediate levels of total forest cover, where landscape diversity is still high and opportunities for enhancing biodiversity through local management are greatest. Furthermore, high levels of forest loss can push native biota through an extinction filter, and result in the abrupt, landscape-wide loss of forest-specialist taxa, ecological resilience and management effectiveness. The proposed model links hitherto distinct theoretical approaches within a single framework
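
    The verbal model above (local extinction risk set by patch size, immigration by landscape cover, recovery by the landscape species pool) can be caricatured as a toy fixed-point iteration in which colonisation feeds back on the mean patch occupancy; below a cover threshold the feedback tips the whole landscape to an empty state. All constants below are hypothetical and chosen only to illustrate the qualitative behaviour, not fitted to the small-mammal data.

    # Minimal sketch (Python): patch occupancy with extinction ~ 1/area and colonisation
    # proportional to forest cover times the landscape species pool (mean occupancy).
    import numpy as np

    areas = np.array([0.5, 2.0, 8.0, 32.0])   # patch sizes [ha] (hypothetical)
    e = 1.0 / areas                           # local extinction rate ~ 1/area
    c0 = 1.5                                  # colonisation scale (assumed)

    def equilibrium_occupancy(cover, n_iter=500):
        p = np.full(areas.shape, 0.9)         # start from a nearly intact landscape
        for _ in range(n_iter):
            c = c0 * cover * p.mean()         # species-pool feedback
            p = c / (c + e)
        return p

    for cover in (0.6, 0.3, 0.1, 0.05):
        print(f"cover {cover:.0%}: occupancy for {areas} ha patches -> "
              f"{np.round(equilibrium_occupancy(cover), 2)}")

    With these constants, patch-area effects are strongest at intermediate cover, and below roughly 6% cover in this toy parameterisation all patch sizes empty out, which is the qualitative regime shift the model describes.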

  19. Population genetics at three spatial scales of a rare sponge living in fragmented habitats

    Directory of Open Access Journals (Sweden)

    Uriz Maria J

    2010-01-01

    Full Text Available Abstract Background Rare species have seldom been studied in marine habitats, mainly because it is difficult to formally assess the status of rare species, especially in patchy benthic organisms, for which samplings are often assumed to be incomplete and, thus, inappropriate for establishing the real abundance of the species. However, many marine benthic invertebrates can be considered rare, due to the fragmentation and rarity of suitable habitats. Consequently, studies on the genetic connectivity of rare species in fragmented habitats are basic for assessing their risk of extinction, especially in the context of increased habitat fragmentation by human activities. Sponges are suitable models for studying the intra- and inter-population genetic variation of rare invertebrates, as they produce lecithotrophic larvae and are often found in fragmented habitats. Results We investigated the genetic structure of a Mediterranean sponge, Scopalina lophyropoda (Schmidt), using the allelic size variation of seven specific microsatellite loci. The species can be classified as "rare" because of its strict habitat requirements, the low number of individuals per population, and the relatively small size of its distribution range. It also presents a strong patchy distribution, philopatric larval dispersal, and both sexual and asexual reproduction. Classical genetic-variance-based methods (AMOVA and differentiation statistics) revealed that the genetic diversity of S. lophyropoda was structured at the three spatial scales studied: within populations, between populations of a geographic region, and between isolated geographic regions, although some stochastic gene flow might occur among populations within a region. The genetic structure followed an isolation-by-distance pattern according to the Mantel test. However, despite philopatric larval dispersal and fission events in the species, no single population showed inbreeding, and the contribution of clonality to the
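
    The isolation-by-distance result above rests on a Mantel test, a permutation test of the correlation between pairwise genetic and geographic distance matrices. The sketch below implements that procedure on synthetic matrices; the data, the Pearson correlation statistic and the number of permutations are illustrative assumptions, not the study's data or software.

    # Minimal sketch (Python): Mantel test on synthetic genetic/geographic distances.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 15                                           # number of populations (hypothetical)
    coords = rng.uniform(0.0, 100.0, size=(n, 2))    # population coordinates [km]
    geo = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    gen = 0.002 * geo + rng.normal(0.0, 0.05, size=(n, n))   # fake IBD signal plus noise
    gen = (gen + gen.T) / 2.0
    np.fill_diagonal(gen, 0.0)

    iu = np.triu_indices(n, k=1)                     # each pair counted once

    def mantel(a, b, n_perm=999):
        """Pearson correlation of pairwise distances with a permutation p-value."""
        r_obs = np.corrcoef(a[iu], b[iu])[0, 1]
        hits = 0
        for _ in range(n_perm):
            perm = rng.permutation(n)
            if np.corrcoef(a[np.ix_(perm, perm)][iu], b[iu])[0, 1] >= r_obs:
                hits += 1
        return r_obs, (hits + 1) / (n_perm + 1)

    r, p = mantel(gen, geo)
    print(f"Mantel r = {r:.2f}, one-sided p = {p:.3f}")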

  20. The fragmentation of Kosmos 2163

    Science.gov (United States)

    1992-01-01

    On 6 Dec. 1991 Kosmos 2163, a maneuverable Soviet spacecraft which had been in orbit for 58 days, experienced a major breakup at an altitude of approximately 210 km. Although numerous pieces of debris were created, the fragments decayed rapidly leaving no long-term impact on the near-Earth environment. The assessed cause of the event is the deliberate detonation of an explosive device. Details of this event are presented.