Collective probabilities algorithm for surface hopping calculations
International Nuclear Information System (INIS)
Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto
2003-01-01
General equations that the transition probabilities of hopping algorithms in surface hopping calculations must obey to assure equality between the average quantum and classical populations are derived. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
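As a hedged sketch, the fewest-switches hop decision the abstract refers to can be written as below; the density matrix, coupling values, and time step are illustrative numbers, and the expression follows the standard Tully (1990) form, not the CP or IP algorithms proposed in the paper:

```python
import numpy as np

def fs_hop_probability(rho, d_dot_v, dt, k, j):
    """Fewest-switches probability of hopping from state k to state j in
    one time step dt: g = max(0, 2*Re(rho_jk * (d_kj . v)) * dt / rho_kk)."""
    b = 2.0 * np.real(rho[j, k] * d_dot_v[k, j])
    return max(0.0, b * dt / np.real(rho[k, k]))

# illustrative two-state electronic density matrix and nonadiabatic
# coupling term d_kj . v (antisymmetric), in arbitrary units
rho = np.array([[0.9 + 0.0j, 0.1 + 0.05j],
                [0.1 - 0.05j, 0.1 + 0.0j]])
d_dot_v = np.array([[0.0, 0.3],
                    [-0.3, 0.0]])

g01 = fs_hop_probability(rho, d_dot_v, dt=0.1, k=0, j=1)
rng = np.random.default_rng(0)
hop = rng.random() < g01   # stochastic hop decision for one trajectory
```

Each trajectory draws its own random number, which is exactly the per-trajectory independence that distinguishes FS from the collective CP scheme.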
Incorporating Skew into RMS Surface Roughness Probability Distribution
Stahl, Mark T.; Stahl, H. Philip.
2013-01-01
The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data, we confirm that the Gaussian distribution overestimates the mode and that application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
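A toy numeric illustration of the point, assuming a right-skewed (log-normal) sample of RMS roughness values; the sample is an assumption for illustration, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical RMS roughness measurements (nm); a log-normal sample
# gives the right skew seen in polished surfaces
rms = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=20000)

# Gaussian treatment: the most probable value is taken to be the mean
gaussian_mode = rms.mean()

# mode estimated directly from the histogram peak
counts, edges = np.histogram(rms, bins=200)
i = np.argmax(counts)
hist_mode = 0.5 * (edges[i] + edges[i + 1])

# the symmetric fit overestimates the most probable roughness
overestimate = gaussian_mode - hist_mode
```

For a log-normal the mean exp(mu + sigma^2/2) always exceeds the mode exp(mu - sigma^2), which is the skew effect the abstract describes.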
Ahearn, Elizabeth A.; Lombard, Pamela J.
2014-01-01
for the 10-, 2-, 1-, or 0.2-percent annual exceedance probabilities. The simulated water-surface elevations for the August 2011 flood equal the elevations of State Route 12A about 500 ft downstream of Thurston Hill Road adjacent to the troughs between the rearing ponds. Four flood mitigation alternatives being considered by the Vermont Agency of Transportation to improve the hydraulic performance of Flint Brook and reduce the risk of flooding at the hatchery include: (A) no changes to the infrastructure or existing alignment of Flint Brook (existing conditions [2014]), (B) structural changes to the bridges and the existing retaining wall along Flint Brook, (C) realignment of Flint Brook to flow along the south side of Oxbow Road to accommodate larger stream discharges, and (D) a diversion channel for flows greater than the 1-percent annual exceedance probability. Although the 10-, 2-, and 1-percent AEP floods do not flood the hatchery under alternative A (no changes to the infrastructure), the 0.2-percent AEP flow still poses a flooding threat to the hatchery because flow will continue to overtop the existing retaining wall and flood the hatchery. Under the other mitigation alternatives (B, C, and D) that include some variation of structural changes to bridges, a retaining wall, and (or) channel, the peak discharges for the 10-, 2-, 1-, and 0.2-percent annual exceedance probabilities do not flood the hatchery. Water-surface profiles and flood inundation maps of the August 2011 flood and the 10-, 2-, 1-, and 0.2-percent AEPs for four mitigation alternatives were developed for Flint Brook and the Third Branch White River in the vicinity of the hatchery and can be used by Federal, State, and local agencies to better understand the potential for future flooding at the hatchery.
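The annual exceedance probabilities (AEPs) quoted above convert to a cumulative risk over a planning horizon by the standard assumption of independent years; the 30-year horizon below is illustrative:

```python
# Chance of at least one exceedance of a given flood magnitude over a
# planning horizon, from the annual exceedance probability (AEP):
# P(n years) = 1 - (1 - AEP)**n, assuming independent years.
def exceedance_risk(aep, years):
    return 1.0 - (1.0 - aep) ** years

risk_100yr_flood_30yr = exceedance_risk(0.01, 30)    # 1-percent AEP flood
risk_500yr_flood_30yr = exceedance_risk(0.002, 30)   # 0.2-percent AEP flood
```

Even the "500-year" (0.2-percent AEP) flood carries a roughly 6 percent chance of occurring at least once in 30 years, which is why it still drives the mitigation analysis.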
Maximization of regional probabilities using Optimal Surface Graphs
DEFF Research Database (Denmark)
Arias Lorza, Andres M.; Van Engelen, Arna; Petersen, Jens
2018-01-01
Purpose: We present a segmentation method that maximizes regional probabilities enclosed by coupled surfaces using an Optimal Surface Graph (OSG) cut approach. This OSG cut determines the globally optimal solution given a graph constructed around an initial surface. While most methods for vessel...... wall segmentation only use edge information, we show that maximizing regional probabilities using an OSG improves the segmentation results. We applied this to automatically segment the vessel wall of the carotid artery in magnetic resonance images. Methods: First, voxel-wise regional probability maps...... were obtained using a Support Vector Machine classifier trained on local image features. Then, the OSG segments the regions which maximizes the regional probabilities considering smoothness and topological constraints. Results: The method was evaluated on 49 carotid arteries from 30 subjects...
Shiryaev, A N
1996-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.
Probability of misclassifying biological elements in surface waters.
Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna
2017-11-24
Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. The fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to the WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status-dependent corrective actions.
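A minimal sketch of the Monte-Carlo misclassification estimate described above, with hypothetical class boundaries and an assumed Gaussian measurement error (not the study's actual metrics or error models):

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical class boundaries for a normalized biological index on
# [0, 1] (five status classes in WFD style)
boundaries = np.array([0.2, 0.4, 0.6, 0.8])

def classify(x):
    return np.searchsorted(boundaries, x)

true_value = 0.62    # "true" index value, close to a class boundary
sigma = 0.08         # assumed measurement standard deviation
true_class = classify(true_value)

# long M-C series of error-prone measurements; the misclassification
# probability is the fraction that lands outside the "true" class
samples = rng.normal(true_value, sigma, size=100_000)
p_misclass = np.mean(classify(samples) != true_class)
```

Because the true value sits near a boundary, the misclassification probability is large (~0.41 here), which is exactly why indices measured near class limits dominate the uncertainty of the reported status.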
Ozone-surface reactions in five homes: surface reaction probabilities, aldehyde yields, and trends.
Wang, H; Morrison, G
2010-06-01
Field experiments were conducted in five homes during three seasons (summer 2005, summer 2006 and winter 2007) to quantify ozone-initiated secondary aldehyde yields, surface reaction probabilities, and any temporal trends over a 1.5-year interval. Surfaces examined include living room carpets, bedroom carpets, kitchen floors, kitchen counters, and living room walls. Reaction probabilities for all surfaces for all seasons ranged from 9.4 x 10(-8) to 1.0 x 10(-4). There were no significant temporal trends in reaction probabilities for any surfaces from summer 2005 to summer 2006, nor over the entire 1.5-year period, indicating that it may take significantly longer than this period for surfaces to exhibit any 'ozone aging' or lowering of ozone-surface reactivity. However, all surfaces in three houses exhibited a significant decrease in reaction probabilities from summer 2006 to winter 2007. The total yield of aldehydes for the summer of 2005 was nearly identical to that for the summer of 2006, but was significantly higher than for winter 2007. We also observed that older carpets were consistently less reactive than newer carpets, but that countertops remained consistently reactive, probably because of occupant activities such as cooking and cleaning. Ozone reactions taking place at indoor surfaces significantly influence personal exposure to ozone and volatile reaction products. These field studies show that indoor surfaces only slowly lose their ability to react with ozone over several-year time frames, and that this is probably because of a combination of large reservoirs of reactive coatings and periodic additions of reactive coatings in the form of cooking, cleaning, and skin-oil residues. When considering exposure to ozone and its reaction products and in the absence of dramatic changes in occupancy, activities or furnishings, indoor surface reactivity is expected to change very slowly.
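Reaction probabilities of this kind are conventionally related to a measured ozone deposition velocity through the Boltzmann mean molecular speed, gamma = 4*v_d/<v>. A sketch with a hypothetical deposition velocity, neglecting gas-phase transport limitations:

```python
import math

# Reaction probability (uptake coefficient) from a measured ozone
# deposition velocity: gamma = 4 * v_d / <v>, where <v> is the
# Boltzmann mean molecular speed of ozone, sqrt(8*k_B*T / (pi*m)).
k_B = 1.380649e-23                 # J/K
M_O3 = 48.0e-3 / 6.02214076e23     # kg per O3 molecule

def mean_speed(T):
    return math.sqrt(8.0 * k_B * T / (math.pi * M_O3))

def reaction_probability(v_d, T=296.0):
    return 4.0 * v_d / mean_speed(T)

# hypothetical deposition velocity of 0.05 cm/s for a carpet surface
gamma = reaction_probability(0.05e-2)
```

The resulting gamma of a few 10^-6 falls inside the 9.4 x 10(-8) to 1.0 x 10(-4) range reported in the abstract; note the transport-free relation used here is an assumption of the sketch.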
International Nuclear Information System (INIS)
Hidalgo, J. C.; Polnarev, A. G.
2009-01-01
In this paper we derive the probability of the radial profiles of spherically symmetric inhomogeneities in order to provide an improved estimation of the number density of primordial black holes (PBHs). We demonstrate that the probability of PBH formation depends sensitively on the radial profile of the initial configuration. We do this by characterizing this profile with two parameters chosen heuristically: the amplitude of the inhomogeneity and the second radial derivative, both evaluated at the center of the configuration. We calculate the joint probability of initial cosmological inhomogeneities as a function of these two parameters and then find a correspondence between these parameters and those used in numerical computations of PBH formation. Finally, we extend our heuristic study to evaluate the probability of PBH formation taking into account for the first time the radial profile of curvature inhomogeneities.
International Nuclear Information System (INIS)
Li Hai-Xia; Cheng Chuan-Fu
2011-01-01
We study the light scattering of an orthogonal anisotropic rough surface with secondary most probable slope distribution. It is found that the scattered intensity profiles have obvious secondary maxima, and in the direction perpendicular to the plane of incidence, the secondary maxima are oriented in a curve on the observation plane, which is called the orientation curve. By numerical calculation of the scattering wave fields with the height data of the sample, it is validated that the secondary maxima are induced by the side face element, which constitutes the prismoid structure of the anisotropic surface. We derive the equation of the quadratic orientation curve. Experimentally, we construct the system for light scattering measurement using a CCD. The scattered intensity profiles are extracted from the images at different angles of incidence along the orientation curves. The experimental results conform to the theory.
Chasmer, L.; Hopkinson, C.; Gynan, C.; Mahoney, C.; Sitar, M.
2015-12-01
Airborne and terrestrial lidar are increasingly used in forest attribute modeling for carbon, ecosystem and resource monitoring. The near infra-red wavelength at 1064nm has been utilised most in airborne applications due to, for example, diode manufacture costs, surface reflectance and eye safety. Foliage reflects well at 1064nm and most of the literature on airborne lidar forest structure is based on data from this wavelength. However, lidar systems also operate at wavelengths further from the visible spectrum (e.g. 1550nm) for eye safety reasons. This corresponds to a water absorption band and can be sensitive to attenuation if surfaces contain moisture. Alternatively, some systems operate in the visible range (e.g. 532nm) for specialised applications requiring simultaneous mapping of terrestrial and bathymetric surfaces. All these wavelengths provide analogous 3D canopy structure reconstructions and thus offer the potential to be combined for spatial comparisons or temporal monitoring. However, a systematic comparison of wavelength-dependent foliage profile and gap probability (index of transmittance) is needed. Here we report on two multispectral lidar missions carried out in 2013 and 2015 over conifer, deciduous and mixed stands in Ontario, Canada. The first used separate lidar sensors acquiring comparable data at three wavelengths, while the second used a single sensor with 3 integrated laser systems. In both cases, wavelengths sampled were 532nm, 1064nm and 1550nm. The experiment revealed significant differences in proportions of returns at ground level, the vertical foliage distribution and gap probability across wavelengths. Canopy attenuation was greatest at 532nm due to photosynthetic plant tissue absorption. Relative to 1064nm, foliage was systematically undersampled at the 10% to 60% height percentiles at both 1550nm and 532nm (this was confirmed with coincident terrestrial lidar data). When using all returns to calculate gap probability, all
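Gap-probability profiles of the kind compared above can be sketched from discrete return heights; the return values below are illustrative, and the simple return-counting estimator is an assumption (real processing typically weights by return intensity or pulse energy):

```python
import numpy as np

# Vertical gap-probability (transmittance) profile from discrete lidar
# return heights: P_gap(z) = 1 - N(returns above z) / N(total pulses).
# Returns at height 0.0 are ground hits.
returns = np.array([0.0, 0.0, 2.5, 7.0, 12.0, 15.0, 18.0, 21.0, 0.0, 9.5])
z_levels = np.array([0.0, 5.0, 10.0, 15.0, 20.0])

p_gap = 1.0 - np.array([(returns > z).sum() for z in z_levels]) / returns.size
```

P_gap at z = 0 is the ground-return fraction, and the profile must be non-decreasing with height; wavelength differences of the kind the abstract reports would show up as different p_gap profiles from the same stand.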
Airborne Surface Profiling of Alaskan Glaciers
National Oceanic and Atmospheric Administration, Department of Commerce — This data set consists of glacier outline, laser altimetry profile, and surface elevation change data for 46 glaciers in Alaska and British Columbia, Canada,...
SURFACE BRIGHTNESS PROFILES OF DWARF GALAXIES. I. PROFILES AND STATISTICS
International Nuclear Information System (INIS)
Herrmann, Kimberly A.; Hunter, Deidre A.; Elmegreen, Bruce G.
2013-01-01
Radial surface brightness profiles of spiral galaxies are classified into three types: (I) single exponential, or the light falls off with one exponential to a break before falling off (II) more steeply, or (III) less steeply. Profile breaks are also found in dwarf disks, but some dwarf Type IIs are flat or increasing out to a break before falling off. Here we re-examine the stellar disk profiles of 141 dwarfs: 96 dwarf irregulars (dIms), 26 Blue Compact Dwarfs (BCDs), and 19 Magellanic-type spirals (Sms). We fit single, double, or even triple exponential profiles in up to 11 passbands: GALEX FUV and NUV, ground-based UBVJHK and Hα, and Spitzer 3.6 and 4.5 μm. We find that more luminous galaxies have brighter centers, larger inner and outer scale lengths, and breaks at larger radii; dwarf trends with M_B extend to spirals. However, the V-band break surface brightness is independent of break type, M_B, and Hubble type. Dwarf Type II and III profiles fall off similarly beyond the breaks but have different interiors, and IIs break ∼twice as far as IIIs. Outer Type II and III scale lengths may have weak trends with wavelength, but pure Type II inner scale lengths clearly decrease from the FUV to visible bands whereas Type III inner scale lengths increase with redder bands. This suggests the influence of different star formation histories on profile type, but nonetheless the break location is approximately the same in all passbands. Dwarfs continue trends between profile and Hubble types such that later-type galaxies have more Type II but fewer Type I and III profiles than early-type spirals. BCDs and Sms are over-represented as Types III and II, respectively, compared to dIms.
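The single and broken exponential fits described above can be sketched as follows, using a synthetic noise-free Type II profile; all parameter values, and the fixed break radius, are illustrative assumptions (the paper fits the break location too):

```python
import numpy as np

# Synthetic Type II profile: an exponential disk that breaks to a
# steeper outer exponential. In magnitudes, mu(r) = mu0 + 1.086 * r / h,
# since 2.5*log10(e) = 1.086.
r = np.linspace(0.0, 6.0, 61)                 # radius (kpc, illustrative)
mu0, h_in, h_out, r_break = 22.0, 2.0, 1.0, 3.0
mu = np.where(r < r_break,
              mu0 + 1.086 * r / h_in,
              mu0 + 1.086 * r_break / h_in + 1.086 * (r - r_break) / h_out)

# Recover the two scale lengths by fitting straight lines in mag space
def fit_scale_length(r_seg, mu_seg):
    slope, _ = np.polyfit(r_seg, mu_seg, 1)
    return 1.086 / slope

h_in_fit = fit_scale_length(r[r < r_break], mu[r < r_break])
h_out_fit = fit_scale_length(r[r >= r_break], mu[r >= r_break])
```

A Type III profile is the same construction with h_out larger than h_in, so the light falls off less steeply beyond the break.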
Doping profile measurement on textured silicon surface
Essa, Zahi; Taleb, Nadjib; Sermage, Bernard; Broussillou, Cédric; Bazer-Bachi, Barbara; Quillec, Maurice
2018-04-01
In crystalline silicon solar cells, the front surface is textured in order to lower the reflection of the incident light and increase the efficiency of the cell. This texturing, whose features are a few micrometers wide and high, often makes it difficult to measure the doping profile. We have measured by secondary ion mass spectrometry (SIMS) and electrochemical capacitance-voltage profiling the doping profile of implanted phosphorus in alkaline-textured and in polished monocrystalline silicon wafers. The paper shows that SIMS gives accurate results provided the primary ion impact angle is small enough. Moreover, the comparison between these two techniques gives an estimation of the concentration of electrically inactive phosphorus atoms.
Input-profile-based software failure probability quantification for safety signal generation systems
International Nuclear Information System (INIS)
Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol
2009-01-01
The approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
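The key observation above, that deterministic software needs each distinct digital input tested only once, reduces the failure probability to a sum of input-profile probabilities over the failing inputs. A minimal sketch with a hypothetical four-value input profile:

```python
# Input-profile-based failure probability for deterministic software:
# each distinct digital input i occurs with profile probability p_i and
# needs to be tested only once; P_fail = sum of p_i over failing inputs.
def failure_probability(profile, fails):
    """profile: dict input -> occurrence probability (sums to 1).
    fails: set of inputs on which the software's response is wrong."""
    return sum(p for x, p in profile.items() if x in fails)

# hypothetical discretized sensor inputs and their occurrence profile
profile = {0: 0.70, 1: 0.20, 2: 0.09, 3: 0.01}
p_fail = failure_probability(profile, fails={3})
```

A rarely occurring failing input (here p = 0.01) contributes little to the failure probability, which is why the profile, not just the raw test count, drives the estimate.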
Design of Softgauge for Surface Profile Evaluation
International Nuclear Information System (INIS)
Nie, M Q; Liu, X J; Jiang, X Q
2006-01-01
A concept of softgauge has been proposed by ISO in the context of surface texture measurement, in which reference software and reference data are included. In this paper, an effective scheme to build reference software for 2D surface profile measurement is proposed. The advantage of the scheme lies in its effective combination of the high numerical computing capability of MATLAB with the interface programming capability of VC. Preliminary reference software is developed, and typical algorithms are tested.
Exact analytical density profiles and surface tension
Indian Academy of Sciences (India)
Journal of physics, May 2005, pp. 785-801. Classical charged fluids at equilibrium near ... is provided by the excess surface tension for an air-water interface, which is determined ... the potential drop created by the electric layer which appears as soon as the fluid has ... radii; by symmetry, the charge density profile is flat ...
Surface glycosylation profiles of urine extracellular vesicles.
Directory of Open Access Journals (Sweden)
Jared Q Gerlach
Urinary extracellular vesicles (uEVs) are released by cells throughout the nephron and contain biomolecules from their cells of origin. Although uEV-associated proteins and RNA have been studied in detail, little information exists regarding uEV glycosylation characteristics. Surface glycosylation profiling by flow cytometry and lectin microarray was applied to uEVs enriched from urine of healthy adults by ultracentrifugation and centrifugal filtration. The carbohydrate specificity of lectin microarray profiles was confirmed by competitive sugar inhibition and carbohydrate-specific enzyme hydrolysis. Glycosylation profiles of uEVs and purified Tamm Horsfall protein were compared. In both flow cytometry and lectin microarray assays, uEVs demonstrated surface binding, at low to moderate intensities, of a broad range of lectins whether prepared by ultracentrifugation or centrifugal filtration. In general, ultracentrifugation-prepared uEVs demonstrated higher lectin binding intensities than centrifugal filtration-prepared uEVs consistent with lesser amounts of co-purified non-vesicular proteins. The surface glycosylation profiles of uEVs showed little inter-individual variation and were distinct from those of Tamm Horsfall protein, which bound a limited number of lectins. In a pilot study, lectin microarray was used to compare uEVs from individuals with autosomal dominant polycystic kidney disease to those of age-matched controls. The lectin microarray profiles of polycystic kidney disease and healthy uEVs showed differences in binding intensity of 6/43 lectins. Our results reveal a complex surface glycosylation profile of uEVs that is accessible to lectin-based analysis following multiple uEV enrichment techniques, is distinct from co-purified Tamm Horsfall protein and may demonstrate disease-specific modifications.
Risk Profile Indicators and Spanish Banks’ Probability of Default from a Regulatory Approach
Directory of Open Access Journals (Sweden)
Pilar Gómez-Fernández-Aguado
2018-04-01
This paper analyses the relationships between the traditional bank risk profile indicators and a new measure of banks' probability of default that considers the Basel regulatory framework. First, based on the SYstemic Model of Bank Originated Losses (SYMBOL), we calculated the individual probabilities of default (PD) of a representative sample of Spanish credit institutions during the period of 2008-2016. Then, panel data regressions were estimated to explore the influence of the risk indicators on the PD. Our findings on the Spanish banking system could be important to regulatory and supervisory authorities. First, the PD based on the SYMBOL model could be used to analyse bank risk from a regulatory approach. Second, the results might be useful for designing new regulations focused on the key factors that affect the banks' probability of default. Third, our findings reveal that the emphasis on regulation and supervision should differ by type of entity.
Inoue, N.
2017-12-01
The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the process of the earthquake, ground motions, and so on. Toda (2013) pointed out differences between the conditional probabilities of strike and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture with the following procedure. The fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane was not saturated within the assumed width of the seismogenic layer, the fault plane depth was randomly assigned within the seismogenic layer. A logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike faults, and this result coincides with previous similar studies (i.e., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike and reverse faults, and this trend is similar to the conditional probabilities of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show a low probability. The worldwide compiled reverse fault data include low-dip-angle earthquakes. On the other hand, in the case of Japanese reverse faults, there is a possibility that the conditional probability of reverse faults, with fewer low-dip-angle earthquakes, shows a low probability and indicates a probability similar to that of strike faults (i.e., Takao et al., 2013). In the future, numerical simulation by considering failure condition of surface by the source
Atomic profile imaging of ceramic oxide surfaces
International Nuclear Information System (INIS)
Bursill, L.A.; Peng JuLin; Sellar, J.R.
1989-01-01
Atomic surface profile imaging is an electron optical technique capable of revealing directly the surface crystallography of ceramic oxides. Use of an image-intensifier with a TV camera allows fluctuations in surface morphology and surface reactivity to be recorded and analyzed using digitized image data. This paper reviews aspects of the electron optical techniques, including interpretations based upon computer-simulation image-matching techniques. An extensive range of applications is then presented for ceramic oxides of commercial interest for advanced materials applications: including uranium oxide (UO2); magnesium and nickel oxide (MgO, NiO); the ceramic superconductor (YBa2Cu3O6.7); barium titanate (BaTiO3); sapphire (α-Al2O3); haematite (α-Fe2O3); monoclinic, tetragonal and cubic monocrystalline forms of zirconia (ZrO2); lead zirconium titanate (PZT + 6 mol.% NiNbO3) and ZBLAN fluoride glass. Atomic scale detail has been obtained of local structures such as steps associated with vicinal surfaces, facetting parallel to stable low energy crystallographic planes, monolayer formation on certain facets, relaxation and reconstructions, oriented overgrowth of lower oxides, chemical decomposition of complex oxides into component oxides, as well as amorphous coatings. This remarkable variety of observed surface stabilization mechanisms is discussed in terms of novel double-layer electrostatic depolarization mechanisms, as well as classical concepts of the physics and chemistry of surfaces (ionization and affinity energies and work function). 46 refs., 16 figs
Shallow surface depth profiling with atomic resolution
International Nuclear Information System (INIS)
Xi, J.; Dastoor, P.C.; King, B.V.; O'Connor, D.J.
1999-01-01
It is possible to derive atomic layer-by-layer composition depth profiles from popular electron spectroscopic techniques, such as X-ray photoelectron spectroscopy (XPS) or Auger electron spectroscopy (AES). When ion-sputtering-assisted AES or XPS is used, the changes that occur during the establishment of the steady state in the sputtering process make these techniques increasingly inaccurate for depths less than 3 nm. Therefore, the non-destructive techniques of angle-resolved XPS (ARXPS) or AES (ARAES) have to be used in this case. In this paper several data processing algorithms have been used to extract the atomically resolved depth profiles of a shallow surface (down to 1 nm) from ARXPS and ARAES data.
Probability function of breaking-limited surface elevation. [wind generated waves of ocean
Tung, C. C.; Huang, N. E.; Yuan, Y.; Long, S. R.
1989-01-01
The effect of wave breaking on the probability function of surface elevation is examined. The surface elevation limited by wave breaking ζ_b(t) is first related to the original wave elevation ζ(t) and its second derivative. An approximate, second-order, nonlinear, non-Gaussian model for ζ(t) of arbitrary but moderate bandwidth is presented, and an expression for the probability density function of ζ_b(t) is derived. The results show clearly that the effect of wave breaking on the probability density function of surface elevation is to introduce a secondary hump on the positive side of the probability density function, a phenomenon also observed in wind wave tank experiments.
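A crude Monte-Carlo stand-in for the breaking effect, hard-clipping a Gaussian elevation at an assumed breaking limit, already shows probability mass piling up on the positive side; this is far simpler than the paper's second-order model and is only meant to illustrate the mechanism:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy model: Gaussian surface elevation (zero mean, unit variance)
# whose crests are limited by breaking at zeta_b_max. The clipped
# mass accumulates at the limit, a crude stand-in for the secondary
# hump on the positive side of the density.
zeta = rng.normal(0.0, 1.0, size=200_000)
zeta_b_max = 1.5
zeta_b = np.minimum(zeta, zeta_b_max)

# probability mass relocated to the breaking limit
p_hump = np.mean(zeta_b == zeta_b_max)
```

For a unit-variance Gaussian, the relocated mass is 1 - Phi(1.5), about 6.7 percent; in the paper's smoother model this mass spreads into a hump rather than a spike.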
A probability measure for random surfaces of arbitrary genus and bosonic strings in 4 dimensions
International Nuclear Information System (INIS)
Albeverio, S.; Høegh-Krohn, R.; Paycha, S.; Scarlatti, S.
1989-01-01
We define a probability measure describing random surfaces in R^D, 3 ≤ D ≤ 13, parametrized by compact Riemann surfaces of arbitrary genus. The measure involves the path space measure for scalar fields with exponential interaction in 2 space-time dimensions. We show that it gives a mathematical realization of Polyakov's heuristic measure for bosonic strings. (orig.)
Watson, Ryan J; Veale, Jaimie F; Saewyc, Elizabeth M
2017-05-01
Research has documented high rates of disordered eating for lesbian, gay, and bisexual youth, but prevalence and patterns of disordered eating among transgender youth remain unexplored. This is despite unique challenges faced by this group, including gender-related body image and the use of hormones. We explore the relationship between disordered eating and risk and protective factors for transgender youth. An online survey of 923 transgender youth (aged 14-25) across Canada was conducted, primarily using measures from existing youth health surveys. Analyses were stratified by gender identity and included logistic regressions with probability profiles to illustrate combinations of risk and protective factors for eating disordered behaviors. Enacted stigma (the higher rates of harassment and discrimination sexual minority youth experience) was linked to higher odds of reported past year binge eating and fasting or vomiting to lose weight, while protective factors, including family connectedness, school connectedness, caring friends, and social support, were linked to lower odds of past year disordered eating. Youth with the highest levels of enacted stigma and no protective factors had high probabilities of past year eating disordered behaviors. Our study found high prevalence of disorders. Risk for these behaviors was linked to stigma and violence exposure, but offset by social supports. Health professionals should assess transgender youth for disordered eating behaviors and supportive resources. (Int J Eat Disord 2017; 50:515-522. © 2016 Wiley Periodicals, Inc.)
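The probability profiles mentioned above come from evaluating a fitted logistic model at chosen combinations of risk and protective factors; the coefficients below are illustrative, not the study's estimates:

```python
import math

# Probability profile from a logistic model:
# p = 1 / (1 + exp(-(b0 + sum_i b_i * x_i))).
def predicted_probability(intercept, coefs, x):
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

b0 = -2.0
# illustrative coefficients: enacted stigma (risk, positive sign),
# social support (protective, negative sign)
coefs = (1.2, -0.8)

high_risk_no_protection = predicted_probability(b0, coefs, (2.0, 0.0))
low_risk_with_protection = predicted_probability(b0, coefs, (0.0, 2.0))
```

Tabulating these predicted probabilities over the grid of factor combinations is what turns a fitted regression into the "probability profiles" the abstract describes.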
International Nuclear Information System (INIS)
Shang Yanlong; Cai Qi; Chen Lisheng; Zhang Yangwei
2012-01-01
In this paper, a combined response surface and importance sampling method was applied to calculate the parameter failure probability of a thermodynamic system. A mathematical model is presented for parameter failure of the physical process in the thermodynamic system, from which the combined response surface and importance sampling algorithm was established; the performance degradation model of the components and the simulation process of parameter failure in the physical process of the thermodynamic system are also presented. The parameter failure probability of the purification water system in a nuclear reactor was obtained by the combined method. The results show that the combined method is effective for calculating the parameter failure probability of a thermodynamic system with high dimensionality and non-linear characteristics: it achieves satisfactory precision with less computing time than the direct sampling method while avoiding the drawbacks of the response surface method. (authors)
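A minimal sketch of the combined approach on a cheap stand-in limit state: fit a response surface to a small number of evaluations, locate its approximate design point, and importance-sample around it. The linear limit state, sample sizes, and standard-normal inputs are assumptions for illustration, not the paper's thermodynamic model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in "expensive" limit state: failure when g(x) < 0, x ~ N(0, I).
def g(x):
    return 5.0 - x[..., 0] - x[..., 1]

# Step 1: fit a response surface g_hat(x) = c0 + c1*x1 + c2*x2 from a
# small design of experiments (linear here; quadratic in general).
X = rng.normal(0.0, 2.0, size=(50, 2))
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, g(X), rcond=None)
c0, c = coef[0], coef[1:]

# Step 2: approximate design point = closest point of g_hat(x) = 0 to
# the origin, then importance-sample from a normal centered there.
x_star = -c0 * c / np.dot(c, c)

N = 200_000
samples = rng.normal(x_star, 1.0, size=(N, 2))
# importance weights: standard-normal density / shifted sampling density
logw = -0.5 * np.sum(samples**2, 1) + 0.5 * np.sum((samples - x_star)**2, 1)
p_fail = np.mean(np.exp(logw) * (g(samples) < 0.0))
```

Centering the sampling density at the design point puts roughly half the samples in the failure region, so far fewer limit-state evaluations are needed than with direct Monte Carlo, which is the efficiency argument the abstract makes.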
International Nuclear Information System (INIS)
Neskovic, N.; Ciric, D.; Perovic, B.
1982-01-01
The survival probability in small-angle scattering of low-energy alkali ions from alkali-covered metal surfaces is considered. The model is based on the momentum approximation. The projectiles are K+ ions and the target is the (001)Ni+K surface. The incident energy is 100 eV and the incident angle is 5°. The interaction potential of the projectile and the target consists of the Born-Mayer, dipole and image charge potentials. The transition probability function corresponds to the resonant electron transition to the 4s projectile energy level. (orig.)
Probability distribution for the Gaussian curvature of the zero level surface of a random function
Hannay, J. H.
2018-04-01
A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z) = 0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f = 0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.
Huang, N. E.; Long, S. R.
1980-01-01
Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.
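The non-Gaussian correction in the Longuet-Higgins theory is a Gram-Charlier expansion in Hermite polynomials; a minimal sketch of such a skewed elevation density follows, with the skewness value chosen arbitrarily rather than taken from the measurements:

```python
import numpy as np

# Gram-Charlier density of Longuet-Higgins type: a Gaussian times a
# skewness correction built from the third Hermite polynomial He3.
# The skewness 0.2 is illustrative only. Note the expansion can dip
# slightly negative in the far tails, a known limitation of the form.
def gram_charlier_pdf(z, skew):
    gauss = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    h3 = z**3 - 3 * z                      # He3(z)
    return gauss * (1 + (skew / 6.0) * h3)

z = np.linspace(-4, 4, 801)
pdf = gram_charlier_pdf(z, skew=0.2)

# The correction term integrates to zero, so total area stays ~1.
dz = z[1] - z[0]
area = pdf.sum() * dz
print(f"area under corrected pdf: {area:.4f}")
```

Because the He3 correction is odd, it shifts probability between the crest and trough sides without changing the total mass, which is how the skew of the real sea surface enters the distribution.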
Probability- and curve-based fractal reconstruction on 2D DEM terrain profile
International Nuclear Information System (INIS)
Lai, F.-J.; Huang, Y.M.
2009-01-01
Data compression and reconstruction play important roles in information science and engineering. Within this field, image compression and reconstruction, which mainly deal with reducing an image data set for storage or transmission and restoring it with the least loss, remain topics deserving a great deal of attention. In this paper we propose a new scheme, compared against the well-known Improved Douglas-Peucker (IDP) method, to extract characteristic or feature points of a two-dimensional digital elevation model (2D DEM) terrain profile in order to compress the data set. For reconstruction using fractal interpolation, we propose a probability-based method that speeds up the fractal interpolation by a factor of three or even nine relative to the regular approach. In addition, a curve-based method is proposed to determine the vertical scaling factor, which strongly affects the generation of the interpolated data points, thereby significantly improving reconstruction performance. Finally, an evaluation shows the advantage of employing the proposed method to extract characteristic points in association with our novel fractal interpolation scheme.
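For context, the baseline Douglas-Peucker simplification that the IDP variant improves upon can be sketched in a few lines; the terrain profile data here are made up:

```python
import numpy as np

# Baseline Douglas-Peucker simplification of a terrain profile. The paper
# compares its feature-point extraction against an improved variant (IDP),
# which is not reproduced here.
def douglas_peucker(points, tol):
    """Keep points deviating more than `tol` from the start-end chord."""
    points = np.asarray(points, dtype=float)
    start, end = points[0], points[-1]
    chord = end - start
    norm = float(np.hypot(chord[0], chord[1])) or 1.0
    diff = points - start
    # Perpendicular distance of every point to the chord (2D cross product).
    d = np.abs(chord[0] * diff[:, 1] - chord[1] * diff[:, 0]) / norm
    i = int(np.argmax(d))
    if d[i] <= tol or len(points) < 3:
        return [tuple(start), tuple(end)]
    left = douglas_peucker(points[: i + 1], tol)
    right = douglas_peucker(points[i:], tol)
    return left[:-1] + right          # drop the duplicated split point

profile = [(0, 0), (1, 0.1), (2, 3.0), (3, 0.2), (4, 0)]
kept = douglas_peucker(profile, tol=1.0)
print(kept)   # the sharp peak at x = 2 survives, gentle bumps are dropped
```

The retained points are exactly the "characteristic points" a fractal interpolant is then fitted through during reconstruction.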
International Nuclear Information System (INIS)
Carter, G.; Katardjiev, I.V.; Nobes, M.J.
1989-01-01
The quasi-linear partial differential continuity equations that describe the evolution of the depth profiles and surface concentrations of marker atoms in kinematically equivalent systems undergoing sputtering, ion collection and atomic mixing are solved using the method of characteristics. It is shown how atomic mixing probabilities can be deduced from measurements of ion collection depth profiles with increasing ion fluence, and how this information can be used to predict surface concentration evolution. Even with this information, however, it is shown that it is not possible to deconvolute directly the surface concentration measurements to provide initial depth profiles, except when only ion collection and sputtering from the surface layer alone occur. It is demonstrated further that optimal recovery of initial concentration depth profiles could be ensured if the concentration-measuring analytical probe preferentially sampled depths near and at the maximum depth of bombardment-induced perturbations. (author)
The loaded surface profile: a new technique for the investigation of contact surfaces
McBride, J.W.
2006-01-01
Contact between rough surfaces produces a complex contact profile. The contact area is usually estimated from roughness statistics in conjunction with surface models, or by examining the surfaces before and after contact. Most of the existing literature on loaded surface profiles is theoretical or numerical in nature. This paper presents a methodology for a new system to measure the loaded surface profile, based on a non-contact 3D laser profiler. The system allows the measurement of...
The prediction of BRDFs from surface profile measurements
International Nuclear Information System (INIS)
Church, E.L.; Takacs, P.Z.; Leonard, T.A.
1989-01-01
This paper discusses methods of predicting the BRDF of smooth surfaces from profile measurements of their surface finish. The conversion of optical profile data to the BRDF at the same wavelength is essentially independent of scattering models, while the conversion of mechanical measurements, and wavelength scaling in general, are model dependent. Procedures are illustrated for several surfaces, including two from the recent HeNe BRDF round robin, and results are compared with measured data. Reasonable agreement is found except for surfaces which involve significant scattering from isolated surface defects which are poorly sampled in the profile data
Reaction probability of molecular deuterium with a disordered InSb (110) surface
International Nuclear Information System (INIS)
Wolf, B.; Zehe, A.
1987-01-01
A detailed experimental analysis, by means of SIMS, of the interaction of molecular deuterium with sputter-damaged InSb surfaces is given. The sticking probability of D₂ and its transformation to a chemisorbed state, giving InD⁺ signals in the SIMS measurements, can be determined by adsorption experiments both with and without a hot tungsten filament. The calculated sticking probability of D₂ of 2 × 10⁻⁴ is at least three orders of magnitude higher than the known value for a cleavage plane of InSb
Energy Technology Data Exchange (ETDEWEB)
Flytzani-Stephanopoulos, M; Schmidt, L D
1979-01-01
Simple models incorporating surface reaction and diffusion of volatile products through a boundary layer are developed to calculate effective rates of evaporation and local surface profiles on surfaces having active and inactive regions. The coupling between surface heterogeneities with respect to a particular reaction and external mass transfer may provide a mechanism for the surface rearrangement and metal loss encountered in several catalytic systems of practical interest. Calculated transport rates for the volatilization of platinum in oxidizing environments and the rearrangement of this metal during the ammonia oxidation reaction agree well with published experimental data.
Mean-field behavior for the survival probability and the point-to-surface connectivity
Sakai, A
2003-01-01
We consider the critical survival probability for oriented percolation and the contact process, and the point-to-surface connectivity for critical percolation. By similarity, we let \rho denote the critical exponent for both quantities. We prove in a unified fashion that, if \rho exists and if both the two-point function and a certain restricted version of it exhibit the same mean-field behavior, then \rho = 2 for percolation with d > 7 and \rho = 1 for the time-oriented models with d > 4.
Quantitative sputter profiling at surfaces and interfaces
International Nuclear Information System (INIS)
Kirschner, J.; Etzkorn, H.W.
1981-01-01
The key problem in quantitative sputter profiling, that of a sliding depth scale, has been solved by combined Auger/X-ray microanalysis. By means of this technique, and for the model system Ge/Si (amorphous), the following questions are treated quantitatively: the shape of the sputter profiles when sputtering through an interface and the origin of their asymmetry; the precise location of the interface plane on the depth profile; broadening effects due to the limited depth of information and their correction; the origin and amount of bombardment-induced broadening for different primary ions and energies; the depth dependence of the broadening; and the basic limits to depth resolution. Comparisons are made with recent theoretical calculations based on recoil mixing in the collision cascade and very good agreement is found
Surface segregation profile for Ni50Pd50(100)
DEFF Research Database (Denmark)
Christensen, Asbjørn; Ruban, Andrei; Skriver, Hans Lomholt
1997-01-01
A recent dynamical LEED study [G.N. Derry, C.B. McVey, P.J. Rous, Surf. Sci. 326 (1995) 59] reported an oscillatory surface segregation profile in the Ni50Pd50(100) system with the surface layer enriched by Pd. We have performed ab-initio total-energy calculations for the surface of this alloy...... system using the coherent potential approximation and obtain an oscillatory segregation profile, in agreement with experiments. We discuss the energetic origin of the oscillatory segregation profile in terms of effective cluster interactions. We include relaxation effects by means of the semi...
Otero, Federico; Norte, Federico; Araneo, Diego
2018-01-01
The aim of this work is to obtain an index for predicting the probability of occurrence of a zonda event at surface level from sounding data at Mendoza city, Argentina. To accomplish this goal, surface zonda wind events were first identified with an objective classification method (OCM) considering only the surface station values. Once the dates and onset times of the events were obtained, the closest prior sounding for each event was used in a principal component analysis (PCA) to identify the leading patterns of the vertical structure of the atmosphere preceding a zonda wind event. These components were used to construct the index model. For the PCA, an entry matrix of temperature (T) and dew point temperature (Td) anomalies for the standard levels between 850 and 300 hPa was built. The analysis yielded six significant components explaining 94% of the variance, and the leading patterns of weather conditions favorable for the development of the phenomenon were obtained. A zonda/non-zonda indicator c can be estimated by logistic multiple regression on the PCA component loadings, determining a zonda probability index \widehat{c} calculable from T and Td profiles that depends on the climatological features of the region. The index showed 74.7% efficiency. The same analysis was performed by adding surface values of T and Td from the Mendoza Aero station, increasing the index efficiency to 87.8%. The results revealed four significantly correlated PCs, with a major improvement in differentiating zonda cases and a reduction of the uncertainty interval.
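The PCA-plus-logistic-regression pipeline behind such an index can be sketched on synthetic data; the anomaly matrix, the zonda labels, and the gradient-descent fit below are all invented for illustration and share only the structure of the index model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the zonda index: PCA on sounding T/Td anomalies, then
# logistic regression on the component scores. Data are synthetic: the
# first two "levels" carry more variance and determine the event flag.
n, p = 200, 12                       # 200 soundings, 12 level anomalies
X = rng.standard_normal((n, p))
X[:, 0] *= 3.0
X[:, 1] *= 2.0
truth = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(float)   # synthetic flag

# PCA via SVD of the centered anomaly matrix; keep six components,
# mirroring the six significant components reported in the abstract.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:6].T

# Logistic regression fitted by plain gradient descent on the scores.
w = np.zeros(scores.shape[1])
b = 0.0
for _ in range(2000):
    z = np.clip(scores @ w + b, -30, 30)
    prob = 1.0 / (1.0 + np.exp(-z))
    w -= 0.5 * (scores.T @ (prob - truth) / n)
    b -= 0.5 * np.mean(prob - truth)

z = np.clip(scores @ w + b, -30, 30)
prob = 1.0 / (1.0 + np.exp(-z))
acc = np.mean((prob > 0.5) == truth)
print(f"training accuracy of the toy index: {acc:.2f}")
```

The fitted probability plays the role of the index \widehat{c}: a continuous zonda/non-zonda score computed from the component loadings of a new sounding.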
Characterization of the intrinsic density profiles for liquid surfaces
International Nuclear Information System (INIS)
Chacon, Enrique; Tarazona, Pedro
2005-01-01
This paper presents recent advances in the characterization of the intrinsic structures in computer simulations of liquid surfaces. The use of operational definitions for the intrinsic surface, associated with each molecular configuration of a liquid slab, gives direct access to the intrinsic profile and to the wavevector-dependent surface tension. However, the characteristics of these functions depend on the definition used for the intrinsic surface. We discuss the pathologies associated with a local Gibbs dividing surface definition, and consider the alternative definition of a minimal area surface, going through a set of surface pivots, self-consistently chosen to represent the first liquid layer
A Study on Water Surface Profiles of Rivers with Constriction
Qian, Chaochao; Yamada, Tadashi
2013-04-01
The water surface profile of rivers with constrictions is important in both classical hydraulics and river management practice. This study was conducted to clarify the essential features of these water surface profiles. Three experimental cases and 1D numerical calculations with different discharges were carried out, and analytical solutions of the non-linear basic equation for the surface profile of varied flow, neglecting friction, were derived. Manning's roughness was kept the same in each case by using crosspiece roughness. From the results of the 1D numerical calculations and the experiments we found a new type of water surface profile of varied flow, which we name the Mc curve because of its mild condition within the constriction segment. Curves of this kind appear ubiquitously in nature. The process of water surface formation is dynamic, and a bore occurs at the upstream side of the constriction while the discharge increases, before the surface profile is formed. As theoretical work, three analytical solutions, including two with physical meaning, were derived in the study using a man-machine system. One of the physically meaningful solutions was validated by comparison with the results of the 1D numerical calculations and the experiments. This solution represents a flow profile passing from subcritical conditions on the upstream side to supercritical conditions on the downstream side of the constriction segment. The other physically meaningful solution represents a flow profile passing from supercritical conditions upstream to subcritical conditions downstream of the constriction segment. Both kinds of flow profile exist in nature, but no previous theoretical solution could express the phenomenon. From the derived solutions we find that, under a constant discharge, the depth distribution depends only on the unit-width discharge distribution and the critical depth. Therefore, the profile can be obtained simply and precisely by using the theoretical solutions instead of numerical calculation even
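Under the frictionless assumption, the steady profile follows from holding specific energy constant while the unit-width discharge varies with channel width; a sketch with invented discharge and energy values (not the study's experimental conditions):

```python
import numpy as np

g = 9.81   # gravitational acceleration, m/s^2
Q = 1.0    # total discharge, m^3/s (illustrative)
E = 1.0    # specific energy, m; constant when friction is neglected

def depth_subcritical(b):
    """Subcritical depth in a channel of width b from h + q^2/(2 g h^2) = E."""
    q = Q / b                          # unit-width discharge
    # Roots of h^3 - E h^2 + q^2/(2g) = 0; the subcritical branch is the
    # largest positive real root.
    roots = np.roots([1.0, -E, 0.0, q**2 / (2 * g)])
    real = roots[np.abs(roots.imag) < 1e-9].real
    return max(r for r in real if r > 0)

widths = np.linspace(2.0, 1.2, 9)      # channel narrows toward the constriction
depths = [depth_subcritical(b) for b in widths]
# Depth drops as the channel narrows: drawdown over the constriction.
print([round(h, 3) for h in depths])
```

The monotone drawdown of the subcritical branch toward the narrowest section illustrates the abstract's point that, at constant discharge, the depth distribution is fixed by the unit-width discharge distribution and the critical depth alone.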
Shu, Shi; Morrison, Glenn C
2011-05-15
Ozone can react homogeneously with unsaturated organic compounds in buildings to generate undesirable products. However, these reactions can also occur on indoor surfaces, especially for low-volatility organics. Conversion rates of ozone with α-terpineol, a representative low-volatility compound, were quantified on surfaces that mimic indoor substrates. Rates were measured for α-terpineol adsorbed to beads of glass, polyvinylchloride (PVC), and dry latex paint, in a plug flow reactor. A newly defined second-order surface reaction rate coefficient, k₂, was derived from the flow reactor model. The value of k₂ ranged from 0.68 × 10⁻¹⁴ cm⁴ s⁻¹ molecule⁻¹ for α-terpineol adsorbed to PVC to 3.17 × 10⁻¹⁴ cm⁴ s⁻¹ molecule⁻¹ for glass, but was insensitive to relative humidity. Further, k₂ is only weakly influenced by the adsorbed mass and instead appears to be more strongly related to the interfacial activity of α-terpineol. The minimum reaction probability ranged from 3.79 × 10⁻⁶ for glass at 20% RH to 6.75 × 10⁻⁵ for PVC at 50% RH. The combination of high equilibrium surface coverage and high reactivity for α-terpineol suggests that surface conversion rates are fast enough to compete with or even overwhelm other removal mechanisms in buildings, such as gas-phase conversion and air exchange.
Flux surface shape and current profile optimization in tokamaks
International Nuclear Information System (INIS)
Dobrott, D.R.; Miller, R.L.
1977-01-01
Axisymmetric tokamak equilibria of noncircular cross section are analyzed numerically to study the effects of flux surface shape and current profile on ideal and resistive interchange stability. Various current profiles are examined for circles, ellipses, dees, and doublets. A numerical code separately analyzes stability in the neighborhood of the magnetic axis and in the remainder of the plasma using the criteria of Mercier and Glasser, Greene, and Johnson. Results are interpreted in terms of flux surface averaged quantities such as magnetic well, shear, and the spatial variation in the magnetic field energy density over the cross section. The maximum stable β is found to vary significantly with shape and current profile. For current profiles varying linearly with poloidal flux, the highest β's found were for doublets. Finally, an algorithm is presented which optimizes the current profile for circles and dees by making the plasma everywhere marginally stable
Optimized Estimation of Surface Layer Characteristics from Profiling Measurements
Directory of Open Access Journals (Sweden)
Doreene Kang
2016-01-01
New sampling techniques such as tethered-balloon-based measurements or small unmanned aerial vehicles are capable of providing multiple profiles of the Marine Atmospheric Surface Layer (MASL) in a short time period. It is desirable to obtain surface fluxes from these measurements, especially when direct flux measurements are difficult to obtain. The profiling data differ from the traditional mean profiles obtained at two or more fixed levels in the surface layer, from which surface fluxes of momentum, sensible heat, and latent heat are derived based on Monin-Obukhov Similarity Theory (MOST). This research develops an improved method to derive surface fluxes and the corresponding MASL mean profiles of wind, temperature, and humidity with a least-squares optimization method using the profiling measurements. This approach allows the use of all available independent data. We use a weighted cost function based on the framework of MOST, with the cost optimized using a quasi-Newton method. The approach was applied to seven sets of data collected from Monterey Bay. The derived fluxes and mean profiles show reasonable results. An empirical bias analysis is conducted using 1000 synthetic datasets to evaluate the robustness of the method.
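A reduced version of the least-squares idea, fitting only a neutral log-law wind profile (a special case of the MOST profiles) to synthetic profiling data; the true friction velocity, roughness length, and noise level are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
kappa = 0.4                  # von Karman constant

# Synthetic neutral-stability wind profile u(z) = (u*/kappa) ln(z/z0),
# a simplified stand-in for the full MOST cost function in the paper.
u_star_true, z0_true = 0.3, 1e-3
z = np.linspace(2.0, 50.0, 25)                     # profiling levels, m
u = (u_star_true / kappa) * np.log(z / z0_true)
u += rng.normal(0, 0.05, z.size)                   # measurement noise

# The profile is linear in ln z: u = a ln z + c with a = u*/kappa and
# c = -a ln z0, so ordinary least squares recovers both parameters.
A = np.vstack([np.log(z), np.ones_like(z)]).T
(a, c), *_ = np.linalg.lstsq(A, u, rcond=None)
u_star = kappa * a
z0 = np.exp(-c / a)
print(f"u* = {u_star:.3f} m/s, z0 = {z0:.2e} m")
```

Using every level of the profile in one fit, rather than differencing two fixed levels, is exactly the advantage the optimization approach exploits.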
Profiles of US and CT imaging features with a high probability of appendicitis
van Randen, A.; Laméris, W.; van Es, H.W.; ten Hove, W.; Bouma, W.H.; van Leeuwen, M.S.; van Keulen, E.M.; van der Hulst, V.P.M.; Henneman, O.D.; Bossuyt, P.M.; Boermeester, M.A.; Stoker, J.
2010-01-01
To identify and evaluate profiles of US and CT features associated with acute appendicitis. Consecutive patients presenting with acute abdominal pain at the emergency department were invited to participate in this study. All patients underwent US and CT. Imaging features known to be associated with
Ion induced optical emission for surface and depth profile analysis
International Nuclear Information System (INIS)
White, C.W.
1977-01-01
Low-energy ion bombardment of solid surfaces results in the emission of infrared, visible, and ultraviolet radiation produced by inelastic ion-solid collision processes. The emitted optical radiation provides important insight into low-energy particle-solid interactions and provides the basis for an analysis technique which can be used for surface and depth profile analysis with high sensitivity. The different kinds of collision induced optical radiation emitted as a result of low-energy particle-solid collisions are reviewed. Line radiation arising from excited states of sputtered atoms or molecules is shown to provide the basis for surface and depth profile analysis. The spectral characteristics of this type of radiation are discussed and applications of the ion induced optical emission technique are presented. These applications include measurements of ion implant profiles, detection sensitivities for submonolayer quantities of impurities on elemental surfaces, and the detection of elemental impurities on complex organic substrates
Virtual environment assessment for laser-based vision surface profiling
ElSoussi, Adnane; Al Alami, Abed ElRahman; Abu-Nabah, Bassam A.
2015-03-01
Oil and gas businesses have been raising the demand from original equipment manufacturers (OEMs) to implement a reliable metrology method for assessing surface profiles of welds before and after grinding. This mandates a deviation from the commonly used surface measurement gauges, which are not only operator dependent but also limited to discrete measurements along the weld. Due to their potential accuracy and speed, laser-based vision surface profiling systems have been progressively adopted as part of manufacturing quality control. This effort presents a virtual environment that lends itself to developing and evaluating existing laser vision sensor (LVS) calibration and measurement techniques. A combination of two known calibration techniques is implemented to deliver a calibrated LVS system. System calibration is implemented virtually and experimentally to scan simulated and 3D-printed features of known profiles, respectively. Scanned data are inverted and compared with the input profiles to validate the virtual environment's capability for LVS surface profiling and to preliminarily assess the measurement technique for weld profiling applications. Moreover, this effort brings 3D scanning capability a step closer towards robust quality control applications in a manufacturing environment.
Profiles of US and CT imaging features with a high probability of appendicitis
International Nuclear Information System (INIS)
Randen, A. van; Lameris, W.; Es, H.W. van; Hove, W. ten; Bouma, W.H.; Leeuwen, M.S. van; Keulen, E.M. van; Hulst, V.P.M. van der; Henneman, O.D.; Bossuyt, P.M.; Boermeester, M.A.; Stoker, J.
2010-01-01
To identify and evaluate profiles of US and CT features associated with acute appendicitis. Consecutive patients presenting with acute abdominal pain at the emergency department were invited to participate in this study. All patients underwent US and CT. Imaging features known to be associated with appendicitis, and an imaging diagnosis, were prospectively recorded by two independent radiologists. A final diagnosis was assigned after 6 months. Associations between appendiceal imaging features and a final diagnosis of appendicitis were evaluated with logistic regression analysis. Appendicitis was assigned to 284 of 942 evaluated patients (30%). All evaluated features were associated with appendicitis. Imaging profiles were created after multivariable logistic regression analysis. Of 147 patients with a thickened appendix, local transducer tenderness, and peri-appendiceal fat infiltration on US, 139 (95%) had appendicitis. Of 119 patients on CT in whom the appendix was completely visualised, thickened, with peri-appendiceal fat infiltration and appendiceal enhancement, 114 (96%) had a final diagnosis of appendicitis. When at least two of these essential features were present on US or CT, sensitivity was 92% (95% CI 89-96%) and 96% (95% CI 93-98%), respectively. Most patients with appendicitis can be categorised within a few imaging profiles on US and CT. When two of the essential features are present the diagnosis of appendicitis can be made accurately. (orig.)
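The quoted 95% confidence interval for the CT sensitivity can be reproduced approximately from the counts in the abstract with a normal approximation (a simplification; the paper does not state which CI method it used):

```python
import math

# CT sensitivity of 96% among the 284 appendicitis cases, with a
# normal-approximation 95% confidence interval.
p, n = 0.96, 284
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"95% CI: {lo:.0%} to {hi:.0%}")   # close to the reported 93-98%
```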
Profiles of US and CT imaging features with a high probability of appendicitis
Energy Technology Data Exchange (ETDEWEB)
Randen, A. van; Lameris, W. [University of Amsterdam, Department of Radiology, Academic Medical Center, Amsterdam (Netherlands); University of Amsterdam, Department of Surgery, Academic Medical Center, Amsterdam (Netherlands); Es, H.W. van [St Antonius Hospital, Department of Radiology, Nieuwegein (Netherlands); Hove, W. ten; Bouma, W.H. [Gelre Hospitals, Department of Surgery, Apeldoorn (Netherlands); Leeuwen, M.S. van [University Medical Centre, Department of Radiology, Utrecht (Netherlands); Keulen, E.M. van [Tergooi Hospitals, Department of Radiology, Hilversum (Netherlands); Hulst, V.P.M. van der [Onze Lieve Vrouwe Gasthuis, Department of Radiology, Amsterdam (Netherlands); Henneman, O.D. [Bronovo Hospital, Department of Radiology, The Hague (Netherlands); Bossuyt, P.M. [University of Amsterdam, Department of Clinical Epidemiology, Biostatistics, and Bioinformatics, Academic Medical Center, Amsterdam (Netherlands); Boermeester, M.A. [University of Amsterdam, Department of Surgery, Academic Medical Center, Amsterdam (Netherlands); Stoker, J. [University of Amsterdam, Department of Radiology, Academic Medical Center, Amsterdam (Netherlands)
2010-07-15
To identify and evaluate profiles of US and CT features associated with acute appendicitis. Consecutive patients presenting with acute abdominal pain at the emergency department were invited to participate in this study. All patients underwent US and CT. Imaging features known to be associated with appendicitis, and an imaging diagnosis, were prospectively recorded by two independent radiologists. A final diagnosis was assigned after 6 months. Associations between appendiceal imaging features and a final diagnosis of appendicitis were evaluated with logistic regression analysis. Appendicitis was assigned to 284 of 942 evaluated patients (30%). All evaluated features were associated with appendicitis. Imaging profiles were created after multivariable logistic regression analysis. Of 147 patients with a thickened appendix, local transducer tenderness, and peri-appendiceal fat infiltration on US, 139 (95%) had appendicitis. Of 119 patients on CT in whom the appendix was completely visualised, thickened, with peri-appendiceal fat infiltration and appendiceal enhancement, 114 (96%) had a final diagnosis of appendicitis. When at least two of these essential features were present on US or CT, sensitivity was 92% (95% CI 89-96%) and 96% (95% CI 93-98%), respectively. Most patients with appendicitis can be categorised within a few imaging profiles on US and CT. When two of the essential features are present the diagnosis of appendicitis can be made accurately. (orig.)
Optical surface profiling of orb-web spider capture silks
Energy Technology Data Exchange (ETDEWEB)
Kane, D M; Joyce, A M; Staib, G R [Department of Physics, Macquarie University, Sydney, NSW 2109 (Australia); Herberstein, M E, E-mail: deb.kane@mq.edu.a [Department of Biological Sciences, Macquarie University, Sydney, NSW 2109 (Australia)
2010-09-15
Much spider silk research to date has focused on its mechanical properties. However, the webs of many orb-web spiders have evolved for over 136 million years to evade visual detection by insect prey. It is therefore a photonic device in addition to being a mechanical device. Herein we use optical surface profiling of capture silks from the webs of adult female St Andrews cross spiders (Argiope keyserlingi) to successfully measure the geometry of adhesive silk droplets and to show a bowing in the aqueous layer on the spider capture silk between adhesive droplets. Optical surface profiling shows geometric features of the capture silk that have not been previously measured and contributes to understanding the links between the physical form and biological function. The research also demonstrates non-standard use of an optical surface profiler to measure the maximum width of a transparent micro-sized droplet (microlens).
Optical surface profiling of orb-web spider capture silks
International Nuclear Information System (INIS)
Kane, D M; Joyce, A M; Staib, G R; Herberstein, M E
2010-01-01
Much spider silk research to date has focused on its mechanical properties. However, the webs of many orb-web spiders have evolved for over 136 million years to evade visual detection by insect prey. It is therefore a photonic device in addition to being a mechanical device. Herein we use optical surface profiling of capture silks from the webs of adult female St Andrews cross spiders (Argiope keyserlingi) to successfully measure the geometry of adhesive silk droplets and to show a bowing in the aqueous layer on the spider capture silk between adhesive droplets. Optical surface profiling shows geometric features of the capture silk that have not been previously measured and contributes to understanding the links between the physical form and biological function. The research also demonstrates non-standard use of an optical surface profiler to measure the maximum width of a transparent micro-sized droplet (microlens).
Simple laser vision sensor calibration for surface profiling applications
Abu-Nabah, Bassam A.; ElSoussi, Adnane O.; Al Alami, Abed ElRahman K.
2016-09-01
Due to the relatively large structures in the Oil and Gas industry, original equipment manufacturers (OEMs) have been implementing custom-designed laser vision sensor (LVS) surface profiling systems as part of quality control in their manufacturing processes. The rough manufacturing environment and the continuous movement and misalignment of these custom-designed tools adversely affect the accuracy of laser-based vision surface profiling applications. Accordingly, Oil and Gas businesses have been raising the demand from the OEMs to implement practical and robust LVS calibration techniques prior to running any visual inspections. This effort introduces an LVS calibration technique representing a simplified version of two known calibration techniques, which are commonly implemented to obtain a calibrated LVS system for surface profiling applications. Both calibration techniques are implemented virtually and experimentally to scan simulated and three-dimensional (3D) printed features of known profiles, respectively. Scanned data is transformed from the camera frame to points in the world coordinate system and compared with the input profiles to validate the introduced calibration technique capability against the more complex approach and preliminarily assess the measurement technique for weld profiling applications. Moreover, the sensitivity to stand-off distances is analyzed to illustrate the practicality of the presented technique.
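Once calibrated, mapping pixels on the laser line to world coordinates in the laser plane reduces to a single homography; the matrix below is a made-up calibration result for illustration, not one from the paper:

```python
import numpy as np

# Hypothetical calibrated homography H mapping image pixels on the laser
# line to (x, z) world coordinates in the laser plane (units: mm).
H = np.array([[0.02,  0.0,  -6.4],
              [0.0,  -0.02,  4.8],
              [0.0,   0.0,   1.0]])

def image_to_world(uv):
    """Map (n, 2) pixel coordinates to laser-plane world coordinates via H."""
    uv1 = np.hstack([uv, np.ones((len(uv), 1))])   # homogeneous coordinates
    xw = (H @ uv1.T).T
    return xw[:, :2] / xw[:, 2:3]                  # perspective divide

pixels = np.array([[320.0, 240.0], [400.0, 200.0]])
world = image_to_world(pixels)
print(world)   # profile points in the laser plane, in world units
```

The calibration procedures compared in the paper differ in how they estimate this transform; applying it afterwards is the same cheap per-pixel operation either way.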
Flow profiling of a surface-acoustic-wave nanopump
Guttenberg, Z.; Rathgeber, A.; Keller, S.; Rädler, J. O.; Wixforth, A.; Kostur, M.; Schindler, M.; Talkner, P.
2004-11-01
The flow profile in a capillary gap and the pumping efficiency of an acoustic micropump employing surface acoustic waves are investigated both experimentally and theoretically. Ultrasonic surface waves on a piezoelectric substrate strongly couple to a thin liquid layer and generate a quadrupolar streaming pattern within the fluid. We use fluorescence correlation spectroscopy and fluorescence microscopy as complementary tools to investigate the resulting flow profile. The velocity was found to depend approximately linearly on the applied power and to decrease with the inverse third power of the distance from the ultrasound generator on the chip. The properties found reveal acoustic streaming as a promising tool for controlled agitation during microarray hybridization.
Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei
2016-01-01
Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851
Probability of a surface rupture offset beneath a nuclear test reactor
International Nuclear Information System (INIS)
Reed, J.W.; Meehan, R.L.; Crellin, G.L.
1981-01-01
A probabilistic analysis was conducted to determine the likelihood of a surface rupture offset of any size beneath the 50 megawatt General Electric Test Reactor (GETR), which is located at the Vallecitos Nuclear Center near Pleasanton, California. Geologic faults have been observed at the GETR site. These faults may be due to surface folds, landslides, or deep tectonic movement. They are referred to in the paper as 'existing faults;' however, use of this term does not imply that they are tectonic in origin. The objective of the analysis was to evaluate whether a conservative estimate of the probability of occurrence of a future fault movement is sufficiently low so that movement beneath the reactor building need not be considered as a design basis event. The reactor building is located between two existing faults which are approximately 1320 feet apart. If a fault movement occurs in the future, it is conservatively assumed to occur either on the existing faults or between the faults, or on a fault(s) and between the two faults at the same time. The probabilistic model included the possibility of movements occurring due to unknown, undiscovered faults in the region. For this part, movements were assumed to occur according to a Poisson process. For the possibility of new faults occurring due to the two existing faults, a hazard function was used which increases with time since the last offset. (orig./RW)
Surface activity, lipid profiles and their implications in cervical cancer.
Directory of Open Access Journals (Sweden)
Preetha A
2005-01-01
Background: The profiles of lipids in normal and cancerous tissues may differ, revealing information about cancer development and progression. Lipids being surface active, changes in lipid profiles can manifest as altered surface activity profiles. Langmuir monolayers offer a convenient model for evaluating the surface activity of biological membranes. Aims: The aims of this study were to quantify phospholipids and their effects on the surface activity of normal and cancerous human cervical tissues, and to evaluate the role of phosphatidylcholine (PC) and sphingomyelin (SM) in cervical cancer using Langmuir monolayers. Methods and Materials: Lipid quantification was done using thin layer chromatography and phosphorus assay. Surface activity was evaluated using Langmuir monolayers. Monolayers were formed on the surface of deionized water by spreading tissue organic phase corresponding to 1 mg of tissue and studying their surface pressure-area isotherms at body temperature. The PC and SM contents of cancerous human cervical tissues were higher than those of normal human cervical tissues. The roles of PC and SM were evaluated by adding varying amounts of these lipids to normal cervical pooled organic phase. Statistical analysis: Student's t-test (p < 0.05) and one-way analysis of variance (ANOVA) were used. Results: Our results reveal that the phosphatidylglycerol level in cancerous cervical tissue was nearly fivefold higher than that in normal cervical tissue. Also, PC and SM were found to be the major phospholipid components in cancerous and normal cervical tissues, respectively. The addition of either 1.5 µg DPPC or 0.5 µg SM per mg of tissue to the normal organic phase changed its surface activity profile to that of the cancerous tissues. Statistically significant surface activity parameters showed that PC and SM have remarkable roles in shifting the normal cervical lipophilic surface activity towards that of cancerous lipophilic
Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt
2017-11-01
When a location is cued, targets appearing at that location are detected more quickly. When a target feature is cued, targets bearing that feature are detected more quickly. These attentional cueing effects are only superficially similar. More detailed analyses find distinct temporal and accuracy profiles for the two different types of cues. This pattern parallels work with probability manipulations, where both feature and spatial probability are known to affect detection accuracy and reaction times. However, little has been done by way of comparing these effects. Are probability manipulations on space and features distinct? In a series of five experiments, we systematically varied spatial probability and feature probability along two dimensions (orientation or color). In addition, we decomposed response times into initiation and movement components. Targets appearing at the probable location were reported more quickly and more accurately regardless of whether the report was based on orientation or color. On the other hand, when either color probability or orientation probability was manipulated, response time and accuracy improvements were specific for that probable feature dimension. Decomposition of the response time benefits demonstrated that spatial probability only affected initiation times, whereas manipulations of feature probability affected both initiation and movement times. As detection was made more difficult, the two effects further diverged, with spatial probability disproportionally affecting initiation times and feature probability disproportionately affecting accuracy. In conclusion, all manipulations of probability, whether spatial or featural, affect detection. However, only feature probability affects perceptual precision, and precision effects are specific to the probable attribute.
Directory of Open Access Journals (Sweden)
Toochukwu Ekwutosi OGBULIE
2013-05-01
Plasmid analysis of bacteria isolated from agricultural soil experimentally contaminated with crude oil was carried out, and the resultant bands depicting the different molecular sizes of the plasmid DNA molecules per isolate were obtained. There was no visible band observed for Klebsiella, indicating that the organism lacks a plasmid DNA that confers degradative ability; possibly the gene is borne on the chromosomal DNA, which enabled its persistence in the polluted soil. Molecular characterization was undertaken to confirm the identities of the possible microorganisms that may be present in crude oil-contaminated soil. The DNA extracted and amplified in a PCR using EcoRI and EcoRV restriction enzymes for cutting the DNA of the bacterial cells showed no visible band for cuts made with the EcoRV restriction enzyme, indicating that the enzyme is not specific for the bacterial DNA of isolates in the samples; hence there was no amplification. By contrast, visible bands of amplicons were observed using the EcoRI restriction enzyme. The resultant visible bands of the microbial profile obtained using the universal RAPD primer with nucleotide sequence 5'-CTC AAA GCA TCT AGG TCC A-3' showed that only Pseudomonas fluorescens and Bacillus mycoides had visible bands at identical positions on the gel, indicating that both species possibly had identical sequences, or genes of negligible differences, coding for degradation of hydrocarbons, as shown by similar values in molecular weight and positions in the gel electrophoresis field.
Flow profiling of a surface acoustic wave nanopump
Guttenberg, Z.; Rathgeber, A.; Keller, S.; Rädler, J. O.; Wixforth, A.; Kostur, M.; Schindler, M.; Talkner, P.
2004-01-01
The flow profile in a capillary gap and the pumping efficiency of an acoustic micropump employing surface acoustic waves is investigated both experimentally and theoretically. Such ultrasonic surface waves on a piezoelectric substrate strongly couple to a thin liquid layer and generate an internal streaming within the fluid. Such acoustic streaming can be used for controlled agitation during, e.g., microarray hybridization. We use fluorescence correlation spectroscopy and fluorescence microscopy as complementary tools to investigate the resulting flow profile. The velocity was found to depend approximately linearly on the applied power and to decrease with the inverse third power of the distance from the ultrasound generator on the chip.
Optimization for sinusoidal profiles in surface relief gratings ...
Indian Academy of Sciences (India)
2014-02-07
... profilometry [7–9] and monitoring of surface self-diffusion of solids under ultrahigh vacuum conditions [10]. In the present work, recording parameters, i.e. exposure time and development time for fabrication of such holographic gratings, have been optimized to obtain nearly perfect sinusoidal profiles in the ...
Directory of Open Access Journals (Sweden)
V. I. Mordachev
2009-01-01
The paper presents results of modeling the signal probability distribution for radio-electronic equipment located above the Earth's surface at a specified height, and of determining the electromagnetic environment on the surface, in terms of a power parameter and angle of arrival, at an arbitrarily selected observation point on the Earth's surface.
Reducing Surface Clutter in Cloud Profiling Radar Data
Tanelli, Simone; Pak, Kyung; Durden, Stephen; Im, Eastwood
2008-01-01
An algorithm has been devised to reduce ground clutter in the data products of the CloudSat Cloud Profiling Radar (CPR), which is a nadir-looking radar instrument, in orbit around the Earth, that measures power backscattered by clouds as a function of distance from the instrument. Ground clutter contaminates the CPR data in the lowest 1 km of the atmospheric profile, heretofore making it impossible to use CPR data to satisfy the scientific interest in studying clouds and light rainfall at low altitude. The algorithm is based partly on the fact that the CloudSat orbit is such that the geodetic altitude of the CPR varies continuously over a range of approximately 25 km. As the geodetic altitude changes, the radar timing parameters are changed at intervals defined by flight software in order to keep the troposphere inside a data-collection time window. However, within each interval, the surface of the Earth continuously "scans through" (that is, it moves across) a few range bins of the data time window. For each radar profile, only a few samples [one for every range-bin increment (Δr = 240 m)] of the surface-clutter signature are available around the range bin in which the peak of surface return is observed, but samples in consecutive radar profiles are offset slightly (by amounts much less than Δr) with respect to each other according to the relative change in geodetic altitude. As a consequence, in a case in which the surface area under examination is homogenous (e.g., an ocean surface), a sequence of consecutive radar profiles of the surface in that area contains samples of the surface response with range resolution Δp much finer than the range-bin increment Δr. Estimating and subtracting this finely resolved surface-clutter signature yields a clutter reduction of about 10 dB and a reduction of the contaminated altitude over ocean from about 1 km to about 0.5 km. The algorithm has been embedded in CloudSat L1B processing as of Release 04 (July 2007), and the estimated flat surface clutter is removed in the L2B-GEOPROF product.
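The oversampling idea described in the abstract above can be sketched in a few lines: each profile samples the surface return only at the coarse range-bin spacing, but consecutive profiles are shifted by sub-bin amounts as the geodetic altitude drifts, so pooling the samples and sorting them by their sub-bin offset reconstructs the clutter signature at much finer range resolution. This is a minimal illustration, not the CloudSat flight code; the function name and the choice of a ±2-bin window around the surface peak are assumptions.

```python
import numpy as np

def oversample_surface_response(profiles, surface_bins, offsets, bin_size=240.0):
    """Pool surface-return samples from consecutive radar profiles.

    profiles     : (n_profiles, n_bins) array of received power
    surface_bins : index of the surface-peak range bin in each profile
    offsets      : sub-bin range offset (m) of the surface in each profile,
                   derived from the changing geodetic altitude
    Returns ranges (m, relative to the surface) and powers, merged and
    sorted into a single finer-resolution clutter signature.
    """
    ranges, powers = [], []
    for prof, sbin, off in zip(profiles, surface_bins, offsets):
        # take a small window of bins around the surface peak of this profile
        for k in range(-2, 3):
            ranges.append(k * bin_size - off)
            powers.append(prof[sbin + k])
    order = np.argsort(ranges)
    return np.asarray(ranges)[order], np.asarray(powers)[order]
```

Because the offsets are much smaller than the 240 m bin increment, the merged sequence interleaves samples from many profiles between any two native range bins.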
Hong Shen
2011-01-01
The concepts of curve profile, curve intercept, curve intercept density, curve profile area density, intersection density in containing intersection (or intersection density relative to an intersection reference), curve profile intersection density in surface (or curve intercept intersection density relative to the intersection of the containing curve), and curve profile area density in surface (AS) were defined. AS expresses the amount of curve profile area of the Y phase per unit containing surface area, S...
Fatty acid methyl ester profiles of bat wing surface lipids.
Pannkuk, Evan L; Fuller, Nathan W; Moore, Patrick R; Gilmore, David F; Savary, Brett J; Risch, Thomas S
2014-11-01
Sebocytes are specialized epithelial cells that rupture to secrete sebaceous lipids (sebum) across the mammalian integument. Sebum protects the integument from UV radiation and maintains host microbial communities, among other functions. Native glandular sebum is composed primarily of triacylglycerides (TAG) and wax esters (WE). Upon secretion (mature sebum), these lipids combine with minor cellular membrane components comprising total surface lipids. TAG and WE are further cleaved to smaller molecules through oxidation or host enzymatic digestion, resulting in a complex mixture of glycerolipids (e.g., TAG), sterols, unesterified fatty acids (FFA), WE, cholesteryl esters, and squalene comprising the surface lipid. We were interested in whether fatty acid methyl ester (FAME) profiling of bat surface lipid could predict species susceptibility to the cutaneous fungal disease, white nose syndrome (WNS). We collected sebaceous secretions from 13 bat spp. using Sebutape® and converted them to FAME with an acid-catalyzed transesterification. We found that Sebutape® adhesive patches removed ~6× more total lipid than Sebutape® indicator strips. Juvenile eastern red bats (Lasiurus borealis) had significantly higher 18:1 than adults, but 14:0, 16:1, and 20:0 were higher in adults. FAME profiles among several bat species were similar. We concluded that bat surface lipid FAME profiling does not provide a robust model for predicting species susceptibility to WNS. However, these results provide baseline data that can be used for lipid roles in future ecological studies, such as life history, diet, or migration.
Ignition probability of fine dead surface fuels in native Patagonia forests of Argentina
Energy Technology Data Exchange (ETDEWEB)
Bianchi, L.; Defosse, G. E.
2014-06-01
Aim of study: The Canadian Forest Fire Weather Index (FWI) is being implemented all over the world. This index has been adapted to Argentinean ecosystems since the year 2000. With the objective of calibrating the Fine Fuel Moisture Code (FFMC) of the FWI system to Patagonian forests, we studied the relationship between ignition probability and fine dead surface fuel moisture content (MC) as an indicator of potential fire ignition. Area of study: The study area is located in northwestern Patagonia, Argentina, and comprised two main forest types (cypress and ñire) grown under a Mediterranean climate, with a dry summer and precipitations during winter and autumn (~500-800 mm per year). Material and methods: We conducted lab ignition test fires to determine the threshold of fine dead fuel ignition at different MC levels. Moisture content of dead fine surface fuels in the field was measured every 10-15 days from November to March for three seasons. We calculated the FFMC during these seasons and correlated it with the measured MC by applying a logistic regression model. We combined the results of the ignition tests and of the regressions to suggest FFMC categories for estimating fire danger in Patagonian forests. Main results: The ignition threshold occurred at MC values of 21.5 and 25.0% for cypress and ñire sites, respectively. The MC measured varied from 7.3 to 129.6%, and the calculated FFMC varied between 13.4 and 92.6. Highly significant regressions resulted when FFMC was related to MC. The ignition threshold corresponded to a FFMC = 85. We proposed dividing the FFMC scale into three fire danger categories: Low (FFMC ≤ 85), High (85 < FFMC ≤ 89) and Extreme (FFMC > 89). Research highlights: Our results provide a useful tool for predicting fire danger in these ecosystems, and are a contribution to the development of the Argentinean Fire Danger Rating and a reference for similar studies in other countries where the FWI is being implemented. (Author)
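The proposed three-category danger scale reduces to a simple lookup on FFMC. The sketch below follows the thresholds quoted in the abstract; the logistic ignition-probability coefficients, by contrast, are purely illustrative placeholders (chosen so that P = 0.5 at the reported threshold FFMC = 85), not the fitted values from the study.

```python
import math

def ffmc_danger_category(ffmc: float) -> str:
    """Fire danger category per the FFMC breakpoints proposed in the study."""
    if ffmc <= 85.0:
        return "Low"
    if ffmc <= 89.0:
        return "High"
    return "Extreme"

def ignition_probability(ffmc: float, b0: float = -34.0, b1: float = 0.4) -> float:
    """Logistic P(ignition | FFMC). Coefficients b0, b1 are hypothetical
    placeholders; with these defaults P = 0.5 exactly at FFMC = 85."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * ffmc)))
```

A logistic form is the natural choice here because the lab tests give a binary ignite/no-ignite outcome at each moisture level, which is exactly the setting of the regression the authors describe.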
Surface influence upon vertical profiles in the nocturnal boundary layer
Garratt, J. R.
1983-05-01
Near-surface wind profiles in the nocturnal boundary layer, depth h, above relatively flat, tree-covered terrain are described in the context of the analysis of Garratt (1980) for the unstable atmospheric boundary layer. The observations at two sites imply a surface-based transition layer, of depth z*, within which the observed non-dimensional profiles Φ_M^0 are a modified form of the inertial sub-layer relation Φ_M(z/L) = 1 + 5z/L, according to Φ_M^0 ≈ (1 + 5z/L) exp[−0.7(1 − z/z*)], where z is height above the zero-plane displacement and L is the Monin-Obukhov length. At both sites the depth z* is significantly smaller than the appropriate neutral value (z*N) found from the previous analysis, as might be expected in the presence of a buoyant sink for turbulent kinetic energy.
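The modified profile relation quoted in the abstract can be evaluated directly; note that the formula here is reconstructed from a garbled rendering of the original equation, so the exponent's exact form and the 0.7 coefficient should be treated as indicative. The function name is an assumption.

```python
import math

def phi_m_modified(z: float, L: float, z_star: float) -> float:
    """Non-dimensional wind shear in the surface-based transition layer,
    per the relation quoted in the abstract:
        Phi_M^0 = (1 + 5 z/L) * exp(-0.7 (1 - z/z_star))  for z < z_star.
    At and above z_star it reduces to the inertial sub-layer form 1 + 5 z/L,
    so the profile is continuous at z = z_star."""
    phi_inertial = 1.0 + 5.0 * z / L
    if z >= z_star:
        return phi_inertial
    return phi_inertial * math.exp(-0.7 * (1.0 - z / z_star))
```

The exponential factor reduces the shear below z*, consistent with the transition-layer behavior the abstract describes.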
Control Surface Fault Diagnosis with Specified Detection Probability - Real Event Experiences
DEFF Research Database (Denmark)
Hansen, Søren; Blanke, Mogens
2013-01-01
... desired levels of false alarms and detection probabilities. Self-tuning residual generators are employed for diagnosis and are combined with statistical change detection to form a setup for robust fault diagnosis. On-line estimation of test statistics is used to obtain a detection threshold and a desired false alarm probability. A data-based method is used to determine the validity of the methods proposed. Verification is achieved using real data and shows that the presented diagnosis method is efficient and could have avoided incidents where faults led to loss of aircraft.
International Nuclear Information System (INIS)
Oo, T.N.; Iwata, T.; Kimura, M.; Akahane, T.
2005-01-01
The investigation of the surface alignment of liquid crystal (LC) multilayers evaporated on a photoaligned polyimide vertical alignment (PI-VA) film was carried out by means of a novel three-dimensional (3-D) surface profiler. The photoinduced anisotropy of the partially UV-exposed PI-VA film can be visualized as a topological image of the LC multilayers, and the topology of the LC multilayers appears to indicate the orientational distribution of LC molecules on the treated film. Moreover, it was shown that the surface profiler can be used to produce non-contact images with high vertical resolution (~0.01 nm). Copyright (2003) AD-TECH - International Foundation for the Advancement of Technology Ltd
Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei
2012-01-01
Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with
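The residue-based benchmark figures quoted above (Matthews correlation coefficient, accuracy, precision, sensitivity, specificity) all derive from the same four confusion-matrix counts. A minimal generic helper (not taken from the ISMBLab code) makes the relationships explicit:

```python
from math import sqrt

def classification_metrics(tp: int, fp: int, tn: int, fn: int):
    """Standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)   # a.k.a. recall
    specificity = tn / (tn + fp)
    # Matthews correlation coefficient: stays informative even when the
    # positive class (interface residues) is much rarer than the negative
    mcc = (tp * tn - fp * fn) / sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)
    )
    return accuracy, precision, sensitivity, specificity, mcc
```

MCC is the headline metric here precisely because interface residues are a minority class, where accuracy alone can look deceptively high.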
SURFACE BRIGHTNESS PROFILES OF DWARF GALAXIES. II. COLOR TRENDS AND MASS PROFILES
Energy Technology Data Exchange (ETDEWEB)
Herrmann, Kimberly A. [Penn State Mont Alto, 1 Campus Drive, Mont Alto, PA 17237 (United States); Hunter, Deidre A. [Lowell Observatory, 1400 West Mars Hill Road, Flagstaff, AZ 86001 (United States); Elmegreen, Bruce G., E-mail: kah259@psu.edu, E-mail: dah@lowell.edu, E-mail: bge@us.ibm.com [IBM T. J. Watson Research Center, 1101 Kitchawan Road, Yorktown Heights, NY 10598 (United States)
2016-06-01
In this second paper of a series, we explore the B − V, U − B, and FUV−NUV radial color trends from a multi-wavelength sample of 141 dwarf disk galaxies. Like spirals, dwarf galaxies have three types of radial surface brightness profiles: (I) single exponential throughout the observed extent (the minority), (II) down-bending (the majority), and (III) up-bending. We find that the colors of (1) Type I dwarfs generally become redder with increasing radius, unlike spirals which have a blueing trend that flattens beyond ∼1.5 disk scale lengths, (2) Type II dwarfs come in six different “flavors,” one of which mimics the “U” shape of spirals, and (3) Type III dwarfs have a stretched “S” shape where the central colors are flattish, become steeply redder toward the surface brightness break, then remain roughly constant beyond, which is similar to spiral Type III color profiles, but without the central outward bluing. Faint (−9 > M_B > −14) Type II dwarfs tend to have continuously red or “U” shaped colors and steeper color slopes than bright (−14 > M_B > −19) Type II dwarfs, which additionally have colors that become bluer or remain constant with increasing radius. Sm dwarfs and BCDs tend to have at least some blue and red radial color trend, respectively. Additionally, we determine stellar surface mass density (Σ) profiles and use them to show that the break in Σ generally remains in Type II dwarfs (unlike Type II spirals) but generally disappears in Type III dwarfs (unlike Type III spirals). Moreover, the break in Σ is strong, intermediate, and weak in faint dwarfs, bright dwarfs, and spirals, respectively, indicating that Σ may straighten with increasing galaxy mass. Finally, the average stellar surface mass density at the surface brightness break is roughly 1−2 M_⊙ pc⁻² for Type II dwarfs but higher at 5.9 M_⊙ pc⁻² or 27 M_⊙ pc⁻² for
Probability of growth of small damage sites on the exit surface of fused silica optics.
Negres, Raluca A; Abdulla, Ghaleb M; Cross, David A; Liao, Zhi M; Carr, Christopher W
2012-06-04
Growth of laser damage on fused silica optical components depends on several key parameters including laser fluence, wavelength, pulse duration, and site size. Here we investigate the growth behavior of small damage sites on the exit surface of SiO₂ optics under exposure to tightly controlled laser pulses. Results demonstrate that the onset of damage growth is not governed by a threshold, but is probabilistic in nature and depends both on the current size of a damage site and the laser fluence to which it is exposed. We also develop models for use in growth prediction. In addition, we show that laser exposure history also influences the behavior of individual sites.
Directory of Open Access Journals (Sweden)
Shaoyan Zhang
Chronic rhinosinusitis engenders enormous morbidity in the general population, and is often refractory to medical intervention. Compounds that augment mucociliary clearance in airway epithelia represent a novel treatment strategy for diseases of mucus stasis. A dominant fluid and electrolyte secretory pathway in the nasal airways is governed by the cystic fibrosis transmembrane conductance regulator (CFTR). The objectives of the present study were to test resveratrol, a strong potentiator of CFTR channel open probability, in preparation for a clinical trial of mucociliary activators in human sinus disease. Primary sinonasal epithelial cells, immortalized bronchoepithelial cells (wild type and F508del CFTR), and HEK293 cells expressing exogenous human CFTR were investigated by Ussing chamber as well as patch clamp technique under non-phosphorylating conditions. Effects on airway surface liquid depth were measured using confocal laser scanning microscopy. Impact on CFTR gene expression was measured by quantitative reverse transcriptase polymerase chain reaction. Resveratrol is a robust CFTR channel potentiator in numerous mammalian species. The compound also activated temperature-corrected F508del CFTR and enhanced CFTR-dependent chloride secretion in human sinus epithelium ex vivo to an extent comparable to the recently approved CFTR potentiator, ivacaftor. Using inside-out patches from apical membranes of murine cells, resveratrol stimulated an ~8 picosiemens chloride channel consistent with CFTR. This observation was confirmed in HEK293 cells expressing exogenous CFTR. Treatment of sinonasal epithelium resulted in a significant increase in airway surface liquid depth (in µm: 8.08 ± 1.68 vs. 6.11 ± 0.47, control, p < 0.05). There was no increase in CFTR mRNA. Resveratrol is a potent chloride secretagogue from the mucosal surface of sinonasal epithelium, and hydrates airway surface liquid by increasing CFTR channel open probability. The foundation for a
National Oceanic and Atmospheric Administration, Department of Commerce — Profile curvature was calculated from the bathymetry surface for each raster cell using the ArcGIS 3D Analyst "Curvature" Tool. Profile curvature describes the rate...
Theoretical fringe profiles with crossed Babinet compensators in testing concave aspheric surfaces.
Saxena, A K; Lancelot, J P
1982-11-15
This paper presents the theory for the use of crossed Babinet compensators in testing concave aspheric surfaces. Theoretical fringe profiles for a sphere and for an aspheric surface with primary aberration are shown. Advantages of this method are discussed.
Directory of Open Access Journals (Sweden)
P. J. Sheridan
2012-12-01
Between June 2006 and September 2009, an instrumented light aircraft measured over 400 vertical profiles of aerosol and trace gas properties over eastern and central Illinois. The primary objectives of this program were to (1) measure the in situ aerosol properties and determine their vertical and temporal variability and (2) relate these aircraft measurements to concurrent surface and satellite measurements. The primary profile location was within 15 km of the NOAA/ESRL surface aerosol monitoring station near Bondville, Illinois. Identical instruments at the surface and on the aircraft ensured that the data from both platforms would be directly comparable and permitted a determination of how representative surface aerosol properties were of the lower column. Aircraft profiles were also conducted occasionally at two other nearby locations to increase the frequency of A-Train satellite underflights for the purpose of comparing in situ and satellite-retrieved aerosol data. Measurements of aerosol properties conducted at low relative humidity over the Bondville site compare well with the analogous surface aerosol data and do not indicate any major sampling issues or that the aerosol is radically different at the surface compared with the lowest flyby altitude of ~240 m above ground level. Statistical analyses of the in situ vertical profile data indicate that aerosol light scattering and absorption (related to aerosol amount) decrease substantially with increasing altitude. Parameters related to the nature of the aerosol (e.g., single-scattering albedo, Ångström exponent, etc.), however, are relatively constant throughout the mixed layer, and do not vary as much as the aerosol amount throughout the profile. While individual profiles often showed more variability, the median in situ single-scattering albedo was 0.93–0.95 for all sampled altitudes. Several parameters (e.g., submicrometer scattering fraction, hemispheric backscattering fraction, and
MEASURING PROTOPLANETARY DISK GAS SURFACE DENSITY PROFILES WITH ALMA
Energy Technology Data Exchange (ETDEWEB)
Williams, Jonathan P.; McPartland, Conor, E-mail: jpw@ifa.hawaii.edu [Institute for Astronomy, University of Hawaii at Manoa, Honolulu, HI 96822 (United States)
2016-10-10
The gas and dust are spatially segregated in protoplanetary disks due to the vertical settling and radial drift of large grains. A fuller accounting of the mass content and distribution in disks therefore requires spectral line observations. We extend the modeling approach presented in Williams and Best to show that gas surface density profiles can be measured from high fidelity {sup 13}CO integrated intensity images. We demonstrate the methodology by fitting ALMA observations of the HD 163296 disk to determine a gas mass, M {sub gas} = 0.048 M {sub ⊙}, and accretion disk characteristic size R {sub c} = 213 au and gradient γ = 0.39. The same parameters match the C{sup 18}O 2–1 image and indicate an abundance ratio [{sup 12}CO]/[C{sup 18}O] of 700 independent of radius. To test how well this methodology can be applied to future line surveys of smaller, lower mass T Tauri disks, we create a large {sup 13}CO 2–1 image library and fit simulated data. For disks with gas masses 3–10 M {sub Jup} at 150 pc, ALMA observations with a resolution of 0.″2–0.″3 and integration times of ∼20 minutes allow reliable estimates of R {sub c} to within about 10 au and γ to within about 0.2. Economic gas imaging surveys are therefore feasible and offer the opportunity to open up a new dimension for studying disk structure and its evolution toward planet formation.
Jenkins, M B; Endale, D M; Fisher, D S; Gay, P A
2009-02-01
To better understand the transport and enumeration of dilute densities of Escherichia coli O157:H7 in agricultural watersheds, we developed a culture-based, five-tube multiple-dilution most probable number (MPN) method. The MPN method combined a filtration technique for large volumes of surface water with standard selective media, biochemical and immunological tests, and a TaqMan confirmation step. This method determined E. coli O157:H7 concentrations as low as 0.1 MPN per litre, with a 95% confidence interval of 0.01-0.7 MPN per litre. Escherichia coli O157:H7 densities ranged from not detectable to 9 MPN per litre for pond inflow, from not detectable to 0.9 MPN per litre for pond outflow and from not detectable to 8.3 MPN per litre for within pond. The MPN methodology was extended to mass flux determinations. Fluxes of E. coli O157:H7 ranged from 10(4) MPN per hour. This culture-based method can detect small numbers of viable/culturable E. coli O157:H7 in surface waters of watersheds containing animal agriculture and wildlife. This MPN method will improve our understanding of the transport and fate of E. coli O157:H7 in agricultural watersheds, and can serve as the basis of collections of environmental E. coli O157:H7.
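The MPN calculation behind such a dilution assay can be sketched as a maximum-likelihood estimate: under Poisson sampling, a tube inoculated with volume v is positive with probability 1 - exp(-λv). The function names and the grid-search fit below are illustrative, not the authors' protocol:

```python
import math

def mpn_log_likelihood(lam, volumes, positives, tubes=5):
    """Log-likelihood of concentration lam (MPN per litre), given the number
    of positive tubes observed at each inoculated volume (litres)."""
    ll = 0.0
    for v, pos in zip(volumes, positives):
        p = 1.0 - math.exp(-lam * v)      # P(tube positive) under Poisson sampling
        if pos > 0:
            ll += pos * math.log(p)
        ll += (tubes - pos) * (-lam * v)  # log P(tube negative) = -lam * v
    return ll

def mpn_estimate(volumes, positives, tubes=5):
    """Maximum-likelihood MPN (per litre) via a simple log-spaced grid search."""
    grid = [10 ** (k / 200.0) for k in range(-600, 601)]  # 0.001 to 1000 per litre
    return max(grid, key=lambda lam: mpn_log_likelihood(lam, volumes, positives, tubes))
```

For a five-tube series at 1, 0.1 and 0.01 litres with all five 1-litre tubes positive and the rest negative, the estimate lands near 2.3 MPN per litre; published MPN tables tabulate essentially this likelihood maximum.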
Temperature profiles on the gadolinium surface during electron beam evaporation
Energy Technology Data Exchange (ETDEWEB)
Ohba, Hironori; Shibata, Takemasa [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment
1995-03-01
The distributions of surface temperature of gadolinium in a water-cooled copper crucible during electron beam evaporation were measured by optical pyrometry. The surface temperatures were obtained from the radiation intensity ratio of the evaporating surface and a reference light source using Planck's law of radiation. The emitted radiation from the evaporating surface and the reference source was detected by a CCD sensor through a band-pass filter of 650 nm. The measured surface temperatures generally agreed with those estimated from the deposition rate and the saturated vapor pressure data. At high input powers, the measured values differed slightly from the estimated ones owing to variation of the surface condition. (author).
Temperature profiles on the gadolinium surface during electron beam evaporation
International Nuclear Information System (INIS)
Ohba, Hironori; Shibata, Takemasa
1995-01-01
The distributions of surface temperature of gadolinium in a water-cooled copper crucible during electron beam evaporation were measured by optical pyrometry. The surface temperatures were obtained from the radiation intensity ratio of the evaporating surface and a reference light source using Planck's law of radiation. The emitted radiation from the evaporating surface and the reference source was detected by a CCD sensor through a band-pass filter of 650 nm. The measured surface temperatures generally agreed with those estimated from the deposition rate and the saturated vapor pressure data. At high input powers, the measured values differed slightly from the estimated ones owing to variation of the surface condition. (author)
Anisotropic characterization of rock fracture surfaces subjected to profile analysis
International Nuclear Information System (INIS)
Zhou, H.W.; Xie, H.
2004-01-01
The mechanical parameters of a rock fracture depend on the anisotropy of its surface roughness. In this Letter, we show how to quantitatively describe the anisotropy of a rock fracture surface. A parameter, referred to as the index of the accumulated power spectral density psd*, is proposed to characterize this anisotropy. The variation of psd* with the orientation angle θ of sampling is also discussed
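The idea of comparing roughness between sampling orientations can be illustrated with a frequency-weighted accumulated PSD of profiles taken along each direction. The weighting below is an illustrative stand-in, since the exact definition of psd* is given in the Letter:

```python
import numpy as np

def weighted_accumulated_psd(profile, dx=1.0):
    """Frequency-weighted integral of the 1-D PSD of a sampled profile
    (an RMS-slope-like roughness measure; illustrative stand-in for psd*)."""
    n = len(profile)
    f = np.fft.rfft(profile - profile.mean())
    psd = np.abs(f) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=dx)
    return float(np.sum(freqs ** 2 * psd))

# Synthetic anisotropic "fracture surface": white noise smoothed along y only,
# so y-direction profiles are smooth while x-direction profiles stay rough.
rng = np.random.default_rng(1)
n, w = 256, 15
noise = rng.standard_normal((n + w - 1, n))
kernel = np.ones(w) / w
surface = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, noise)

rough_x = np.mean([weighted_accumulated_psd(row) for row in surface])
rough_y = np.mean([weighted_accumulated_psd(col) for col in surface.T])
```

Scanning such a measure over orientation angle θ traces out exactly the kind of anisotropy curve the abstract describes: here `rough_x` comes out much larger than `rough_y`.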
How well Can We Classify SWOT-derived Water Surface Profiles?
Frasson, R. P. M.; Wei, R.; Picamilh, C.; Durand, M. T.
2015-12-01
The upcoming Surface Water Ocean Topography (SWOT) mission will detect water bodies and measure water surface elevation throughout the globe. Within its continental high resolution mask, SWOT is expected to deliver measurements of river width, water elevation and slope of rivers wider than ~50 m. The definition of river reaches is an integral step of the computation of discharge based on SWOT's observables. As poorly defined reaches can negatively affect the accuracy of discharge estimations, we seek strategies to break up rivers into physically meaningful sections. In the present work, we investigate how accurately we can classify water surface profiles based on simulated SWOT observations. We assume that most river sections can be classified as either M1 (mild slope, with depth larger than the normal depth), or A1 (adverse slope with depth larger than the critical depth). This assumption allows the classification to be based solely on the second derivative of water surface profiles, with convex profiles being classified as A1 and concave profiles as M1. We consider a HEC-RAS model of the Sacramento River as a representation of the true state of the river. We employ the SWOT instrument simulator to generate a synthetic pass of the river, which includes our best estimates of height measurement noise and geolocation errors. We process the resulting point cloud of water surface heights with the RiverObs package, which delineates the river center line and draws the water surface profile. Next, we identify inflection points in the water surface profile and classify the sections between the inflection points. Finally, we compare our limited classification of simulated SWOT-derived water surface profile to the "exact" classification of the modeled Sacramento River. With this exercise, we expect to determine if SWOT observations can be used to find inflection points in water surface profiles, which would bring knowledge of flow regimes into the definition of river reaches.
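The classification rule described above, convex profiles (positive second derivative) as A1 and concave profiles as M1 with breaks at inflection points, can be sketched as follows; the smoothing window and section bookkeeping are illustrative choices, not the authors' implementation:

```python
import numpy as np

def classify_sections(x, h, smooth=5):
    """Split a water-surface profile h(x) at inflection points and label each
    section 'A1' (convex, d2h/dx2 > 0) or 'M1' (concave, d2h/dx2 < 0).
    The smoothing window is an illustrative choice for noisy SWOT heights."""
    kernel = np.ones(smooth) / smooth
    hs = np.convolve(h, kernel, mode="same")  # curvature amplifies noise
    d2 = np.gradient(np.gradient(hs, x), x)   # second derivative on the grid
    sign = np.sign(d2)
    breaks = np.flatnonzero(np.diff(sign) != 0) + 1  # inflection points
    sections, start = [], 0
    for b in list(breaks) + [len(x)]:
        label = "A1" if d2[start:b].mean() > 0 else "M1"
        sections.append((start, b, label))
        start = b
    return sections
```

On a purely concave test profile such as h = -x² the whole reach comes back as a single M1 section, and on h = x² as a single A1 section, which matches the rule stated in the abstract.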
Comparison of two methods of surface profile extraction from multiple ultrasonic range measurements
Barshan, B; Baskent, D
Two novel methods for surface profile extraction based on multiple ultrasonic range measurements are described and compared. One of the methods employs morphological processing techniques, whereas the other employs a spatial voting scheme followed by simple thresholding. Morphological processing
The way we measure: comparison of methods to derive radial surface brightness profiles
Peters, S. P. C.; van der Kruit, P. C.; de Jong, R. S.
The breaks and truncations in the luminosity profile of face-on spiral galaxies offer valuable insights in their formation history. The traditional method of deriving the surface photometry profile for face-on galaxies is to use elliptical averaging. In this paper, we explore the question whether
The outer disks of early-type galaxies. I. Surface-brightness profiles of barred galaxies
Erwin, Peter; Pohlen, Michael; Beckman, John E.
We present a study of 66 barred, early-type (S0-Sb) disk galaxies, focused on the disk surface brightness profile outside the bar region, with the aim of throwing light on the nature of Freeman type I and II profiles, their origins, and their possible relation to disk truncations. This paper
Deep learning for galaxy surface brightness profile fitting
Tuccillo, D.; Huertas-Company, M.; Decencière, E.; Velasco-Forero, S.; Domínguez Sánchez, H.; Dimauro, P.
2018-03-01
Numerous ongoing and future large area surveys (e.g. Dark Energy Survey, EUCLID, Large Synoptic Survey Telescope, Wide Field Infrared Survey Telescope) will increase by several orders of magnitude the volume of data that can be exploited for galaxy morphology studies. The full potential of these surveys can be unlocked only with the development of automated, fast, and reliable analysis methods. In this paper, we present DeepLeGATo, a new method for 2-D photometric galaxy profile modelling, based on convolutional neural networks. Our code is trained and validated on analytic profiles (HST/CANDELS F160W filter) and is able to retrieve the full set of parameters of one-component Sérsic models: total magnitude, effective radius, Sérsic index, and axis ratio. We show detailed comparisons between our code and GALFIT. On simulated data, our method is more accurate than GALFIT and ~3000 times faster on GPU (~50 times faster when running on the same CPU). On real data, DeepLeGATo trained on simulations behaves similarly to GALFIT on isolated galaxies. With a fast domain adaptation step made with only 0.1-0.8 per cent of the size of the training set, our code easily reproduces the results obtained with GALFIT even in crowded regions. DeepLeGATo does not require any human intervention beyond the training step, rendering it much more automated than traditional profiling methods. The development of this method for more complex models (two-component galaxies, variable point spread function, dense sky regions) could constitute a fundamental tool in the era of big data in astronomy.
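The one-component Sérsic model whose parameters are retrieved can be written down directly; the b_n formula below is the common Ciotti and Bertin series approximation, included here as a sketch rather than the paper's implementation:

```python
import math

def sersic(r, i_e, r_e, n):
    """Sersic surface-brightness profile I(r) = I_e * exp(-b_n((r/r_e)^(1/n) - 1)),
    the one-component model fitted by galaxy profile-modelling codes.
    b_n is chosen so that r_e encloses half the total light."""
    b_n = 2.0 * n - 1.0 / 3.0 + 4.0 / (405.0 * n)  # Ciotti & Bertin approximation
    return i_e * math.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))
```

By construction I(r_e) = I_e, and the profile falls monotonically with radius; n = 4 gives the classical de Vaucouleurs profile and n = 1 an exponential disk.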
Directory of Open Access Journals (Sweden)
Nandi Soumyadeep
2007-03-01
Full Text Available Abstract Background Profile Hidden Markov Models (HMMs) are statistical representations of protein families derived from patterns of sequence conservation in multiple alignments and have been used in identifying remote homologues with considerable success. These conservation patterns arise from fold-specific signals, shared across multiple families, and function-specific signals unique to the families. The availability of sequences pre-classified according to their function permits the use of negative training sequences to improve the specificity of the HMM, both by optimizing the threshold cutoff and by modifying emission probabilities to minimize the influence of fold-specific signals. A protocol to generate family-specific HMMs is described that first constructs a profile HMM from an alignment of the family's sequences and then uses this model to identify sequences belonging to other classes that score above the default threshold (false positives). Ten-fold cross validation is used to optimise the discrimination threshold score for the model. The advent of fast multiple alignment methods enables the use of the profile alignments to align the true and false positive sequences, and the resulting alignments are used to modify the emission probabilities in the original model. Results The protocol, called HMM-ModE, was validated on a set of sequences belonging to six sub-families of the AGC family of kinases. These sequences have an average sequence similarity of 63% among the group, though each sub-group has a different substrate specificity. The optimisation of the discrimination threshold, using negative sequences scored against the model, improves specificity in test cases from an average of 21% to 98%. Further discrimination by the HMM after modifying model probabilities using negative training sequences is achieved in a few cases, the average specificity rising to 99%. Similar improvements were obtained with a sample of G-Protein coupled receptors
Geology along topographic profile for near-surface test facility
International Nuclear Information System (INIS)
Fecht, K.R.
1978-01-01
The U.S. Department of Energy, through the Basalt Waste Isolation Program within Rockwell Hanford Operations, is investigating the feasibility of terminal storage of radioactive waste in deep caverns constructed in the Columbia River Basalt. A portion of the geological work conducted in support of the Engineering Design Unit to evaluate the west end of Gable Mountain as a site for in situ testing of the thermomechanical behavior of basalt is reported. The surficial geology of the west end of Gable Mountain was mapped in a reconnaissance fashion at a scale of 1:62,500 to identify geologic features which could affect siting of the proposed facilities. A detailed study of the geological conditions was conducted along a traverse across the most probable site for the proposed project
Surface density profile and surface tension of the one-component classical plasma
International Nuclear Information System (INIS)
Ballone, P.; Senatore, G.; Trieste Univ.; Tosi, M.P.; Oxford Univ.
1982-08-01
The density profile and the interfacial tension of two classical plasmas in equilibrium at different densities are evaluated in the square-density-gradient approximation. For equilibrium in the absence of applied external voltage, the profile is oscillatory in the higher-density plasma and the interfacial tension is positive. The amplitude and phase of these oscillations and the magnitude of the interfacial tension are related to the width of the background profile. Approximate representations of the equilibrium profile by matching of its asymptotic forms are analyzed. A comparison with computer simulation data and a critical discussion of a local-density theory are also presented. (author)
Abu-Nabah, Bassam A.
Recent research results indicated that eddy current conductivity measurements can be exploited for nondestructive evaluation of near-surface residual stresses in surface-treated nickel-base superalloy components. Most of the previous experimental studies were conducted on highly peened (Almen 10-16A) specimens that exhibit harmful cold work in excess of 30% plastic strain. Such a high level of cold work causes thermo-mechanical relaxation at relatively modest operational temperatures; the obtained results were therefore not directly relevant to engine manufacturers and end users. The main reason for choosing peening intensities in excess of recommended normal levels was that, in low-conductivity engine alloys, the eddy current penetration depth could not be forced below 0.2 mm without extending the measurements above 10 MHz, which is beyond the operational range of most commercial eddy current instruments. As for shot-peened components, it was initially felt that the residual stress effect was more difficult to separate from cold work, texture, and inhomogeneity effects in titanium alloys than in nickel-base superalloys. In addition, titanium alloys have almost 50% lower electrical conductivity than nickel-base superalloys and therefore require proportionally higher inspection frequencies, which were not feasible until our recent breakthrough in instrument development. Our work has been focused on six main aspects of this continuing research, namely, (i) the development of an iterative inversion technique to better retrieve the depth-dependent conductivity profile from the measured frequency-dependent apparent eddy current conductivity (AECC), (ii) the extension of the frequency range up to 80 MHz to better capture the peak compressive residual stress in nickel-base superalloys using a new eddy current conductivity measuring system, which offers better reproducibility, accuracy and measurement speed than the previously used conventional systems, (iii) the lift-off effect on
Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.
2010-01-01
A technique has been developed to calculate the probability that any nearby lightning stroke is within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than the error ellipses centered on the stroke. This process takes the current bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to get the probability that the stroke is inside any specified radius. This new facility-centric technique will be much more useful to the space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
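The integration step can be sketched with a Monte Carlo estimate over the error distribution; the function name and the axis-aligned error-ellipse assumption are illustrative, and the operational technique integrates the bivariate density numerically:

```python
import math
import random

def prob_within_radius(dx, dy, sx, sy, r, n=200_000, seed=7):
    """Estimate P(stroke within distance r of a facility) when the stroke's
    reported location lies (dx, dy) from the facility and the location error
    is an axis-aligned bivariate Gaussian with standard deviations (sx, sy)."""
    rng = random.Random(seed)
    hits = sum(
        1
        for _ in range(n)
        if math.hypot(rng.gauss(dx, sx), rng.gauss(dy, sy)) <= r
    )
    return hits / n
```

For a stroke centred on the facility with unit circular error (sx = sy = 1), the closed form P = 1 - exp(-r²/2) provides a direct check on the estimate.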
DEFF Research Database (Denmark)
Stangegaard, Michael; Wang, Zhenyu; Kutter, Jörg Peter
2006-01-01
There is an ever increasing need to find surfaces that are biocompatible for applications like medical implants and microfluidics-based cell culture systems. The biocompatibility of five different surfaces with different hydrophobicity was determined using gene expression profiling as well as more...
Generating strain signals under consideration of road surface profiles
Putra, T. E.; Abdullah, S.; Schramm, D.; Nuawi, M. Z.; Bruckmann, T.
2015-08-01
The current study aimed to develop a mechanism for generating strain signals utilising computer-based simulation. The strain data, induced by the acceleration, were taken from a fatigue data acquisition involving car movements. Using a mathematical model, the measured strain signals were converted into acceleration data describing the bumpiness of road surfaces. The acceleration signals were then treated as an external disturbance when generating strain signals. A comparison showed that the actual and simulated strain data have a similar pattern. The results are expected to provide new knowledge for generating a strain signal via simulation.
Yu, Han
2016-04-26
We demonstrate that diffraction stack migration can be used to discover the distribution of near-surface faults. The methodology is based on the assumption that near-surface faults generate detectable back-scattered surface waves from impinging surface waves. We first isolate the back-scattered surface waves by muting or FK filtering, and then migrate them by diffraction migration using the surface wave velocity as the migration velocity. Instead of summing events along trial quasi-hyperbolas, surface wave migration sums events along trial quasi-linear trajectories that correspond to the moveout of back-scattered surface waves. We have also proposed a natural migration method that utilizes the intrinsic traveltime property of the direct and the back-scattered waves at faults. For the synthetic data sets and the land data collected in Aqaba, where surface wave velocity has unexpected perturbations, we migrate the back-scattered surface waves with both predicted velocity profiles and natural Green's function without velocity information. Because the latter approach avoids the need for an accurate velocity model in event summation, both the prestack and stacked migration images show competitive quality. Results with both synthetic data and field records validate the feasibility of this method. We believe applying this method to global or passive seismic data can open new opportunities in unveiling tectonic features.
Chemical profiles of body surfaces and nests from six Bornean stingless bee species.
Leonhardt, Sara Diana; Blüthgen, Nico; Schmitt, Thomas
2011-01-01
Stingless bees (Apidae: Meliponini) are the most diverse group of Apid bees and represent common pollinators in tropical ecosystems. Like honeybees they live in large eusocial colonies and rely on complex chemical recognition and communication systems. In contrast to honeybees, their ecology and especially their chemical ecology have received only little attention, particularly in the Old World. We previously analyzed the chemical profiles of six paleotropical stingless bee species from Borneo and revealed the presence of species-specific cuticular terpenes, an environmentally derived compound class so far unique among social insects. Here, we compared the bees' surface profiles to the chemistry of their nest material. Terpenes, alkanes, and alkenes were the dominant compound groups on both body surfaces and nest material. However, bee profiles and nests strongly differed in their chemical composition. Body surfaces thus did not merely mirror nests, rendering a passive compound transfer from nests to bees unlikely. The difference between nests and bees was particularly pronounced when all resin-derived compounds (terpenes) were excluded and only genetically determined compounds were considered. When terpenes were included, bee profiles and nest material still differed, because whole groups of terpenes (e.g., sesquiterpenes) were found in the nest material of some species, but missing in their chemical profile, indicating that bees are able to influence the terpene composition both in their nests and on their surfaces.
International Nuclear Information System (INIS)
Šimek, M; Ambrico, P F
2012-01-01
Radial distributions of electronically excited species produced during surface streamer propagation were obtained by applying the Abel inverse transform to projected luminosities of single streamers. The streamers were generated in an argon and nitrogen surface coplanar dielectric barrier discharge at atmospheric pressure and their magnified microscopic images were registered with high time resolution. Selected regions of the projected luminosities were processed by the Abel inverse transform procedure based on the Hankel–Fourier method assuming cylindrical symmetry of the streamer channel. Projected as well as Abel-inverted profiles were fitted by Gaussian functions. It is shown that the projected profiles, in addition to the Abel-inverted ones, can be well approximated by the sum of two coaxial Gaussians with two different half-widths and weights. The sharper Gaussian component with higher weight characterizes the radial dependence of the primary cathode-directed streamer-channel luminosity. The second (broader) Gaussian component probably originates either from the pre-breakdown Townsend phase or from the second wave propagating towards the anode. (paper)
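The two-coaxial-Gaussian decomposition of a projected profile can be sketched with a width-grid scan plus linear least squares for the two amplitudes; this is an illustrative fit only, not the Hankel-Fourier Abel-inversion procedure used in the paper:

```python
import numpy as np

def fit_two_gaussians(r, y, widths):
    """Fit y(r) ~ a1*exp(-(r/w1)^2) + a2*exp(-(r/w2)^2), with w1 < w2, by
    scanning width pairs and solving the amplitudes by linear least squares."""
    best = None
    for w1 in widths:
        for w2 in widths:
            if w2 <= w1:
                continue
            basis = np.column_stack(
                [np.exp(-(r / w1) ** 2), np.exp(-(r / w2) ** 2)]
            )
            amps, *_ = np.linalg.lstsq(basis, y, rcond=None)
            err = float(np.sum((basis @ amps - y) ** 2))
            if best is None or err < best[0]:
                best = (err, amps[0], w1, amps[1], w2)
    return best[1:]

# Synthetic projected luminosity: sharp bright core plus a broad weak halo,
# mimicking the primary streamer channel and the broader second component.
r = np.linspace(-2.0, 2.0, 401)
profile = 1.0 * np.exp(-(r / 0.2) ** 2) + 0.3 * np.exp(-(r / 1.0) ** 2)
a1, w1, a2, w2 = fit_two_gaussians(r, profile, np.arange(0.1, 1.55, 0.05))
```

The fit recovers the sharp high-weight component (a1 ≈ 1.0, w1 ≈ 0.2) and the broad low-weight one (a2 ≈ 0.3, w2 ≈ 1.0) separately, which is the decomposition the abstract describes.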
Surface layer and bloom dynamics observed with the Prince William Sound Autonomous Profiler
Campbell, R. W.
2016-02-01
As part of a recent long-term monitoring effort, deployments of a WETLabs Autonomous Moored Profiler (AMP) began in Prince William Sound (PWS) in 2013. The PWS AMP consists of a positively buoyant instrument frame, with a winch and associated electronics that profiles the frame from a park depth (usually 55 m) to the surface by releasing and retrieving a thin UHMWPE tether; it generally conducts a daily cast and measures temperature, salinity, chlorophyll-a fluorescence, turbidity, and oxygen and nitrate concentrations. Upward and downward looking ADCPs are mounted on a float below the profiler, and an in situ plankton imager is in development and will be installed in 2016. Autonomous profilers are a relatively new technology, and early deployments experienced a number of failures from which valuable lessons may be learned. Nevertheless, an unprecedented time series of the seasonal biogeochemical progression in the surface waters of the coastal Gulf of Alaska was collected in 2014 and 2015. The northern Gulf of Alaska has experienced a widespread warm anomaly since early 2014, and surface layer temperature anomalies in PWS were strongly positive during winter 2014. The spring bloom observed by the profiler began 2-3 weeks earlier than average, with surface nitrate depleted by late April. Although surface temperatures were still above average in 2015, bloom timing was much later, with a short vigorous bloom in late April and a subsurface bloom in late May that coincided with significant nitrate drawdown. In addition to the vernal blooms, wind-driven upwelling events led to several small productivity pulses that were evident in changes in nitrate and oxygen concentrations and chlorophyll-a fluorescence. As well as providing a mechanistic understanding of surface layer biogeochemistry, high frequency observations such as these put historical observations in context, and provide new insights into the scales of variability in the annual cycles of the surface ocean in the North
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
Heterogeneous free-surface profile of B4C polycrystal under shock compression
International Nuclear Information System (INIS)
Mashimo, T.; Uchino, M.
1997-01-01
Observations of the free-surface behavior under shock compression by the gapped-flat mirror method were performed on B4C and Si3N4 ceramics to study their shock-yielding properties. Jagged profiles of the moving free surface in the plastic region, with a spatial scale of about one mm and a maximum local displacement of a few tens of μm, were observed for B4C polycrystals. The corresponding profiles for Si3N4 polycrystals were smooth. Such jagged profiles for B4C polycrystals were also observed in the elastic region. It is suggested that these observations reflect the heterogeneous nature of shock compression in solids, and further indicate that a macroscopic slip system plays an important role in the elastoplastic transition of B4C material under shock compression and decompression. copyright 1997 American Institute of Physics
Fekete, Gábor; Fodor, Emese; Pesznyák, Csilla
2015-03-08
A novel method has been put forward for very large electron beam profile measurement. With this method, absorbed dose profiles can be measured at any depth in a solid phantom for total skin electron therapy. Electron beam dose profiles were collected with two different methods. Profile measurements were performed at 0.2 and 1.2 cm depths with a parallel plate and a thimble chamber, respectively. 108 cm × 108 cm and 45 cm × 45 cm projected size electron beams were scanned by vertically moving the phantom and detector at 300 cm source-to-surface distance with 90° and 270° gantry angles. The profiles collected this way were used as reference. Afterwards, the phantom was fixed on the central axis and the gantry was rotated in certain angular steps. After applying corrections for the different source-to-detector distances and angles of incidence, the profiles measured in the two setups were compared. A correction formalism has been developed. The agreement between the cross profiles taken at the depth of maximum dose with the 'classical' scanning and with the new moving gantry method was better than 0.5% in the measuring range from zero to 71.9 cm. Inverse-square and attenuation corrections had to be applied. The profiles measured with the parallel plate chamber agree to better than 1%, except for the penumbra region, where the maximum difference is 1.5%. With the moving gantry method, very large electron field profiles can be measured at any depth in a solid phantom with high accuracy and reproducibility and with much less time per step. No special instrumentation is needed. The method can be used for commissioning of very large electron beams for computer-assisted treatment planning, for designing beam modifiers to improve dose uniformity, and for verification of computed dose profiles.
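The distance part of such corrections can be sketched as follows; the exponential air-attenuation term and its coefficient are illustrative placeholders, since the paper derives its own correction formalism:

```python
import math

def corrected_dose(reading, ssd, offset, mu=0.0):
    """Scale an off-axis dose reading back to the reference source-to-surface
    distance ssd: inverse-square for the longer oblique path to an off-axis
    point, plus an optional exponential attenuation of the extra path length.
    mu is a hypothetical linear attenuation coefficient (per cm)."""
    d = math.hypot(ssd, offset)       # oblique source-to-detector distance
    inverse_square = (d / ssd) ** 2   # inverse-square correction factor
    attenuation = math.exp(mu * (d - ssd))
    return reading * inverse_square * attenuation
```

At the 71.9 cm edge of the reported measuring range and a 300 cm source-to-surface distance, the inverse-square factor alone is about 1.06, so corrections of this size are clearly needed to reach sub-percent agreement.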
International Nuclear Information System (INIS)
Shen, Y.; Lo, C. C. H.; Frishman, A. M.; Lee, C.; Nakagawa, N.
2007-01-01
This paper describes an eddy current model-based method for inverting near-surface conductivity deviation profiles of surface treated materials from swept-high frequency eddy current (SHFEC) data. This work forms part of our current research directed towards the development of an electromagnetic nondestructive technique for assessing residual stress of shot-peened superalloy components. The inversion procedure is based on the use of a parameterized function to describe the near-surface conductivity as a function of depth for a shot-peened surface, and the laterally uniform multi-layer theory of Cheng, Dodd and Deeds to calculate the resulting coil impedance deviations. The convergence of the inversion procedure has been tested against synthesized eddy current data. As a demonstration, the conductivity deviation profiles of a series of Inconel 718 specimens, shot peened at various Almen intensities, have been obtained by inversion. Several consistency tests were conducted to examine the reliability of the inverted conductivity profiles. The results show that conductivity deviation profiles can be reliably determined from SHFEC data within the accuracy of the current measurement system
Compact Wideband and Low-Profile Antenna Mountable on Large Metallic Surfaces
DEFF Research Database (Denmark)
Zhang, Shuai; Pedersen, Gert F.
2017-01-01
This paper proposes a compact wideband and low-profile antenna mountable on large metallic surfaces. Six rows of coupled microstrip resonators with different lengths are printed on a Teflon block. The lengths of the microstrip resonators in different rows are gradually reduced along the end-fire...
Generalized Probability-Probability Plots
Mushkudiani, N.A.; Einmahl, J.H.J.
2004-01-01
We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
ANALYSIS OF THE SURFACE PROFILE AND ITS MATERIAL SHARE DURING THE GRINDING INCONEL 718 ALLOY
Directory of Open Access Journals (Sweden)
Martin Novák
2015-05-01
Full Text Available Grinding is still an important method of surface finishing. At FPTM JEPU, research dealing with this issue is conducted. Experiments are carried out in which various materials are ground under different conditions, and selected components of the surface integrity are then evaluated. These include the roughness parameters Ra, Rm and Rz, the material ratio curve (Abbott-Firestone curve) and the obtained roundness. This article deals with grinding the nickel alloy Inconel 718: selected grinding conditions were used, and the surface profile and the material ratio curve were subsequently measured and evaluated.
International Nuclear Information System (INIS)
Elliner, D.I.
1999-09-01
The continual reduction in the size of semiconductor structures and the depths of junctions is putting a greater strain on characterization techniques. Accurate device and process modelling requires quantified electrical and dopant profiles from the topmost few nanometres. Secondary ion mass spectrometry (SIMS) is an analytical technique commonly used in the semiconductor industry to measure concentration depth profiles. To allow the quantification of features that are closer to the surface, lower energy ions are employed, which also improves the available depth resolution. The development of the floating ion gun (FLIG) has made it possible to use sub-keV beam energies on a routine basis, allowing quantified dopant profiles to be obtained within the first few nanometres of the surface. This thesis demonstrates that, when profiling with oxygen ion beams, the greatest certainty in the retained dose is achieved at normal incidence, and that when analysing boron, accurate profile shapes are only obtained when the primary beam energy is less than half that of the implant. It was shown that it is now possible to profile, though with slower erosion rates and a limited dynamic range, with 100 eV oxygen (O2+) ion beams. Profile features that had developed during rapid thermal annealing, and that could only be observed when ultra-low energy ion beams were used, were investigated using various analytical techniques. Explanations of the apparently inactive dopant were proposed, including suggestions for cluster molecules. The oxide thickness of fully formed altered layers has also been investigated. The results indicate that a fundamental change in the mechanism of oxide formation occurs, and that interfaces sharper than those grown by thermal oxidation can be produced using sub-keV ion beams. (author)
Measurement of surface temperature profiles on liquid uranium metal during electron beam evaporation
Energy Technology Data Exchange (ETDEWEB)
Ohba, Hironori; Shibata, Takemasa [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment
1998-11-01
Surface temperature distributions of liquid uranium in a water-cooled copper crucible during electron beam evaporation were measured. The evaporation surface was imaged by a lens through a band-pass filter (650±5 nm) and a double mirror system onto a charge-coupled device (CCD) camera. The video signals of the recorded image were fed to an image processor and converted to two-dimensional spectral radiance profiles. The surface temperatures were obtained from the ratio of the spectral radiance of the evaporation surface to that at the freezing point of uranium and/or to that of a reference light source, using Planck's law of radiation. The maximum temperature exceeded 3000 K and tended to saturate with increasing electron beam input. The measured surface temperatures agreed with those estimated from deposition rates and saturated vapor pressure data for uranium. (author)
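The radiance-ratio step can be sketched in a few lines, using Wien's approximation to Planck's law and the uranium freezing point (about 1405 K) as the reference radiance; the 650 nm wavelength matches the filter, and the target temperature is illustrative:

```python
import numpy as np

C2 = 1.4388e-2          # second radiation constant, m*K
LAM = 650e-9            # filter wavelength, m
T_REF = 1405.0          # uranium freezing point, K (reference radiance)

def wien_radiance(T):
    """Spectral radiance up to a constant factor (Wien approximation)."""
    return np.exp(-C2 / (LAM * T))

def temperature_from_ratio(ratio):
    """Invert the radiance ratio L(T)/L(T_REF) for the unknown temperature."""
    return 1.0 / (1.0 / T_REF - (LAM / C2) * np.log(ratio))

T_true = 3000.0
ratio = wien_radiance(T_true) / wien_radiance(T_REF)
print(f"recovered T = {temperature_from_ratio(ratio):.1f} K")
```

Because the ratio cancels the unknown instrument constant, only the reference temperature and wavelength enter the inversion; at 650 nm and 3000 K the Wien form is an excellent approximation to the full Planck law.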
Determining the near-surface current profile from measurements of the wave dispersion relation
Smeltzer, Benjamin; Maxwell, Peter; Aesøy, Eirik; Ellingsen, Simen
2017-11-01
The current-induced Doppler shifts of waves can yield information about the background mean flow, providing an attractive method of inferring the current profile in the upper layer of the ocean. We present measurements of waves propagating on shear currents in a laboratory water channel, as well as theoretical investigations of inversion techniques for determining the vertical current structure. Spatial and temporal measurements of the free surface profile obtained using a synthetic Schlieren method are analyzed to determine the wave dispersion relation and Doppler shifts as a function of wavelength. The vertical current profile can then be inferred from the Doppler shifts using an inversion algorithm. Most existing algorithms rely on a priori assumptions about the shape of the current profile, and developing a method that uses less stringent assumptions is a focus of this study, allowing for measurement of more general current profiles. The accuracy of current inversion algorithms is evaluated by comparison to measurements of the mean flow profile from particle image velocimetry (PIV), and a discussion of the sensitivity to errors in the Doppler shifts is presented.
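One classical route to such an inversion treats each measured Doppler shift as an exponentially depth-weighted average of the current (the Stewart and Joy 1974 approximation) and assigns it to an effective depth z = -1/(2k). The sketch below is that naive mapping, not the study's own algorithm; the linear shear profile and wavenumber range are assumptions for the demo:

```python
import numpy as np

def trapezoid(y, x):
    """Simple trapezoidal rule (avoids NumPy version differences)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def doppler_shift(U, k, z):
    """Stewart-Joy exponentially depth-weighted average of U(z), z <= 0."""
    w = 2.0 * k * np.exp(2.0 * k * z)
    return trapezoid(w * U(z), z)

U_true = lambda z: 0.10 + 0.50 * z            # assumed linear shear profile, m/s
z = np.linspace(-20.0, 0.0, 4000)             # depth grid, m (z = 0 is the surface)
ks = np.linspace(0.5, 10.0, 30)               # wavenumbers, rad/m

shifts = np.array([doppler_shift(U_true, k, z) for k in ks])
z_eff = -1.0 / (2.0 * ks)                     # effective depth per wavenumber
err = np.max(np.abs(shifts - U_true(z_eff)))  # naive inversion: U(z_eff) ~ shift
print(f"max inversion error for a linear profile: {err:.2e} m/s")
```

For a linear profile the effective-depth mapping is exact, which is why the residual here is only numerical quadrature error; for curved profiles it biases the result, motivating the less restrictive inversion methods the abstract describes.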
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
Improving the surface metrology accuracy of optical profilers by using multiple measurements
Xu, Xudong; Huang, Qiushi; Shen, Zhengxiang; Wang, Zhanshan
2016-10-01
The performance of high-resolution optical systems is affected by small-angle scattering at the mid-spatial-frequency irregularities of the optical surface. Characterizing these irregularities is, therefore, important. However, surface measurements obtained with optical profilers are influenced by additive white noise, as indicated by the heavy-tail effect observable on their power spectral density (PSD). A multiple-measurement method is used to reduce the effects of white noise by averaging individual measurements. The intensity of white noise is determined using a model based on the theoretical PSD of fractal surface measurements with additive white noise. The intensity of white noise decreases as the number of averaged measurements increases. Using multiple measurements also increases the highest observed spatial frequency; this increase is derived and calculated. Additionally, the accuracy obtained using multiple measurements is carefully studied, with analysis of both the residual reference error after calibration and the random errors appearing in the range of measured spatial frequencies. The resulting insights on the effects of white noise in optical profiler measurements, and the methods to mitigate them, may prove invaluable in improving the quality of surface metrology with optical profilers.
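The averaging argument can be made concrete with a toy simulation: white noise is uncorrelated between repeated scans, so averaging N scans leaves the surface signal unchanged while cutting the noise variance by a factor of N. The synthetic "surface" and noise level below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_scans, noise_sigma = 4096, 16, 1.0
surface = np.sin(2 * np.pi * 5 * np.arange(n_pix) / n_pix)   # toy surface profile

# Each scan = same surface + independent white noise
scans = surface + rng.normal(0.0, noise_sigma, size=(n_scans, n_pix))
single = scans[0]
averaged = scans.mean(axis=0)

var_single = np.var(single - surface)     # ~ sigma^2
var_avg = np.var(averaged - surface)      # ~ sigma^2 / n_scans
print(f"noise variance: single={var_single:.3f}, "
      f"avg of {n_scans}={var_avg:.3f} (expected ~{noise_sigma**2/n_scans:.3f})")
```

In PSD terms this is exactly the heavy-tail suppression the abstract describes: the flat white-noise floor drops by 1/N while the surface's fractal spectrum is preserved.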
International Nuclear Information System (INIS)
Evans, R.; Kumaravadivel, R.
1976-01-01
A simple scheme for determining the ion density profile and the surface tension of a liquid metal is described. Assuming that the interaction between metallic pseudo-ions is of the form introduced by Evans, an approximate expression for the excess free energy of the system is derived using the thermodynamic perturbation theory of Weeks, Chandler and Andersen. This excess free energy is then minimized with respect to a parameter which specifies the ion density profile, and the surface tension is given directly. From a consideration of the dependence of the interionic forces on the electron density it is predicted that the ions should take up a very steep density profile at the liquid metal surface. This behaviour is contrasted with that expected for rare-gas fluids, in which the interatomic forces are density-independent. The values of the surface tension calculated for liquid Na, K and Al from a simplified version of the theory are in reasonable agreement with experiment. (author)
Global Properties of M31's Stellar Halo from the SPLASH Survey. I. Surface Brightness Profile
Gilbert, Karoline M.; Guhathakurta, Puragra; Beaton, Rachael L.; Bullock, James; Geha, Marla C.; Kalirai, Jason S.; Kirby, Evan N.; Majewski, Steven R.; Ostheimer, James C.; Patterson, Richard J.; Tollerud, Erik J.; Tanaka, Mikito; Chiba, Masashi
2012-11-01
We present the surface brightness profile of M31's stellar halo out to a projected radius of 175 kpc. The surface brightness estimates are based on confirmed samples of M31 red giant branch stars derived from Keck/DEIMOS spectroscopic observations. A set of empirical spectroscopic and photometric M31 membership diagnostics is used to identify and reject foreground and background contaminants. This enables us to trace the stellar halo of M31 to larger projected distances and fainter surface brightnesses than previous photometric studies. The surface brightness profile of M31's halo follows a power law with index -2.2 ± 0.2 and extends to a projected distance of at least ~175 kpc (~2/3 of M31's virial radius), with no evidence of a downward break at large radii. The best-fit elliptical isophotes have b/a = 0.94 with the major axis of the halo aligned along the minor axis of M31's disk, consistent with a prolate halo, although the data are also consistent with M31's halo having spherical symmetry. The fact that tidal debris features are kinematically cold is used to identify substructure in the spectroscopic fields out to projected radii of 90 kpc and investigate the effect of this substructure on the surface brightness profile. The scatter in the surface brightness profile is reduced when kinematically identified tidal debris features in M31 are statistically subtracted; the remaining profile indicates that a comparatively diffuse stellar component to M31's stellar halo exists to large distances. Beyond 90 kpc, kinematically cold tidal debris features cannot be identified due to small number statistics; nevertheless, the significant field-to-field variation in surface brightness beyond 90 kpc suggests that the outermost region of M31's halo is also comprised to a significant degree of stars stripped from accreted objects. The data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California
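The power-law characterization used above can be illustrated with a hedged sketch: fit the index of Sigma(R) ∝ R^n by linear regression in log-log space. The synthetic profile (index -2.2, matching the reported value) and its lognormal scatter are illustrative stand-ins, not the survey data:

```python
import numpy as np

rng = np.random.default_rng(1)
R = np.logspace(1, np.log10(175), 30)          # projected radius, 10-175 kpc
true_index = -2.2
# Toy surface brightness profile with 0.1 dex field-to-field scatter
sigma = 1e3 * R ** true_index * rng.lognormal(0.0, 0.1, R.size)

# Power law is linear in log-log space: slope = index
slope, intercept = np.polyfit(np.log10(R), np.log10(sigma), 1)
print(f"fitted power-law index: {slope:.2f}")
```

Statistically subtracting kinematically identified substructure, as the paper does, amounts to shrinking the scatter term here, which tightens the recovered index.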
Chi, Sheng; Lee, Shu-Sheng; Huang, Jen, Jen-Yu; Lai, Ti-Yu; Jan, Chia-Ming; Hu, Po-Chi
2016-04-01
With the progress of optical technologies, various commercial 3D surface contour scanners have come on the market. Most of them are used for reconstructing the surface profile of molds or mechanical objects larger than 50 mm×50 mm×50 mm, with a scanning system size of about 300 mm×300 mm×100 mm. Few optical systems have been commercialized for fast surface-profile scanning of objects smaller than 10 mm×10 mm×10 mm. Therefore, a miniature optical system has been designed and developed in this research work for this purpose. Since the most common scanning method for such systems is line-scan technology, we have developed a pseudo-phase-shifting digital projection technique based on projected fringes and phase reconstruction. A projector was used to project digital fringe patterns on the object, and the fringe intensity images of the reference plane and of the sample object were recorded by a CMOS camera. The phase differences between the plane and the object can be calculated from the fringe images, and the surface profile of the object was reconstructed using these phase differences. The traditional phase-shifting method is accomplished by using a PZT actuator or a precisely controlled motor to adjust the light source or grating, which is one of the limitations for high-speed scanning. Compared with the traditional optical setup, we utilized a micro projector to project the digital fringe patterns on the sample. This reduced the phase-shifting processing time, and the controlled phase differences between the shifted phases became more precise. Besides, an optical path design based on a portable scanning device was used to minimize the size and reduce the number of system components. A screwdriver section of about 7 mm×5 mm×5 mm was scanned and its surface profile successfully restored. The experimental results showed that the measurement area of our system can be smaller than 10 mm×10 mm, the precision reached to
DEFF Research Database (Denmark)
Lu, Kaiyuan; Rasmussen, Peter Omand; Ritchie, Ewen
2008-01-01
Accurate knowledge of the high frequency inductance profile plays an important role in many designs of sensorless controllers for Surface Mounted Permanent Magnet (SMPM) synchronous motors. This paper presents an AC+DC measurement method for determination of the d-axis and q-axis high frequency inductance profiles of SMPM synchronous motors. This method uses DC currents to set a desired magnetic working point on the motor laminations, and then superimposes balanced small AC signals to measure the incremental inductance. A special algorithm is used to decouple the cross-coupling effects between the d-axis and the q-axis, which allows a separate determination of the d, q inductance profiles as functions of the d, q currents. Experimental results on a commercial SMPM motor using the proposed method are presented in this paper.
Deep Vs Profiling Along the Top of Yucca Mountain Using a Vibroseis Source and Surface Waves
International Nuclear Information System (INIS)
Stokoe, K.; Rosenblad, B.; Wong, I.; Bay, J.; Thomas, P.; Silva, W.
2004-01-01
Yucca Mountain, Nevada, was approved as the site for development of the geologic repository for high-level radioactive waste and spent nuclear fuel in the United States. The U.S. Department of Energy has been conducting studies to characterize the site and assess its future performance as a geologic repository. As part of these studies, a program of deep seismic profiling, to depths of 200 m, was conducted along the top of Yucca Mountain to evaluate the shear-wave velocity (Vs) structure of the repository block. The resulting Vs data were used as input into the development of ground motions for the preclosure seismic design of the repository and for postclosure performance assessment. The noninvasive spectral-analysis-of-surface-waves (SASW) method was employed in the deep profiling. Field measurements involved the use of a modified Vibroseis as the seismic source. The modifications allowed the Vibroseis to be controlled by a signal analyzer so that slow frequency sweeps could be performed while simultaneous narrow-band filtering was performed on the receiver outputs. This process optimized input energy from the source and signal analysis of the receiver outputs. Six deep Vs profiles and five intermediate-depth (about 100 m) profiles were performed along the top of Yucca Mountain over a distance of about 5 km. In addition, eleven shallower profiles (averaging about 45-m deep) were measured using a bulldozer source. The shallower profiles were used to augment the deeper profiles and to evaluate further the near-surface velocity structure. The Vs profiles exhibit a strong velocity gradient within 5 m of the surface, with the mean Vs value more than doubling. Below this depth, Vs gradually increases from a mean value of about 900 to 1000 m/s at a depth of 150 m. Between the depths of 150 and 210 m, Vs increases more rapidly to about 1350 m/s, but this trend is based on limited data. At depths less than 50 m, anisotropy in Vs was measured for surveys conducted
Połowniak, Piotr; Sobolak, Mariusz
2017-12-01
In this article, a mathematical description of tooth flank surface of the globoidal worm and worm wheel generated by the hourglass worm hob with straight tooth axial profile is presented. The kinematic system of globoidal worm gear is shown. The equation of globoid helix and tooth axial profile of worm is derived to determine worm tooth surface. Based on the equation of meshing the contact lines are obtained. The mathematical description of globoidal worm wheel tooth flank is performed on the basis of contact lines and generating the tooth side by the extreme cutting edge of worm hob. The presented mathematical model of tooth flank of TA worm and worm wheel can be used e.g. to analyse the contact pattern of the gear.
Cure, David; Weller, Thomas; Miranda, Felix A.
2011-01-01
In this paper, a comparison between Jerusalem Cross (JC) and Square Patch (SP) based Frequency Selective Surfaces (FSS) for low profile antenna applications is presented. The comparison is aimed at understanding the performance of low profile antennas backed by high impedance surfaces. In particular, an end loaded planar open sleeve dipole (ELPOSD) antenna is examined due to the various parameters within its configuration, offering significant design flexibility and a wide operating bandwidth. Measured data of the antennas demonstrate that increasing the number of unit cells improves the fractional bandwidth. The antenna bandwidth increased from 0.8% to 1.8% and from 0.8% to 2.7% for the JC and SP structures, respectively. The number of unit cells was increased from 48 to 80 for the JC-FSS and from 24 to 48 for the SP-FSS.
Lu, Feng; Matsushita, Yasuyuki; Sato, Imari; Okabe, Takahiro; Sato, Yoichi
2015-10-01
We propose an uncalibrated photometric stereo method that works with general and unknown isotropic reflectances. Our method uses a pixel intensity profile, which is a sequence of radiance intensities recorded at a pixel under unknown varying directional illumination. We show that for general isotropic materials and uniformly distributed light directions, the geodesic distance between intensity profiles is linearly related to the angular difference of their corresponding surface normals, and that the intensity distribution of the intensity profile reveals reflectance properties. Based on these observations, we develop two methods for surface normal estimation: one for a general setting that uses only the recorded intensity profiles, the other for the case where a BRDF database is available while the exact BRDF of the target scene is still unknown. Quantitative and qualitative evaluations are conducted using both synthetic and real-world scenes, which show state-of-the-art accuracy of less than 10 degrees without using reference data and 5 degrees with reference data for all 100 materials in the MERL database.
Simulations geometric structures of the stepped profile bearing surface of the piston
Directory of Open Access Journals (Sweden)
Wroblewski Emil
2017-01-01
Full Text Available The piston-pin-piston rings assembly is the node most responsible for the formation of mechanical losses. Reducing friction losses in the piston-cylinder group leads to an increase in the overall efficiency of the engine and thus to lower fuel consumption. One method of reducing the area covered by the oil film is to modify the bearing surface of the piston by adjusting its profile. In this paper the simulation results for a stepped-microgeometry piston bearing surface are presented.
Surface profiling of normally responding and nonreleasing basophils by flow cytometry
DEFF Research Database (Denmark)
Kistrup, Kasper; Poulsen, Lars Kærgaard; Jensen, Bettina Margrethe
a maximum release. Blood mononuclear cells were purified by density centrifugation and, using flow cytometry, basophils, defined as FcεRIα+CD3-CD14-CD19-CD56-, were analysed for surface expression of relevant markers. All samples were compensated and analysed in logicle display. All gates......c, C3aR, C5aR, CCR3, FPR1, ST2, CRTH2 on anti-IgE-responsive and nonreleasing basophils by flow cytometry, thereby generating a surface profile of the two phenotypes. Methods Fresh buffy coat blood (
Pollock, Samuel B; Hu, Amy; Mou, Yun; Martinko, Alexander J; Julien, Olivier; Hornsby, Michael; Ploder, Lynda; Adams, Jarrett J; Geng, Huimin; Müschen, Markus; Sidhu, Sachdev S; Moffat, Jason; Wells, James A
2018-03-13
Human cells express thousands of different surface proteins that can be used for cell classification, or to distinguish healthy and disease conditions. A method capable of profiling a substantial fraction of the surface proteome simultaneously and inexpensively would enable more accurate and complete classification of cell states. We present a highly multiplexed and quantitative surface proteomic method using genetically barcoded antibodies called phage-antibody next-generation sequencing (PhaNGS). Using 144 preselected antibodies displayed on filamentous phage (Fab-phage) against 44 receptor targets, we assess changes in B cell surface proteins after the development of drug resistance in a patient with acute lymphoblastic leukemia (ALL) and in adaptation to oncogene expression in a Myc-inducible Burkitt lymphoma model. We further show PhaNGS can be applied at the single-cell level. Our results reveal that a common set of proteins including FLT3, NCR3LG1, and ROR1 dominate the response to similar oncogenic perturbations in B cells. Linking high-affinity, selective, genetically encoded binders to NGS enables direct and highly multiplexed protein detection, comparable to RNA-sequencing for mRNA. PhaNGS has the potential to profile a substantial fraction of the surface proteome simultaneously and inexpensively to enable more accurate and complete classification of cell states. Copyright © 2018 the Author(s). Published by PNAS.
Grinstead, Charles M; Snell, J Laurie
2011-01-01
This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.
Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V
1997-01-01
This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.
Energy Technology Data Exchange (ETDEWEB)
Aguiar, Lais A.; Melo, P.F. Frutuoso e [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear]. E-mail: lais@con.ufrj.br; frutuoso@con.ufrj.br; Passos, Erivaldo; Alves, Antonio Sergio [ELETRONUCLEAR, Rio de Janeiro, RJ (Brazil). Div. de Seguranca Nuclear]. E-mail: epassos@eletronuclear.gov.br; asergi@eletronuclear.gov.br
2005-07-01
The safety analysis of a near surface repository for medium- and low-activity wastes leads to investigating accident scenarios related to water infiltration phenomena. The probability of radionuclide release through the infiltration water can be estimated with the aid of suitable probabilistic models. For the analysis, the repository system is divided into two subsystems: the first comprising the barriers against water infiltration (backfill material and container), and the second comprising the barriers against the leaching of radionuclides to the biosphere (solid matrix and geosphere). The repository system is supposed to have its components (barriers) working in an active parallel mode. The probability of system failure is obtained from the logical structure of a fault tree. The study was based on the Probabilistic Safety Assessment (PSA) technique for the most significant radionuclides within the low- and medium-activity radioactive packages, and thus the probability of failure of the system for each radionuclide during the period of institutional control was obtained. (author)
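The tree logic described above — redundant barriers in active parallel, so a release path fails only if every barrier in it fails — reduces to AND gates over barrier failure probabilities. The sketch below shows that structure; the numeric probabilities are purely illustrative, not values from the study:

```python
def and_gate(probs):
    """Redundant barriers in active parallel: all inputs must fail."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Subsystem 1: barriers against water infiltration (backfill, container)
p_infiltration = and_gate([0.05, 0.02])     # -> 1.0e-3 (illustrative)
# Subsystem 2: barriers against leaching (solid matrix, geosphere)
p_leach = and_gate([0.10, 0.01])            # -> 1.0e-3 (illustrative)
# Release requires water to get in AND the radionuclide to get out
p_release = and_gate([p_infiltration, p_leach])
print(f"P(release) = {p_release:.1e}")
```

In a full PSA the gate probabilities would be time-dependent and evaluated per radionuclide over the institutional control period, but the combinatorial structure is the same.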
An Optimal Estimation Method to Obtain Surface Layer Turbulent Fluxes from Profile Measurements
Kang, D.
2015-12-01
In the absence of direct turbulence measurements, the turbulence characteristics of the atmospheric surface layer are often derived from measurements of the surface layer mean properties based on Monin-Obukhov Similarity Theory (MOST). This approach requires two levels of the ensemble mean wind, temperature, and water vapor, from which the fluxes of momentum, sensible heat, and water vapor can be obtained. When only one measurement level is available, the roughness heights and the assumed properties of the corresponding variables at the respective roughness heights are used. In practice, the temporal mean over a large number of samples is used in place of the ensemble mean. However, in many situations the samples of data are taken from multiple levels. It is thus desirable to derive the boundary layer flux properties using all measurements. In this study, we used an optimal estimation approach to derive surface layer properties based on all available measurements. This approach assumes that the samples are taken from a population whose ensemble mean profile follows the MOST. An optimized estimate is obtained when the results yield a minimum cost function, defined as a weighted summation of the error variance at each sample altitude. The weights are based on sample data variance and the altitude of the measurements. This method was applied to measurements in the marine atmospheric surface layer from a small boat using a radiosonde on a tethered balloon, where temperature and relative humidity profiles in the lowest 50 m were made repeatedly in about 30 minutes. We will present the resultant fluxes and the derived MOST mean profiles using different sets of measurements. The advantage of this method over the 'traditional' methods will be illustrated. Some limitations of this optimization method will also be discussed. Its application to quantifying the effects of the marine surface layer environment on radar and communication signal propagation will be shown as well.
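A minimal sketch of the fitting idea, restricted to the neutral log-law case (stability corrections omitted for brevity): noisy wind samples at several heights are fit to u(z) = (u*/kappa) ln(z/z0) by minimizing a variance-weighted sum of squared errors, which is the cost function the abstract describes. The heights, noise levels, u* and z0 below are assumptions:

```python
import numpy as np

KAPPA = 0.4
rng = np.random.default_rng(2)

u_star_true, z0 = 0.3, 1e-4                    # friction velocity (m/s), roughness (m)
z = np.array([2.0, 5.0, 10.0, 20.0, 50.0])     # measurement heights, m
noise = np.array([0.05, 0.05, 0.1, 0.1, 0.2])  # per-level sample std, m/s

# Synthetic multi-level samples around the MOST (log-law) mean profile
u_obs = (u_star_true / KAPPA) * np.log(z / z0) + rng.normal(0.0, noise)

# With z0 known the model is linear in u*, so the weighted least-squares
# minimum of sum_i w_i (u_obs_i - u* x_i)^2 has a closed form, w_i = 1/var_i
x = np.log(z / z0) / KAPPA
w = 1.0 / noise**2
u_star_est = np.sum(w * x * u_obs) / np.sum(w * x**2)
print(f"estimated u* = {u_star_est:.3f} m/s (true {u_star_true})")
```

The momentum flux then follows from the fitted friction velocity (tau = rho u*^2); including stability makes the problem nonlinear but leaves the weighted-cost structure unchanged.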
Adaptive Sampling based 3D Profile Measuring Method for Free-Form Surface
Duan, Xianyin; Zou, Yu; Gao, Qiang; Peng, Fangyu; Zhou, Min; Jiang, Guozhang
2018-03-01
In order to solve the problem of adaptability and scanning efficiency of current surface profile detection devices, a high-precision and high-efficiency detection approach based on self-adaptability is proposed for the surface contour of free-form surface parts. The contact mechanical probe and the non-contact laser probe are synthetically integrated according to the sampling approach of adaptive front-end path detection. First, the front-end path is measured by the non-contact laser probe, and the detection path is planned by the internal algorithm of the measuring instrument. Then a reasonable measurement sampling is completed along the planned path by the contact mechanical probe. The detection approach can effectively improve the measurement efficiency for free-form surface contours and can simultaneously detect the surface contours of unknown free-form surfaces with different curvatures and even different rates of curvature. The detection approach proposed in this paper also has important reference value for free-form surface contour detection.
Surface profile measurement by using the integrated Linnik WLSI and confocal microscope system
Wang, Wei-Chung; Shen, Ming-Hsing; Hwang, Chi-Hung; Yu, Yun-Ting; Wang, Tzu-Fong
2017-06-01
The white-light scanning interferometer (WLSI) and the confocal microscope (CM) are the two major optical inspection systems for measuring the three-dimensional (3D) surface profile (SP) of micro specimens. Nevertheless, in practical applications, WLSI is more suitable for measuring smooth and low-slope surfaces, while CM is more suitable for measuring unevenly reflective and low-reflective surfaces. Their typical applications also differ: WLSI is generally used in the semiconductor industry while CM is more popular in the printed circuit board industry. In this paper, a self-assembled multi-function optical system was integrated to perform both Linnik white-light scanning interferometry (Linnik WLSI) and confocal microscopy. A connecting part composed of tubes, lenses and an interferometer was used to join the finite and infinite conjugate optical systems for Linnik WLSI and CM. By exploiting the flexibility of the tubes and lenses, switching between the two optical measurements can be achieved easily. Furthermore, based on the shape-from-focus method with an energy-of-Laplacian filter, the CM was developed to enhance the in-focus information of each pixel so that it can provide an all-in-focus image for performing 3D SP measurement and analysis simultaneously. For Linnik WLSI, an eleven-step phase-shifting algorithm was used to analyze the vertical scanning signals and determine the 3D SP.
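The phase-shifting analysis mentioned above can be illustrated with the standard four-step variant (the paper itself uses an eleven-step algorithm): fringe intensities I_k = A + B cos(phi + k·pi/2) at each pixel are combined to recover the wrapped phase. The synthetic phase map, background and modulation are assumptions for the demo:

```python
import numpy as np

x = np.linspace(0, 1, 256)
phi_true = 2.0 * np.sin(2 * np.pi * x)         # toy phase map, radians
A, B = 1.0, 0.5                                # background, fringe modulation

# Four fringe frames, each shifted by pi/2
frames = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
I0, I1, I2, I3 = frames

# Four-step formula: I3 - I1 = 2B sin(phi), I0 - I2 = 2B cos(phi)
phi_wrapped = np.arctan2(I3 - I1, I0 - I2)

# Compare modulo 2*pi (the recovered phase is wrapped)
err = np.max(np.abs(np.angle(np.exp(1j * (phi_wrapped - phi_true)))))
print(f"max wrapped-phase error: {err:.2e} rad")
```

Algorithms with more steps, such as the eleven-step one used in the paper, trade extra frames for reduced sensitivity to phase-shift miscalibration and detector nonlinearity; the arctangent structure is the same.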
International Nuclear Information System (INIS)
2004-01-01
Profiles is a synthetic overview of more than 100 national energy markets in the world, providing insightful facts and key energy statistics. A Profile is structured around 6 main items and completed by key statistics: ministries, public agencies and energy policy concerned; main companies in the oil, gas, electricity and coal sectors, their status and shareholders; reserves, production, imports and exports, electricity and refining capacities; deregulation of prices, subsidies, taxes; consumption trends by sector, energy market shares; main energy projects, production and consumption prospects. Statistical Profiles present, in about 3 pages, the main data and indicators on oil, gas, coal and electricity. (A.L.B.)
Directory of Open Access Journals (Sweden)
Stephen R Griffiths
Full Text Available The thermal properties of tree hollows play a major role in the survival and reproduction of hollow-dependent fauna. Artificial hollows (nest boxes) are increasingly being used to supplement the loss of natural hollows; however, the factors that drive nest box thermal profiles have received surprisingly little attention. We investigated how differences in surface reflectance influenced the temperature profiles of nest boxes painted three different colors (dark-green, light-green, and white: total solar reflectance 5.9%, 64.4%, and 90.3%, respectively), using boxes designed for three groups of mammals: insectivorous bats, marsupial gliders, and brushtail possums. Across the three box designs, dark-green (low reflectance) boxes experienced the highest average and maximum daytime temperatures, had the greatest magnitude of variation in daytime temperatures within the box, and were consistently substantially warmer than light-green (medium reflectance) boxes, white (high reflectance) boxes, and ambient air temperatures. Results from biophysical model simulations demonstrated that the variation in diurnal temperature profiles generated by painting boxes either high- or low-reflectance colors could have significant ecophysiological consequences for animals occupying the boxes, with animals in dark-green boxes at high risk of acute heat stress and dehydration during extreme heat events. Conversely, in cold weather our modelling indicated higher cumulative energy costs for mammals, particularly smaller animals, occupying light-green boxes. Given their widespread use as a conservation tool, we suggest that before boxes are installed, consideration should be given to the effect of color on nest box temperature profiles, and to the resultant thermal suitability of boxes for wildlife, particularly during weather extremes. Managers of nest box programs should consider using several different colors and installing boxes across a range of both orientations and
Line printing solution-processable small molecules with uniform surface profile via ink-jet printer.
Liu, Huimin; Xu, Wei; Tan, Wanyi; Zhu, Xuhui; Wang, Jian; Peng, Junbiao; Cao, Yong
2016-03-01
Line printing offers a feasible approach to eliminating the pixel-well structure widely used to confine ink-jet printed solution. In this study, a uniform line is printed by an ink-jet printer. To achieve a uniform surface profile of the printed line, 10 vol% of the low-volatility solvent DMA (3,4-dimethylanisole) is mixed with the high-volatility solvent Pxy (p-xylene). After a solution-processable small molecule is dissolved, the surface tension of the DMA solution becomes lower than that of the Pxy solution, which creates an inward Marangoni flow during solvent evaporation. The inward Marangoni flow balances the outward capillary flow, thereby forming a flat film surface. The width of the printed line depends on the contact angle of the solution on the hole injection layer.
Radial Surface Density Profiles of Gas and Dust in the Debris Disk around 49 Ceti
Energy Technology Data Exchange (ETDEWEB)
Hughes, A. Meredith; Lieman-Sifry, Jesse; Flaherty, Kevin M.; Daley, Cail M. [Department of Astronomy, Van Vleck Observatory, Wesleyan University, 96 Foss Hill Drive, Middletown, CT 06459 (United States); Roberge, Aki [Exoplanets and Stellar Astrophysics Laboratory, NASA Goddard Space Flight Center, Code 667, Greenbelt, MD 20771 (United States); Kóspál, Ágnes; Moór, Attila; Ábrahám, Peter [Konkoly Observatory, Research Centre for Astronomy and Earth Sciences, Hungarian Academy of Sciences, P.O. Box 67, 1525 Budapest (Hungary); Kamp, Inga [Kapteyn Astronomical Institute, University of Groningen, Postbus 800, 9700 AV Groningen (Netherlands); Wilner, David J.; Andrews, Sean M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, MS-51, Cambridge, MA 02138 (United States); Kastner, Joel H., E-mail: amhughes@astro.wesleyan.edu [Rochester Institute of Technology, 54 Lomb Memorial Drive, Rochester, NY 14623 (United States)
2017-04-20
We present ∼0.″4 resolution images of CO(3–2) and associated continuum emission from the gas-bearing debris disk around the nearby A star 49 Ceti, observed with the Atacama Large Millimeter/Submillimeter Array (ALMA). We analyze the ALMA visibilities in tandem with the broadband spectral energy distribution to measure the radial surface density profiles of dust and gas emission from the system. The dust surface density decreases with radius between ∼100 and 310 au, with a marginally significant enhancement of surface density at a radius of ∼110 au. The SED requires an inner disk of small grains in addition to the outer disk of larger grains resolved by ALMA. The gas disk exhibits a surface density profile that increases with radius, contrary to most previous spatially resolved observations of circumstellar gas disks. While ∼80% of the CO flux is well described by an axisymmetric power-law disk in Keplerian rotation about the central star, residuals at ∼20% of the peak flux exhibit a departure from axisymmetry suggestive of spiral arms or a warp in the gas disk. The radial extent of the gas disk (∼220 au) is smaller than that of the dust disk (∼300 au), consistent with recent observations of other gas-bearing debris disks. While there are so far only three broad debris disks with well characterized radial dust profiles at millimeter wavelengths, 49 Ceti’s disk shows a markedly different structure from two radially resolved gas-poor debris disks, implying that the physical processes generating and sculpting the gas and dust are fundamentally different.
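A truncated power-law surface density of the kind fitted here can be sketched as follows; the 100-310 au range is taken from the abstract, while Σ0, the exponent p and everything else are illustrative free parameters (p < 0 reproduces the decreasing dust profile, p > 0 the increasing gas profile):

```python
import numpy as np

def surface_density(r_au, sigma0, p, r0=100.0, r_in=100.0, r_out=310.0):
    """Truncated power law: Sigma(r) = sigma0 * (r/r0)**p inside [r_in, r_out]."""
    r = np.asarray(r_au, dtype=float)
    sigma = sigma0 * (r / r0) ** p
    return np.where((r >= r_in) & (r <= r_out), sigma, 0.0)

def total_mass(sigma0, p, r0=100.0, r_in=100.0, r_out=310.0, n=10000):
    """Integrate 2*pi*r*Sigma(r) over the disk (result in sigma0 * au**2 units)."""
    r = np.linspace(r_in, r_out, n)
    integrand = 2.0 * np.pi * r * surface_density(r, sigma0, p, r0, r_in, r_out)
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * (r[1] - r[0])))
```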
Lynch, Gillian C.; Halvick, Philippe; Zhao, Meishan; Truhlar, Donald G.; Yu, Chin-Hui; Kouri, Donald J.; Schwenke, David W.
1991-01-01
Accurate three-dimensional quantum mechanical reaction probabilities are presented for the reaction F + H2 → HF + H on the new global potential energy surface 5SEC for total angular momentum J = 0 over a range of translational energies from 0.15 to 4.6 kcal/mol. It is found that the v′ = 3 HF vibrational product state has a threshold as low as that for v′ = 2.
On the extension of the wind profile over homogeneous terrain beyond the surface boundary layer
DEFF Research Database (Denmark)
Gryning, Sven-Erik; Batchvarova, Ekaterina; Brümmer, B.
2007-01-01
In the surface layer the wind profile follows Monin-Obukhov similarity. Above the surface layer the second length scale (L_MBL) becomes independent of height but not of stability, and at the top of the boundary layer the third length scale is assumed to be negligible. A simple model for the combined length scale that controls the wind profile and its stability dependence is formulated by inverse summation. Based on these assumptions the wind profile for the entire boundary layer is derived. A parameterization of L_MBL is formulated using the geostrophic drag law, which relates friction velocity and geostrophic wind. The empirical parameterization of the resistance-law functions A and B in the geostrophic drag law is uncertain, making it impractical; therefore an expression for the length scale L_MBL for applied use is suggested, based on measurements from the two sites.
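The inverse-summation idea can be sketched numerically for the neutral case; this is a simplified illustration, not the authors' full stability-dependent parameterization (the surface-layer scale is taken as z and the top scale as z_i − z, both assumptions):

```python
import numpy as np

KAPPA = 0.4  # von Karman constant

def combined_length_scale(z, l_mbl, zi):
    """Inverse summation of three length scales (neutral sketch):
    a surface-layer scale ~ z, a mid-boundary-layer scale l_mbl,
    and a scale (zi - z) that vanishes at the boundary-layer top."""
    return 1.0 / (1.0 / z + 1.0 / l_mbl + 1.0 / (zi - z))

def wind_profile(z_levels, u_star, l_mbl, zi, z0=0.05):
    """Integrate du/dz = u_star / (kappa * l(z)) upward from roughness length z0."""
    z = np.geomspace(z0, max(z_levels), 5000)
    dudz = u_star / (KAPPA * combined_length_scale(z, l_mbl, zi))
    u = np.concatenate(([0.0],
                        np.cumsum(0.5 * (dudz[1:] + dudz[:-1]) * np.diff(z))))
    return np.interp(z_levels, z, u)
```

With l_mbl and zi large the profile collapses to the logarithmic surface-layer law u(z) = (u*/κ) ln(z/z0), as it should.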
Modelling of composition and stress profiles in low temperature surface engineered stainless steel
DEFF Research Database (Denmark)
Jespersen, Freja Nygaard; Hattel, Jesper Henri; Somers, Marcel A. J.
2015-01-01
Modelling of the composition and stress profiles developing during low temperature surface engineering from the processing parameters temperature, time and gas composition is a prerequisite for targeted process optimization. A realistic model to simulate the developing case has to take the following influences on composition and stress into account: a concentration-dependent diffusion coefficient; trapping of nitrogen by chromium atoms; the effect of residual stress on diffusive flux; the effect of residual stress on the solubility of interstitials; and plastic accommodation of residual stress. Compressive stresses are introduced in the developing case, arising from the volume expansion that accompanies the dissolution of high interstitial contents in expanded austenite. The effect of all these contributions on composition and stress profiles will be addressed.
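A minimal sketch of the first two model ingredients, a concentration-dependent diffusivity and nitrogen trapping by chromium, using an explicit finite-difference scheme (all coefficients are illustrative placeholders, the exponential D(c) form is an assumption, and the stress terms are omitted):

```python
import numpy as np

def nitride_profile(n_steps=20000, nx=100, depth=10e-6, c_surf=1.0,
                    d0=1e-14, alpha=3.0, trap_cap=0.17):
    """Explicit 1D diffusion of nitrogen with a concentration-dependent
    diffusivity D(c) = d0 * exp(alpha * c) and irreversible trapping by
    chromium (capacity trap_cap per site). Stress effects are omitted."""
    dx = depth / nx
    c = np.zeros(nx)       # mobile (dissolved) nitrogen, normalized
    trap = np.zeros(nx)    # nitrogen held in Cr traps
    c[0] = c_surf          # fixed surface concentration (gas equilibrium)
    dt = 0.2 * dx ** 2 / (d0 * np.exp(alpha * c_surf))  # explicit stability limit
    for _ in range(n_steps):
        d = d0 * np.exp(alpha * c)
        d_half = 0.5 * (d[1:] + d[:-1])          # diffusivity at cell faces
        flux = -d_half * np.diff(c) / dx
        c[1:-1] -= dt * np.diff(flux) / dx
        fill = np.clip(np.minimum(c, trap_cap - trap), 0.0, None)
        trap += fill                             # traps fill before transport
        c -= fill
        c[0] = c_surf
    return c + trap                              # total nitrogen profile
```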
Directory of Open Access Journals (Sweden)
Razumov M.
2017-03-01
Full Text Available This article describes a machining technology for polyhedral surfaces with a varying profile, realized by the planetary motion of multi-blade block tools. The features of the technology and the urgency of the problem are indicated. The purpose of the study is to determine the minimum value of the clearance (rear) angle of the tool, and to examine how the rake (front) and clearance (rear) angles change during the formation of a polygonal surface using a planetary gear train. A scheme is provided for calculating the influence of various factors on the minimum clearance angle and on the kinematic rake and clearance angles, together with a mathematical formula for the minimum clearance angle and formulas for determining the rake and clearance angles during cutting. This study can be used in the design of operations forming external multifaceted surfaces with a variable profile using a planetary gear train.
Surface damage and gas trapping profile measurements in copper during 20 keV He+ irradiation
International Nuclear Information System (INIS)
Terreault, B.; Veilleux, G.
1980-01-01
Surface damage due to 20 keV He+ irradiation of OFHC Cu was studied by optical and scanning electron microscopy, and by gas trapping profile measurements with proton backscattering and elastic recoil detection. Both annealed (1 h at 773 K) and unannealed Cu were implanted, at 300 K (0.22 T_m) and 500 K (0.37 T_m), up to fluences of 3 × 10^18 cm^-2. Additional results with thin (1 μm) evaporated films and stressed cold-rolled foils (3 μm) were obtained. At 500 K in bulk OFHC Cu, pores and/or large (approx. 1 μm) but scattered blisters appear; at 300 K in bulk or thin-film Cu, blisters are large and abundant. In all these cases a very large (≥1.5 × 10^17 cm^-2) and sudden release of deeply implanted helium takes place, leading to a depleted profile at a depth of about 90 nm (approx. R_p). In contrast, in cold-rolled foils the blisters are small (approx. 0.4 μm) and the profiles are undepleted. These results are explained by fissuration of helium-pressurized cavities. At high fluence the blisters disappear, leaving a porous structure at 500 K and a rough micro-relief at 300 K; the helium profiles are flat and very wide (2-3 R_p). Blister disappearance, the absence of flaking, and the porous structure are discussed in terms of the width of the profiles and the formation of a helium-saturated, highly damaged (recrystallized), and permeable layer. (orig.)
Mobile depth profiling and sub-surface imaging techniques for historical paintings—A review
International Nuclear Information System (INIS)
Alfeld, Matthias; Broekaert, José A.C.
2013-01-01
Hidden, sub-surface paint layers and features contain valuable information for the art-historical investigation of a painting's past and for its conservation for coming generations. The number of techniques available for the study of these features has been extended considerably in recent decades, and established techniques have been refined. This review focuses on mobile non-destructive subsurface imaging and depth profiling techniques, which allow the in-situ investigation of easel paintings, i.e. paintings on a portable support. Among the techniques discussed are X-ray radiography and infrared reflectography, long-established methods that have been in use for several decades; their capabilities for element/species-specific imaging have been extended by the introduction of energy/wavelength-resolved measurements. Scanning macro-X-ray fluorescence analysis made it possible for the first time to acquire elemental distribution images in-situ, and optical coherence tomography allows the non-destructive study of surface paint layers in virtual cross-sections. These techniques and their variants are presented next to other techniques, such as Terahertz imaging, Nuclear Magnetic Resonance depth profiling, and established non-destructive testing techniques (thermography, ultrasonic imaging and laser-based interference methods) applied in the conservation of historical paintings. Alongside selected case studies, the capabilities and limitations of the techniques are discussed. - Highlights: • All mobile sub-surface and depth-profiling techniques for paintings are reviewed. • The number of techniques available has increased considerably in the last years. • X-ray radiography and infrared reflectography are still the most used techniques. • Scanning macro-XRF and optical coherence tomography are becoming established. • Industrial non-destructive testing techniques support the preservation of paintings.
Rayleigh beacon for measuring the surface profile of a radio telescope.
Padin, S
2014-12-01
Millimeter-wavelength Rayleigh scattering from water droplets in a cloud is proposed as a means of generating a bright beacon for measuring the surface profile of a radio telescope. A λ=3 mm transmitter with an output power of a few watts, illuminating a stratiform cloud, can generate a beacon with the same flux as Mars in a 10 GHz bandwidth; because the beacon has a narrow line width, it is extremely bright. The key advantage of the beacon is that it can be used at any time and positioned anywhere in the sky, as long as there are clouds.
International Nuclear Information System (INIS)
Requardt, H.; Deimling, M.; Weber, H.
1986-01-01
Sagittal and axial images obtained using a surface coil suffer from an extreme intensity profile caused by the physical properties of the coil and the anatomic distribution of subcutaneous fat. The authors present a measuring device that reduces these disadvantages by means of Helmholtz-type coils, together with pulse sequences that suppress the fat signal by dephasing its signal contribution. The extremely short repetition time (<30 msec) allows acquisition times shorter than 10 sec, so breath-holding for this short period to avoid movement artifacts is possible. Images are presented that illustrate the enhanced contrast of spinal tissue and surrounding structures. Comparisons are made with spin-echo and CHESS images.
Directory of Open Access Journals (Sweden)
I.I. Ahmed
2018-04-01
Full Text Available The development of residual stresses during fabrication is inevitable and often neglected, with dire consequences during the service life of the fabricated components. In this work, the surface residual stress profile produced by welding of a martensitic stainless steel (MSS) pipe was investigated with the X-ray diffraction technique. The results revealed residual stresses equilibrated across the weldment zones: the tensile residual stress observed in the weld metal was balanced by compressive residual stresses in the parent material on the opposing sides of the weld metal. Keywords: Residual stress, Weld, Stainless steel, X-ray, HAZ
Surface waves at the interface with an antisymmetric gain/loss profile
International Nuclear Information System (INIS)
Ctyroky, Jiri; Kuzmiak, Vladimir; Eyderman, Sergey
2010-01-01
We studied the properties of strongly guiding two-mode waveguides with an antisymmetric gain/loss profile, which constitute photonic analogues of quantum mechanical structures with parity-time symmetry breaking. For both TE and TM polarizations, the dependence of the effective indices of the guided modes on the gain/loss coefficient exhibits a degenerate critical point that separates two regimes with profoundly different behavior. In addition, we have shown that the interface between the two media supports the propagation of a strongly confined, non-attenuated TM-polarized surface wave. We examined the properties of the surface wave with both the modal and FDTD methods and discuss the differences between the two sets of results as the material and geometrical parameters are varied.
Exploring the Plant–Microbe Interface by Profiling the Surface-Associated Proteins of Barley Grains
DEFF Research Database (Denmark)
Sultan, Abida; Andersen, Birgit; Svensson, Birte
2016-01-01
Cereal grains are colonized by a microbial community that actively interacts with the plant via secretion of various enzymes, hormones, and metabolites. Microorganisms decompose plant tissues by a collection of depolymerizing enzymes, including β-1,4-xylanases, that are in turn inhibited by plant xylanase inhibitors. To gain insight into the importance of the microbial consortia and their interaction with barley grains, we used a combined gel-based (2-DE coupled to MALDI-TOF-TOF MS) and gel-free (LC-MS/MS) proteomics approach complemented with enzyme activity assays to profile the surface-associated proteins and xylanolytic activities of two barley cultivars. The surface-associated proteome was dominated by plant proteins with roles in defense and stress responses, while the relatively less abundant microbial (bacterial and fungal) proteins were involved in cell-wall and polysaccharide degradation...
International Nuclear Information System (INIS)
Boaga, J; Vignoli, G; Cassiani, G
2011-01-01
Inversion is a critical step in all geophysical techniques, and is generally fraught with ill-posedness. In the case of seismic surface wave studies, the inverse problem can lead to different equivalent subsoil models and consequently to different local seismic response analyses. This can have a large impact on earthquake engineering design. In this paper, we discuss the consequences of the non-uniqueness of surface wave inversion on seismic responses, with both numerical and experimental data. Our goal is to evaluate the consequences for common seismic response analysis under different impedance contrast conditions. We verify the implications of inversion uncertainty, and consequently of data information content, for realistic local site responses. A stochastic process is used to generate a set of 1D shear wave velocity profiles from several specific subsurface models. All these profiles are equivalent, i.e. their responses, in terms of a dispersion curve, are compatible with the uncertainty in the same surface wave data. The generated 1D shear velocity models are then subjected to a conventional one-dimensional seismic ground response analysis using a realistic input motion. While recent analyses claim that the consequences of surface wave inversion uncertainties are very limited, our tests point out that a relationship exists between inversion confidence and seismic responses in different subsoils. In the case of a regular and relatively smooth increase of shear wave velocity with depth, as is usual in sedimentary plains, our results show that the choice of a specific model among equivalent solutions strongly influences the seismic response. On the other hand, when the shallow subsoil is characterized by a strong impedance contrast (thus revealing a characteristic soil resonance period), as is common in the presence of a shallow bedrock, equivalent solutions provide practically the same seismic amplification, especially in the
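The role of impedance contrast in pinning the site response can be illustrated with the textbook undamped layer-over-halfspace transfer function, a simplified stand-in for the full 1D ground response analysis used in the paper (the densities and velocities below are illustrative, not the paper's models):

```python
import numpy as np

def sh_amplification(freq_hz, h, vs_layer, vs_rock,
                     rho_layer=1800.0, rho_rock=2400.0):
    """Undamped SH-wave amplification of a single soft layer over bedrock:
    |A(f)| = 1 / sqrt(cos^2(kH) + alpha^2 * sin^2(kH)),
    with impedance ratio alpha = (rho1*Vs1)/(rho2*Vs2) and k = 2*pi*f/Vs1.
    The fundamental resonance sits at f0 = Vs1/(4H) with peak value 1/alpha."""
    alpha = (rho_layer * vs_layer) / (rho_rock * vs_rock)
    kh = 2.0 * np.pi * np.asarray(freq_hz) * h / vs_layer
    return 1.0 / np.sqrt(np.cos(kh) ** 2 + alpha ** 2 * np.sin(kh) ** 2)
```

For a 30 m layer with Vs = 200 m/s over 800 m/s bedrock the resonance is at f0 = 200/(4·30) ≈ 1.7 Hz; a strong contrast fixes this peak largely independently of the details of the layer model, consistent with the abstract's observation that equivalent inverted profiles then yield practically the same amplification.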
International Nuclear Information System (INIS)
Daley, T.M.; Majer, E.L.; Karageorgi, E.
1994-08-01
This report presents results from surface and borehole seismic profiling performed by the Lawrence Berkeley Laboratory (LBL) on Yucca Mountain. This work was performed as part of the site characterization effort for the potential high-level nuclear waste repository. The objective was to provide seismic imaging from the near surface (200 to 300 ft depth) to the repository horizon and below, if possible. Among the issues addressed by this seismic imaging work are the location and depth of fracturing and faulting, the geologic identification of reflecting horizons, and the spatial continuity of reflecting horizons. The authors believe their results are generally positive, with some specific successes. This was the first attempt at this scale to use modern seismic imaging techniques to determine geologic features on Yucca Mountain. The principal purpose of this report is to present the interpretation of the seismic reflection section in a geologic context. Three surface reflection profiles were acquired and processed as part of this study. Because of environmental concerns, all three lines were on preexisting roads. Line 1 crossed the mapped surface trace of the Ghost Dance fault and was intended to study the dip and depth extent of the fault system. Line 2 was acquired along Drill Hole Wash and was intended to support the ESF north-ramp design activities. Line 3 was acquired along Yucca Crest and was designed to image geologic horizons thought to be less faulted along the ridge. Unfortunately, line 3 proved to have poor data quality, in part because of winds, poor field conditions and limited time. The processing and interpretation efforts were therefore focused on lines 1 and 2 and their associated VSP studies.
Sheridan P. J.; Andrews, E.; Ogren, J A.; Tackett, J. L.; Winker, D. M.
2012-01-01
Between June 2006 and September 2009, an instrumented light aircraft measured over 400 vertical profiles of aerosol and trace gas properties over eastern and central Illinois. The primary objectives of this program were to (1) measure the in situ aerosol properties and determine their vertical and temporal variability and (2) relate these aircraft measurements to concurrent surface and satellite measurements. Underflights of the CALIPSO satellite show reasonable agreement in a majority of retrieved profiles between aircraft-measured extinction at 532 nm (adjusted to ambient relative humidity) and CALIPSO-retrieved extinction, and suggest that routine aircraft profiling programs can be used to better understand and validate satellite retrieval algorithms. CALIPSO tended to overestimate the aerosol extinction at this location in some boundary layer flight segments when scattered or broken clouds were present, which could be related to problems with CALIPSO cloud screening methods. The in situ aircraft-collected aerosol data suggest extinction thresholds for the likelihood of aerosol layers being detected by the CALIOP lidar. These statistical data offer guidance as to the likelihood of CALIPSO's ability to retrieve aerosol extinction at various locations around the globe.
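Adjusting dry nephelometer extinction to ambient relative humidity, as mentioned above, is commonly done with a single-parameter hygroscopic growth curve; a hedged sketch follows (the γ value and the reference RH are illustrative, not the values used in this study):

```python
def f_rh(rh_percent, gamma=0.5):
    """Single-parameter growth curve: f(RH) = (1 - RH/100)**(-gamma)."""
    return (1.0 - rh_percent / 100.0) ** (-gamma)

def ambient_extinction(sigma_dry, rh_ambient, rh_reference=40.0, gamma=0.5):
    """Scale an extinction measured at a dry reference RH up to ambient RH,
    for comparison with humidity-sensitive lidar retrievals such as CALIOP's."""
    return sigma_dry * f_rh(rh_ambient, gamma) / f_rh(rh_reference, gamma)
```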
Difference in Dwarf Galaxy Surface Brightness Profiles as a Function of Environment
Lee, Youngdae; Park, Hong Soo; Kim, Sang Chul; Moon, Dae-Sik; Lee, Jae-Joon; Kim, Dong-Jin; Cha, Sang-Mok
2018-05-01
We investigate surface brightness profiles (SBPs) of dwarf galaxies in field, group, and cluster environments. With deep BV I images from the Korea Microlensing Telescope Network Supernova Program, SBPs of 38 dwarfs in the NGC 2784 group are fitted by a single-exponential or double-exponential model. We find that 53% of the dwarfs are fitted with single-exponential profiles (“Type I”), while 47% show double-exponential profiles: 37% of all dwarfs have a smaller scale size for the outer part than for the inner part (“Type II”), while 10% have a larger outer than inner part (“Type III”). We compare these results with those in the field and in the Virgo cluster, where the SBP types of 102 field dwarfs are compiled from a previous study and the SBP types of 375 cluster dwarfs are measured using SDSS r-band images. As a result, the distributions of SBP types differ among the three environments: the common SBP types are Type II in the field, Types I and II in the NGC 2784 group, and Types I and III in the Virgo cluster. After comparing the sizes of dwarfs in different environments, we suggest that since the sizes of some dwarfs are changed by environmental effects, SBP types can be transformed, which makes the distributions of SBP types in the three environments different. We discuss possible environmental mechanisms for the transformation of SBP types. Based on data collected at KMTNet Telescopes and SDSS.
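The single/double-exponential classification can be sketched as follows (the magnitude form of the disk profile and the 5% tolerance are illustrative choices, not the authors' fitting code):

```python
import numpy as np

LN10_FACTOR = 1.0857  # 2.5 / ln(10): converts r/h into magnitudes

def broken_exponential(r, mu0, h_in, h_out, r_break):
    """Surface brightness (mag arcsec^-2) of a double-exponential disk:
    scale length h_in inside r_break, h_out outside, continuous at the break."""
    r = np.asarray(r, dtype=float)
    mu_inner = mu0 + LN10_FACTOR * r / h_in
    mu_break = mu0 + LN10_FACTOR * r_break / h_in
    mu_outer = mu_break + LN10_FACTOR * (r - r_break) / h_out
    return np.where(r > r_break, mu_outer, mu_inner)

def sbp_type(h_in, h_out, tol=0.05):
    """Type I: single exponential; Type II: smaller outer scale length
    (steeper outskirts); Type III: larger outer scale length."""
    if abs(h_out - h_in) / h_in <= tol:
        return "I"
    return "II" if h_out < h_in else "III"
```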
Rixen, M.; Ferreira-Coelho, E.; Signell, R.
2008-01-01
Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
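A minimal sketch of the hyper-ensemble idea: learn a local least-squares combination of model forecasts (with a constant term for bias removal) over a training window, then apply it to new forecasts. The function names are illustrative assumptions:

```python
import numpy as np

def hyper_ensemble_fit(train_forecasts, train_obs):
    """Least-squares weights for a local combination of model forecasts.
    train_forecasts: (n_times, n_models); train_obs: (n_times,).
    A constant column is appended so the combination can also remove bias."""
    X = np.column_stack([train_forecasts, np.ones(len(train_forecasts))])
    w, *_ = np.linalg.lstsq(X, train_obs, rcond=None)
    return w

def hyper_ensemble_predict(forecasts, w):
    """Apply the trained weights to a new set of model forecasts."""
    X = np.column_stack([forecasts, np.ones(len(forecasts))])
    return X @ w
```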
OCT-based profiler for automating ocular surface prosthetic fitting (Conference Presentation)
Mujat, Mircea; Patel, Ankit H.; Maguluri, Gopi N.; Iftimia, Nicusor V.; Patel, Chirag; Agranat, Josh; Tomashevskaya, Olga; Bonte, Eugene; Ferguson, R. Daniel
2016-03-01
The use of a Prosthetic Replacement of the Ocular Surface Environment (PROSE) device is a revolutionary treatment for military patients that have lost their eyelids due to 3rd degree facial burns and for civilians who suffer from a host of corneal diseases. However, custom manual fitting is often a protracted painful, inexact process that requires multiple fitting sessions. Training for new practitioners is a long process. Automated methods to measure the complete corneal and scleral topology would provide a valuable tool for both clinicians and PROSE device manufacturers and would help streamline the fitting process. PSI has developed an ocular anterior-segment profiler based on Optical Coherence Tomography (OCT), which provides a 3D measure of the surface of the sclera and cornea. This device will provide topography data that will be used to expedite and improve the fabrication process for PROSE devices. OCT has been used to image portions of the cornea and sclera and to measure surface topology for smaller contact lenses [1-3]. However, current state-of-the-art anterior eye OCT systems can only scan about 16 mm of the eye's anterior surface, which is not sufficient for covering the sclera around the cornea. In addition, there is no systematic method for scanning and aligning/stitching the full scleral/corneal surface and commercial segmentation software is not optimized for the PROSE application. Although preliminary, our results demonstrate the capability of PSI's approach to generate accurate surface plots over relatively large areas of the eye, which is not currently possible with any other existing platform. Testing the technology on human volunteers is currently underway at Boston Foundation for Sight.
Rostkier-Edelstein, Dorita; Hacker, Joshua
2013-04-01
Surface observations comprise a wide, inexpensive and reliable source of information about the state of the near-surface planetary boundary layer (PBL). Operational data assimilation systems have encountered several difficulties in assimilating them effectively, due among other things to their local-scale representativeness, the transient coupling between the surface and the atmosphere aloft, and the balance constraints usually imposed. A long-term goal of this work is to find an efficient system for probabilistic PBL nowcasting that can be employed wherever surface observations are present. Earlier work showed that surface observations can be an important source of information with a single column model (SCM) and an ensemble filter (EF). Here we extend that work to quantify the probabilistic skill of ensemble SCM predictions with a model of added complexity. We adopt a factor separation analysis to quantify the contribution of surface assimilation relative to that of selected model components (parameterized radiation and externally imposed horizontal advection) to the probabilistic skill of the system, and of any beneficial or detrimental interactions between them. To assess the real utility of the flow-dependent covariances estimated with the EF and of the SCM of the PBL, we compare the skill of the SCM/EF system to that of a reference system based on climatological covariances and a 30-min persistence model. The reference consists of a dressing technique, whereby a deterministic 3D mesoscale forecast (e.g. from the WRF model) is adjusted and dressed with uncertainty using a seasonal sample of mesoscale forecasts and surface forecast errors. Results show that assimilation of surface observations can improve deterministic and probabilistic profile predictions more significantly than major model improvements. Flow-dependent covariances estimated with the SCM/EF show a clear advantage over the use of climatological covariances when the flow is characterized by wide variability, when
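One analysis step of an ensemble filter assimilating a single surface observation can be sketched as follows. This is a generic stochastic EnKF update, not the authors' exact system; the fixed random seed is only for reproducibility of the sketch:

```python
import numpy as np

def enkf_update(ensemble, obs_value, obs_err_var, h_index=0, seed=0):
    """One stochastic EnKF analysis step for a single surface observation.

    ensemble: (n_members, n_levels) array of profiles; the observed
    variable is the level at h_index (the surface). Flow-dependent
    covariances between the surface and the levels aloft spread the
    surface information through the whole profile."""
    n, _ = ensemble.shape
    hx = ensemble[:, h_index]                    # observation-space ensemble
    x_pert = ensemble - ensemble.mean(axis=0)
    hx_pert = hx - hx.mean()
    cov_x_hx = x_pert.T @ hx_pert / (n - 1)      # covariance of state with obs
    gain = cov_x_hx / (hx.var(ddof=1) + obs_err_var)  # Kalman gain (one column)
    rng = np.random.default_rng(seed)            # perturbed observations
    obs = obs_value + rng.normal(0.0, np.sqrt(obs_err_var), n)
    return ensemble + np.outer(obs - hx, gain)
```

When a level aloft covaries strongly with the surface, the surface observation corrects it too; with zero covariance the level is left untouched, which is the mechanism the abstract contrasts against climatological covariances.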
3D Surface Profile and Color Stability of Tooth Colored Filling Materials after Bleaching
Directory of Open Access Journals (Sweden)
Bryant Anthony Irawan
2015-01-01
Full Text Available This study aims to evaluate the effects of vital tooth bleaching with carbamide peroxide home bleaching and in-office bleaching on the color stability and 3D surface profile of dental restorative filling materials. Thirty discs (n=30), each 6 mm in diameter and 2 mm thick, were fabricated in shade A2 for each of three restorative materials: the nanofilled composite Filtek Z350 XT, the submicron composite Estelite Σ Quick, and the nanofilled glass ionomer Ketac N100 nanoionomer. Each group was further divided into three subgroups (n=10): subgroup A (Opalescence PF home bleaching), subgroup B (Opalescence Boost in-office bleaching), and subgroup C (distilled water, serving as control). Samples were bleached according to the manufacturer's instructions for a period of two weeks. The Commission Internationale de l'Eclairage (CIE) L*, a*, b* system was chosen for image processing, while the 3D surface profile was examined with atomic force microscopy (AFM). Statistical analyses were performed with Mann-Whitney and Kruskal-Wallis tests at a significance level of P≤0.05. The three restorative materials showed significant color changes (ΔE; P≤0.05). In diminishing order, the mean color changes recorded were Estelite Σ (3.82 ± 1.6) > Ketac Nano (2.97 ± 1.2) > Filtek Z350 XT (2.25 ± 1.0). However, none of the tested materials showed statistically significant changes in surface roughness (P>0.05).
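The color changes quoted above are CIELAB ΔE values; under the usual CIE76 definition, ΔE is the Euclidean distance between two (L*, a*, b*) triples. A minimal sketch, with hypothetical before/after readings that are not taken from the study:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two CIELAB (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical readings for one specimen before and after bleaching
before = (72.4, 1.8, 18.5)
after = (75.1, 1.2, 15.9)
delta_e = delta_e_ab(before, after)  # ~3.8, comparable in scale to the reported means
```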
Stryker, S. A.; Dimarco, S. F.; Stoessel, M. M.; Wang, Z.
2010-12-01
The northwest Indian Ocean is a region of complex circulation and atmospheric influence. The Persian (Arabian) Gulf and Red Sea contribute to the complexity of the region. This study encompasses the surface and deep circulation in the region from 0°N-35°N and 40°E-80°E over January 2002-December 2009, with emphasis on the Persian Gulf, Oman Sea and Arabian Sea (roughly 21°N-26°N and 56°E-63°E), using a variety of in situ and observational data sets. While much is known about the Persian Gulf and Arabian Sea, little is known about the Oman Sea. Circulation in the northwest Indian Ocean is largely influenced by seasonal monsoon winds: current direction reverses from the winter monsoon to the summer monsoon. Marginal sea inflow and outflow are also seasonally variable, which greatly impacts the physical water mass properties in the region. The in situ and observational data sets include data from Argo floats (US GODAE), surface drifters (AOML) and an observing system consisting of 4 independent moorings and a cabled ocean observatory in the Oman Sea. The observing system in the Oman Sea was installed by Lighthouse R & D Enterprises, Inc. beginning in 2005, and measures current, temperature, conductivity, pressure, dissolved oxygen and turbidity using the Aanderaa Recording Doppler Current Profiler (RDCP) 600 and the Aanderaa Recording Current Meter (RCM) 11. The cabled ocean observatory measures dissolved oxygen, temperature and salinity between 65 m and 1000 m and reports in real time. Argo floats in the region have parking depths ranging from 500 m to 2000 m. At 1000 m depth, 98% of the velocity magnitudes range from less than 1 cm/s to 20 cm/s. The Somali Current and Northeast/Southwest Monsoon Currents are present, reversing from summer to winter. At 2000 m depth, the Somali and Monsoon Currents are still present but have smaller velocities, with 98% ranging from less than 1 cm/s to 13 cm/s. At both 1000 m and 2000 m, larger velocities occur
DEFF Research Database (Denmark)
Chen, Jin; Cheng, Jiangtao; Shen, Wenzhong
2013-01-01
The aerodynamic performance of an airfoil is closely related to the continuity of its surface curvature, and airfoil profiles with better aerodynamic performance play an important role in the design of wind turbines. The surface curvature distribution along the chord direction and pressure distributio...
Zhang, Huifang; Yang, Minghong; Xu, Xueke; Wu, Lunzhe; Yang, Weiguang; Shao, Jianda
2017-10-01
The surface figure control of a conventional annular polishing system is ordinarily realized through the interaction between the conditioner and the lap. The surface profile of the pitch lap corrected by the marble conditioner has been measured and analyzed as a function of kinematics, loading conditions, and polishing time. Surface profile measuring equipment for the large lap, based on laser alignment, was developed with an accuracy of about 1 μm. The conditioning mechanism of the conditioner is simply determined by the kinematics and the fully fitting principle, but unexpected surface profile deviations of the lap emerge frequently due to numerous influencing factors, including the geometrical relationship and the pressure distribution at the conditioner/lap interface. Both factors are quantitatively evaluated and described, and have been combined into a spatial and temporal model to simulate the surface profile evolution of the pitch lap. The simulations are consistent with the experiments. This study is an important step toward deterministic full-aperture annular polishing, providing beneficial guidance for the surface profile correction of the pitch lap.
Water surface temperature profiles for the Rhine River derived from Landsat ETM+ data
Fricke, Katharina; Baschek, Björn
2013-10-01
Water temperature influences physical and chemical parameters of rivers and streams and is an important parameter for water quality. It is a crucial factor for the existence and growth of animal and plant species in the river ecosystem. The aim of the research project "Remote sensing of water surface temperature" at the Federal Institute of Hydrology (BfG), Germany, is to supplement point measurements of water temperature with remote sensing methodology. The research area investigated here is the Upper and Middle Rhine River, where continuous measurements of water temperature are already available from several water quality monitoring stations. Satellite imagery is used to complement these point measurements and to generate longitudinal temperature profiles for a better systematic understanding of the changes in river temperature along its course. Several products for sea surface temperature derived from radiances in the thermal infrared are available, but less research has been carried out on the water temperature of rivers. Problems arise from the characteristics of the river valley and morphology and the proximity to the riverbank. Depending on the river width, a certain spatial resolution of the satellite images is necessary to allow for an accurate identification of the river surface and the calculation of water temperature. The bands of the Landsat ETM+ sensor in the thermal infrared region offer a possibility to extract the river surface temperatures (RST) of a sufficiently wide river such as the Rhine. Additionally, problems such as cloud cover, shadowing effects, georeferencing errors, different emissivity of water and land, scattering of thermal radiation, adjacency and mixed pixel effects had to be accounted for, and their effects on the radiance temperatures will be discussed. For this purpose, several temperature data sets derived from radiance and in situ measurements were compared. The observed radiance temperatures are strongly influenced by
Development of Pseudorandom Binary Arrays for Calibration of Surface Profile Metrology Tools
International Nuclear Information System (INIS)
Barber, S.K.; Takacs, P.; Soldate, P.; Anderson, E.H.; Cambie, R.; McKinney, W.R.; Voronov, D.L.; Yashchuk, V.V.
2009-01-01
Optical metrology tools, especially for short wavelengths (extreme ultraviolet and x-ray), must cover a wide range of spatial frequencies from the very low, which affects figure, to the important mid-spatial frequencies and the high spatial frequency range, which produces undesirable scattering. A major difficulty in using surface profilometers arises due to the unknown point-spread function (PSF) of the instruments [G. D. Boreman, Modulation Transfer Function in Optical and Electro-Optical Systems (SPIE, Bellingham, WA, 2001)], which is responsible for distortion of the measured surface profile. Generally, the distortion due to the PSF is difficult to account for because the PSF is a complex function that enters the measurement via the convolution operation, while the measured profile is described by a real function. Accounting for the instrumental PSF becomes significantly simpler if the result of measurement of a profile is presented in the spatial frequency domain as a power spectral density (PSD) distribution [J. W. Goodman, Introduction to Fourier Optics, 3rd ed. (Roberts and Company, Englewood, CO, 2005)]. For example, measured PSD distributions provide a closed set of data necessary for three-dimensional calculations of scattering of light by optical surfaces [E. L. Church et al., Opt. Eng. (Bellingham) 18, 125 (1979); J. C. Stover, Optical Scattering, 2nd ed. (SPIE Optical Engineering Press, Bellingham, WA, 1995)]. The distortion of the surface PSD distribution due to the PSF can be modeled with the modulation transfer function (MTF), which is defined over the spatial frequency bandwidth of the instrument. The measured PSD distribution can be presented as a product of the squared MTF and the ideal PSD distribution inherent to the system under test. Therefore, the instrumental MTF can be evaluated by comparing a measured PSD distribution of a known test surface with the corresponding ideal, numerically simulated PSD. The square root of the ratio of the measured to the simulated PSD distribution then yields the instrumental MTF.
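The MTF evaluation described in this record reduces to a pointwise square root of a PSD ratio, since measured PSD = MTF² × ideal PSD. A sketch with purely illustrative attenuation values:

```python
import numpy as np

def instrumental_mtf(psd_measured, psd_ideal):
    """Estimate the instrumental MTF from the PSD of a known test surface,
    using measured PSD = MTF^2 * ideal PSD  =>  MTF = sqrt(meas / ideal)."""
    return np.sqrt(np.asarray(psd_measured, float) / np.asarray(psd_ideal, float))

# Toy check: an instrument that attenuates high spatial frequencies
psd_ideal = np.array([1.0, 1.0, 1.0, 1.0])      # simulated PSD of the test surface
psd_measured = np.array([1.0, 0.81, 0.49, 0.25])  # illustrative measured values
mtf = instrumental_mtf(psd_measured, psd_ideal)   # falls from 1.0 toward 0.5
```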
International Nuclear Information System (INIS)
Kumar, P.; Martin, H.; Jiang, X.
2016-01-01
Non-destructive testing and online measurement of surface features are pressing demands in manufacturing, so optical techniques are gaining importance for the characterization of complex engineering surfaces. Harnessing integrated optics to miniaturize interferometry systems onto a silicon wafer and incorporating a compact optical probe would enable the development of a handheld sensor for embedded metrology applications. In this work, we present progress in the development of a hybrid photonics based metrology sensor device for online surface profile measurements. The measurement principle, along with test and measurement results of the individual components, is presented. For non-contact measurement, a spectrally encoded lateral scanning probe based on laser scanning microscopy has been developed to provide fast measurement with lateral resolution limited only by diffraction. The probe demonstrates a lateral resolution of ∼3.6 μm, while high axial resolution (sub-nanometre) is inherently achieved by interferometry. Further, the performance of the hybrid tuneable laser and the scanning probe was evaluated by measuring a standard step height sample of 100 nm.
Strong orientation dependence of surface mass density profiles of dark haloes at large scales
Osato, Ken; Nishimichi, Takahiro; Oguri, Masamune; Takada, Masahiro; Okumura, Teppei
2018-06-01
We study the dependence of surface mass density profiles, which can be directly measured by weak gravitational lensing, on the orientation of haloes with respect to the line-of-sight direction, using a suite of N-body simulations. We find that, when major axes of haloes are aligned with the line-of-sight direction, surface mass density profiles have higher amplitudes than those averaged over all halo orientations, over all scales from 0.1 to 100 Mpc h⁻¹ that we studied. While the orientation dependence at small scales is ascribed to halo triaxiality, our results indicate even stronger orientation dependence in the so-called two-halo regime, up to 100 Mpc h⁻¹. The orientation dependence of the two-halo term is well approximated by a multiplicative shift of the amplitude and therefore a shift in the halo bias parameter value. The halo bias from the two-halo term can be overestimated or underestimated by up to ∼30 per cent depending on the viewing angle, which translates into a bias in estimated halo masses of up to a factor of 2 from halo bias measurements. The orientation dependence at large scales originates from the anisotropic halo-matter correlation function, which has an elliptical shape with an axis ratio of ∼0.55 up to 100 Mpc h⁻¹. We discuss potential impacts of halo orientation bias on other observables such as optically selected cluster samples and clustering analyses of large-scale structure tracers such as quasars.
Directory of Open Access Journals (Sweden)
A. Cherkasheva
2013-04-01
Full Text Available Current estimates of global marine primary production range over a factor of two. Improving these estimates requires an accurate knowledge of chlorophyll vertical profiles, since they are the basis for most primary production models. At high latitudes, the uncertainty in primary production estimates is larger than globally, because here phytoplankton absorption shows specific characteristics due to low-light adaptation, and in situ data and ocean colour observations are scarce. To date, studies describing the typical chlorophyll profile based on the chlorophyll in the surface layer have not included the Arctic region, or, if it was included, the dependence of the profile shape on surface concentration was neglected. The goal of our study was to derive and describe the typical Greenland Sea chlorophyll profiles, categorized according to the chlorophyll concentration in the surface layer and further resolved by month. The Greenland Sea was chosen because it is known to be one of the most productive regions of the Arctic and is among the regions in the Arctic where most chlorophyll field data are available. Our database contained 1199 chlorophyll profiles from R/Vs Polarstern and Maria S. Merian cruises combined with data from the ARCSS-PP database (Arctic primary production in situ database) for the years 1957-2010. The profiles were categorized according to their mean concentration in the surface layer, and monthly median profiles within each category were then calculated. The category with surface layer chlorophyll (CHL) exceeding 0.7 mg C m−3 showed values gradually decreasing from April to August. A similar seasonal pattern was observed when monthly profiles were averaged over all surface CHL concentrations. The maxima of all chlorophyll profiles moved from greater depths to the surface from spring to late summer. The profiles with the smallest surface values always showed a subsurface chlorophyll maximum
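The categorize-then-take-monthly-medians step described above can be sketched as follows; the array layout, the single 0.7 threshold, and the toy profiles are assumptions for illustration, not the authors' exact processing:

```python
import numpy as np

def monthly_median_profiles(profiles, months, surface_chl, threshold=0.7):
    """Select profiles whose surface-layer chlorophyll exceeds `threshold`,
    then return the median profile for each calendar month in that category.
    `profiles` is an (n_profiles, n_depths) array; `months` holds 1..12."""
    profiles = np.asarray(profiles, float)
    months = np.asarray(months)
    high = np.asarray(surface_chl, float) > threshold
    medians = {}
    for m in np.unique(months[high]):
        sel = high & (months == m)
        medians[int(m)] = np.median(profiles[sel], axis=0)
    return medians

# Hypothetical mini-database with two depth levels per profile
profs = [[1.0, 0.5], [2.0, 1.0], [3.0, 1.5], [0.1, 0.05]]
meds = monthly_median_profiles(profs, months=[4, 4, 5, 4],
                               surface_chl=[1.0, 2.0, 3.0, 0.1])
```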
The Surface Density Profile of the Galactic Disk from the Terminal Velocity Curve
McGaugh, Stacy S.
2016-01-01
The mass distribution of the Galactic disk is constructed from the terminal velocity curve and the mass discrepancy-acceleration relation. Mass models numerically quantifying the detailed surface density profiles are tabulated. For R0 = 8 kpc, the models have a stellar mass of ~5 × 10^10 M⊙. The Milky Way is a normal spiral galaxy that obeys scaling relations like the Tully-Fisher relation, the size-mass relation, and the disk maximality-surface brightness relation. The stellar disk is maximal, and the spiral arms are massive. The bumps and wiggles in the terminal velocity curve correspond to known spiral features (e.g., the Centaurus arm is a ~50% overdensity). The slope of the rotation curve switches between positive and negative over scales of hundreds of parsecs. The rms amplitude ⟨(dV/dR)²⟩^(1/2) ≈ 14 km s⁻¹ kpc⁻¹, implying that commonly neglected terms in the Jeans equations may be non-negligible. The spherically averaged local dark matter density is ρ0,DM ≈ 0.009 M⊙ pc⁻³ (0.34 GeV cm⁻³). Adiabatic compression of the dark matter halo may help reconcile the Milky Way with the c-V200 relation expected in ΛCDM while also helping to mitigate the too-big-to-fail problem, but it remains difficult to reconcile the inner bulge/bar-dominated region with a cuspy halo. We note that NGC 3521 is a near twin to the Milky Way, having a similar luminosity, scale length, and rotation curve.
Energy Technology Data Exchange (ETDEWEB)
Pu, Zhaoxia [Univ. of Utah, Salt Lake City, UT (United States)
2015-10-06
Most routine measurements from climate study facilities, such as the Department of Energy’s ARM SGP site, come from individual sites over a long period of time. While single-station data are very useful for many studies, it is challenging to obtain 3-dimensional spatial structures of atmospheric boundary layers that include prominent signatures of deep convection from these data. The principal objective of this project is to create realistic estimates of high-resolution (~ 1km × 1km horizontal grids) atmospheric boundary layer structure and the characteristics of precipitating convection. These characteristics include updraft and downdraft cumulus mass fluxes and cold pool properties over a region the size of a GCM grid column from analyses that assimilate surface mesonet observations of wind, temperature, and water vapor mixing ratio and available profiling data from single or multiple surface stations. The ultimate goal of the project is to enhance our understanding of the properties of mesoscale convective systems and also to improve their representation in analysis and numerical simulations. During the proposed period (09/15/2011–09/14/2014) and the no-cost extension period (09/15/2014–09/14/2015), significant accomplishments have been achieved relating to the stated goals. Efforts have been extended to various research and applications. Results have been published in professional journals and presented in related science team meetings and conferences. These are summarized in the report.
Wideband, Low-Profile, Dual-Polarized Slot Antenna with an AMC Surface for Wireless Communications
Directory of Open Access Journals (Sweden)
Wei Hu
2016-01-01
Full Text Available A wideband dual-polarized slot antenna loaded with an artificial magnetic conductor (AMC) is proposed for WLAN/WiMAX and LTE applications. The slot antenna mainly consists of two pairs of arrow-shaped slots along the diagonals of a square patch. Stepped microstrip feedlines are placed orthogonally to excite the horizontal and vertical polarizations of the antenna. To realize unidirectional radiation and a low profile, an AMC surface composed of 7 × 7 unit cells is placed underneath the slot antenna at a distance of 0.09λ0 (λ0 being the free-space wavelength at 2.25 GHz). Both the dual-polarized slot antenna and the AMC surface were fabricated and measured. Experimental results demonstrate that the proposed antenna achieves, for both polarizations, a wide impedance bandwidth of 36.7% (return loss > 10 dB), operating from 1.96 to 2.84 GHz. The isolation between the two input ports remains higher than 29 dB, whereas the cross-polarization levels remain below −30 dB across the entire frequency band. Front-to-back ratios better than 22 dB and a stable gain higher than 8 dBi are obtained over the whole band.
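The quoted 36.7% impedance bandwidth is consistent with the 1.96-2.84 GHz band when the bandwidth is referenced to the band-centre frequency, as a quick check shows:

```python
def fractional_bandwidth(f_low, f_high):
    """Impedance bandwidth as a percentage of the band-centre frequency."""
    f_centre = (f_low + f_high) / 2.0
    return 100.0 * (f_high - f_low) / f_centre

bw = fractional_bandwidth(1.96, 2.84)  # frequencies in GHz -> ~36.7%
```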
DEFF Research Database (Denmark)
Ruban, Andrei; Abrikosov, I. A.; Kats, D. Ya.
1994-01-01
We have calculated the electronic structure and segregation profiles of the (001) surface of random Cu-Ni alloys with varying bulk concentrations by means of the coherent potential approximation and the linear muffin-tin-orbitals method. Exchange and correlation were included within the local-density approximation. Temperature effects were accounted for by means of the cluster-variation method and, for comparison, by mean-field theory. The necessary interaction parameters were calculated by the Connolly-Williams method generalized to the case of a surface of a random alloy. We find the segregation profiles...
International Nuclear Information System (INIS)
Dasgupta, I.; Mookerjee, A.
1995-07-01
We present here a first-principles method for the calculation of effective cluster interactions for semi-infinite solid alloys, required for the study of surface segregation and surface ordering on disordered surfaces. Our method is based on the augmented space recursion coupled with the orbital peeling method of Burke in the framework of the TB-LMTO. Our study of surface segregation in CuNi alloys demonstrates strong copper segregation and a monotonic concentration profile throughout the concentration range. (author). 35 refs, 4 figs, 2 tabs
Augustin, C. M.
2015-12-01
Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts. The first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating the potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate the potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various durations and volumes. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings (analyzed in Slovic and Weber's 2002 framework) that show a high-unknown, high-dread risk perception of leaks from a CCS site. Secondary findings are a
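A Bayesian treatment of a low-probability leak of the kind mentioned above can be illustrated with a simple Monte Carlo over an uncertain Poisson leak rate; the prior parameters and storage horizon below are hypothetical stand-ins, not the paper's calibrated model:

```python
import numpy as np

# Hypothetical sketch: treat the annual leak rate as uncertain with a
# Gamma(shape=2, scale=0.005) prior (mean 0.01 leaks/yr), and estimate the
# probability of at least one leak over a 50-year storage horizon, assuming
# leaks arrive as a Poisson process at each sampled rate.
rng = np.random.default_rng(1)
rates = rng.gamma(shape=2.0, scale=0.005, size=100_000)  # leaks per year
horizon = 50.0  # years
p_any_leak = 1.0 - np.exp(-rates * horizon)  # Poisson: P(N >= 1 | rate)
mean_p = p_any_leak.mean()  # posterior-predictive leak probability, ~0.36 here
```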
Dangaria, Smit J.
2011-12-01
Stem/progenitor cells are a population of cells capable of providing replacement cells for a given differentiated cell type. We have applied progenitor cell-based technologies to generate novel tissue-engineered implants that use biomimetic strategies with the ultimate goal of achieving full regeneration of lost periodontal tissues. Mesenchymal periodontal tissues such as cementum, alveolar bone (AB), and periodontal ligament (PDL) are neural crest-derived entities that emerge from the dental follicle (DF) at the onset of tooth root formation. Using a systems biology approach we have identified key differences between these periodontal progenitors on the basis of global gene expression profiles, gene cohort expression levels, and epigenetic modifications, in addition to differences in cellular morphologies. On an epigenetic level, DF progenitors featured high levels of the euchromatin marker H3K4me3, whereas PDL cells, AB osteoblasts, and cementoblasts contained high levels of the transcriptional repressor H3K9me3. Secondly, we have tested the influence of natural extracellular hydroxyapatite matrices on periodontal progenitor differentiation. Dimension and structure of extracellular matrix surfaces have powerful influences on cell shape, adhesion, and gene expression. Here we show that natural tooth root topographies induce integrin-mediated extracellular matrix signaling cascades in tandem with cell elongation and polarization to generate physiological periodontium-like tissues. In this study we replanted surface topography instructed periodontal ligament progenitors (PDLPs) into rat alveolar bone sockets for 8 and 16 weeks, resulting in complete attachment of tooth roots to the surrounding alveolar bone with a periodontal ligament fiber apparatus closely matching physiological controls along the entire root surface. Displacement studies and biochemical analyses confirmed that progenitor-based engineered periodontal tissues were similar to control teeth and
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
2016-06-01
Figure 4. Prototype RHIB-based tethered balloon MAPS used in the CASPER Pilot. ...profile measurements over the ocean. The system is designed to make profiling measurements with multiple up/downs using an instrumented tethered balloon ...temperature profiles with high vertical resolution. With the ultimate goal of improving evaporative duct prediction, we use a tethered balloon
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Energy Technology Data Exchange (ETDEWEB)
Steinberger, R., E-mail: roland.steinberger@jku.at [Center for Surface and Nanoanalytics, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Celedón, C.E., E-mail: carlos.celedon@usm.cl [Institut für Experimentalphysik, Abteilung für Atom- und Oberflächenphysik, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Departamento de Física, Universidad Técnica Federico Santa María, Valparaíso, Casilla 110-V (Chile); Bruckner, B., E-mail: barbara.bruckner@jku.at [Institut für Experimentalphysik, Abteilung für Atom- und Oberflächenphysik, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Roth, D., E-mail: dietmar.roth@jku.at [Institut für Experimentalphysik, Abteilung für Atom- und Oberflächenphysik, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Duchoslav, J., E-mail: jiri.duchoslav@jku.at [Center for Surface and Nanoanalytics, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); Arndt, M., E-mail: martin.arndt@voestalpine.com [voestalpine Stahl GmbH, voestalpine-Straße 3, 4031 Linz (Austria); Kürnsteiner, P., E-mail: p.kuernsteiner@mpie.de [Center for Surface and Nanoanalytics, Johannes Kepler University Linz, Altenberger Straße 69, 4040 Linz (Austria); and others
2017-07-31
Highlights: • Investigation of the impact of residual gas prevailing in UHV chambers. • For some metals detrimental oxygen uptake could be observed within a very short time. • Totally different behaviors were found: no changes, solely adsorption, and oxidation. • The UHV residual gas may severely corrupt results obtained from depth profiling. • A well-considered data acquisition sequence is the key to reliable depth profiles. - Abstract: Depth profiling using surface sensitive analysis methods in combination with sputter ion etching is a common procedure for thorough material investigations, where clean surfaces free of any contamination are essential. Hence, surface analytic studies are mostly performed under ultra-high vacuum (UHV) conditions, but the cleanness of such UHV environments is usually overrated. Consequently, the current study highlights the in-principle known impact of the residual gas on metal surfaces (Fe, Mg, Al, Cr and Zn) for various surface analysis methods, such as X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy (AES) and low-energy ion scattering (LEIS). The investigations with modern, state-of-the-art equipment showed different behaviors for the metal surfaces in UHV during acquisition: (i) no impact for Zn, even after a long time, (ii) solely adsorption of oxygen for Fe and slight, slow changes for Cr, and (iii) adsorption accompanied by oxide formation for Al and Mg. The efficiency of different countermeasures was tested, and the acquired knowledge was finally used on ZnMgAl-coated steel to obtain accurate depth profiles, which had previously exhibited serious artifacts when data acquisition was performed in an ill-considered way.
THE INITIAL MASS FUNCTION AND THE SURFACE DENSITY PROFILE OF NGC 6231
Energy Technology Data Exchange (ETDEWEB)
Sung, Hwankyung [Department of Astronomy and Space Science, Sejong University, 98, Kunja-dong, Kwangjin-gu, Seoul 143-747 (Korea, Republic of); Sana, Hugues [Astronomical Institute 'Anton Pannekoek', Amsterdam University, Science Park 904, 1098-XH Amsterdam (Netherlands); Bessell, Michael S., E-mail: sungh@sejong.ac.kr, E-mail: H.Sana@uva.nl, E-mail: bessell@mso.anu.edu.au [Research School of Astronomy and Astrophysics, Australian National University, MSO, Cotter Road, Weston, ACT 2611 (Australia)
2013-02-01
We have performed new wide-field photometry of the young open cluster NGC 6231 to study the shape of the initial mass function (IMF) and mass segregation. We also investigated the reddening law toward NGC 6231 from optical to mid-infrared color excess ratios, and found that the total-to-selective extinction ratio is R_V = 3.2, which is very close to the normal value. But many early-type stars in the cluster center show large color excess ratios. We derived the surface density profiles of four member groups, and found that they reach the surface density of field stars at about 10', regardless of stellar mass. The IMF of NGC 6231 is derived for the mass range 0.8-45 M⊙. The slope of the IMF of NGC 6231 (Γ = -1.1 ± 0.1) is slightly shallower than the canonical value, but the difference is marginal. In addition, the mass function varies systematically, and is a strong function of radius: it is very shallow at the center and very steep in the outer ring, suggesting the cluster is mass segregated. We confirm the mass segregation for the massive stars (m ≳ 8 M⊙) by a minimum spanning tree analysis. Using a Monte Carlo method, we estimate the total mass of NGC 6231 to be about 2.6 (± 0.6) × 10³ M⊙. We constrain the age of NGC 6231 by comparison with evolutionary isochrones. The age of the low-mass stars ranges from 1 to 7 Myr with a slight peak at 3 Myr. However, the age of the high-mass stars depends on the adopted models and is 3.5 ± 0.5 Myr from the non-rotating or moderately rotating models of Brott et al. as well as the non-rotating models of Ekstroem et al. But the age is 4.0-7.0 Myr if the rotating models of Ekstroem et al. are adopted. This latter age is in excellent agreement with the timescale of ejection of the high-mass runaway star HD 153919 from NGC 6231, albeit the younger age cannot be entirely excluded.
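An IMF slope Γ such as the quoted -1.1 ± 0.1 is typically estimated from star counts per logarithmic mass bin, since dN/dlog m ∝ m^Γ. A sketch of that fit, self-checked here against a synthetic Salpeter-like sample (Γ = -1.35) rather than the NGC 6231 data:

```python
import numpy as np

def imf_slope(masses, n_bins=8):
    """Estimate the IMF slope Gamma from a stellar-mass sample by fitting
    log10(counts per log-mass bin) against the bin-centre log10(mass)."""
    logm = np.log10(masses)
    counts, edges = np.histogram(logm, bins=n_bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    good = counts > 0
    slope, _intercept = np.polyfit(centres[good], np.log10(counts[good]), 1)
    return slope

# Synthetic sample over 0.8-45 Msun: inverse-transform sampling of
# dN/dm ~ m^(Gamma - 1), which gives dN/dlog m ~ m^Gamma.
rng = np.random.default_rng(0)
u = rng.random(200_000)
m_lo, m_hi, gamma = 0.8, 45.0, -1.35
masses = (m_lo**gamma + u * (m_hi**gamma - m_lo**gamma)) ** (1.0 / gamma)
recovered = imf_slope(masses)  # should come back close to -1.35
```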
Directory of Open Access Journals (Sweden)
Stromberg Arnold J
2009-09-01
Full Text Available Abstract Background Full-thickness articular cartilage lesions that reach the subchondral bone yet are restricted to the chondral compartment usually fill with a fibrocartilage-like repair tissue which is structurally and biomechanically compromised relative to normal articular cartilage. The objective of this study was to evaluate transcriptional differences between chondrocytes of normal articular cartilage and repair tissue cells four months post-microfracture. Methods Bilateral 1-cm² full-thickness defects were made in the articular surface of both distal femurs of four adult horses, followed by subchondral microfracture. Four months postoperatively, repair tissue from the lesion site and grossly normal articular cartilage from within the same femorotibial joint were collected. Total RNA was isolated from the tissue samples, linearly amplified, and applied to a 9,413-probe-set equine-specific cDNA microarray. Eight paired comparisons matched by limb and horse were made with a dye-swap experimental design, with validation by histological analyses and quantitative real-time polymerase chain reaction (RT-qPCR). Results Statistical analyses revealed 3,327 (35.3%) differentially expressed probe sets. Expression of biomarkers typically associated with normal articular cartilage and fibrocartilage repair tissue corroborates earlier studies. Other changes in gene expression previously unassociated with cartilage repair were also revealed and validated by RT-qPCR. Conclusion The magnitude of divergence in transcriptional profiles between normal chondrocytes and the cells that populate repair tissue reveals substantial functional differences between these two cell populations. At the four-month postoperative time point, the relative deficiency within repair tissue of gene transcripts which typically define articular cartilage indicates that while cells occupying the lesion might be of mesenchymal origin, they have not recapitulated differentiation to
Oxygen-18 concentration profile measurements near the surface by the ¹⁸O(p,α)¹⁵N resonance reaction
International Nuclear Information System (INIS)
Amsel, G.; David, D.
1975-01-01
The method of spectrum reduction in nuclear reaction microanalysis does not allow depth resolutions better than the order of 2000 Å to be obtained. Resolutions of the order of 200 Å may be obtained by using the narrow resonance technique when applied to thin films. The latter technique was extended to thick targets with deep concentration profiles presenting a sharp gradient near the surface. This method is presented and illustrated by the study of ¹⁸O profiles in oxygen diffusion measurements in growing ZrO₂, using the 629 keV resonance of the reaction ¹⁸O(p,α)¹⁵N [fr]
Coupled ADCPs can yield complete Reynolds stress tensor profiles in geophysical surface flows
Vermeulen, B.; Hoitink, A.J.F.; Sassi, M.G.
2011-01-01
We introduce a new technique to measure profiles of each term in the Reynolds stress tensor using coupled acoustic Doppler current profilers (ADCPs). The technique is based on the variance method which is extended to the case with eight acoustic beams. Methods to analyze turbulence from a single
Directory of Open Access Journals (Sweden)
Siyuan He
2012-01-01
Full Text Available The range profiles of a two-dimensional (2-D) perfect electric conductor (PEC) ship on a wind-driven rough sea surface are derived by performing an inverse discrete Fourier transform (IDFT) on the wideband backscattered field. The rough sea surface is assumed to be a PEC surface. The backscattered field is computed by EM numerical simulation with frequencies sampled between 100 MHz and 700 MHz. Considering the strong coupling interactions between the ship and sea, the complicated multipath effects on the range profile characteristics are fully analyzed based on the multipath imaging mechanisms. The coupling mechanisms can be explained by means of ray-theory prediction and numerical extraction of the coupling currents. A comparison of the range profile locations between ray-theory prediction and surface current simulation is implemented and analyzed in this paper. Finally, the influence of different sea states on the radar target signatures is examined and discussed.
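The IDFT step described in this abstract can be sketched in a few lines. This is an illustrative toy, not the paper's method: the two point scatterers and their amplitudes below are invented, whereas the paper obtains the backscattered field from full EM simulation of a ship on a rough sea surface.

```python
import numpy as np

# Toy range profile via IDFT of a wideband backscattered field sampled
# over 100-700 MHz. Scatterer ranges/amplitudes are assumed for illustration.
c = 3e8                                   # speed of light (m/s)
freqs = np.linspace(100e6, 700e6, 128)    # sampled frequency band
scatterers = [(5.0, 1.0), (12.0, 0.6)]    # (range in m, amplitude), invented

# Each point scatterer contributes a phase ramp exp(-j*4*pi*f*R/c)
field = sum(a * np.exp(-1j * 4 * np.pi * freqs * R / c) for R, a in scatterers)

profile = np.abs(np.fft.ifft(field))      # range profile via IDFT
df = freqs[1] - freqs[0]
rng_axis = np.arange(freqs.size) * c / (2 * df * freqs.size)
peak_range = rng_axis[np.argmax(profile)] # range of the strongest scatterer
```

The strongest peak of `profile` falls at the range of the dominant scatterer; the frequency step `df` sets the unambiguous range window, and the total bandwidth sets the range resolution.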
International Nuclear Information System (INIS)
Lin, Chern-Sheng; Loh, Guo-Hao; Fu, Shu-Hsien; Chang, Hsun-Kai; Yang, Shih-Wei; Yeh, Mau-Shiun
2010-01-01
In this paper, an automatic evaluation method for the surface profile of a microlens array using an optical interferometric microscope is presented. An XY-table is used to position the microlens array for inspection. With a He–Ne laser beam and an optical fiber as the probing light, the measured image is sent to the computer to analyze the surface profile. By binary image slicing and area recognition, this study located the center of each ring and determined the substrate of the microlens array image from the background of the entire microlens array interference image. The maximum and minimum values of every segment brightness curve were determined, corresponding to the change in the segment phase angle from 0° to 180°. According to the ratio of the actual ring area to the ideal ring area, the area ratio method was adopted to find the phase-angle variation of the interference ring. Based on the ratio of the actual ring brightness to the ideal ring brightness, the brightness ratio method was used to determine the phase-angle variation of the interference ring fringe. The area ratio method and brightness ratio method are interchangeable in precisely determining the phase angles of the innermost and outermost rings of the interference fringe and in obtaining the microlens surface altitudes of the respective pixels in each segment, greatly increasing the accuracy and quality of microlens array surface profile inspection.
Steinberger, R.; Celedón, C. E.; Bruckner, B.; Roth, D.; Duchoslav, J.; Arndt, M.; Kürnsteiner, P.; Steck, T.; Faderl, J.; Riener, C. K.; Angeli, G.; Bauer, P.; Stifter, D.
2017-07-01
Depth profiling using surface-sensitive analysis methods in combination with sputter ion etching is a common procedure for thorough material investigations, where clean surfaces free of any contamination are essential. Hence, surface analytic studies are mostly performed under ultra-high vacuum (UHV) conditions, but the cleanness of such UHV environments is usually overrated. Consequently, the current study highlights the in-principle known impact of the residual gas on metal surfaces (Fe, Mg, Al, Cr and Zn) for various surface analysis methods, such as X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy (AES) and low-energy ion scattering (LEIS). The investigations with modern, state-of-the-art equipment showed different behaviors for the metal surfaces in UHV during acquisition: (i) no impact for Zn, even after a long time; (ii) solely adsorption of oxygen for Fe, with slight and slow changes for Cr; and (iii) adsorption accompanied by oxide formation for Al and Mg. The efficiency of different countermeasures was tested, and the acquired knowledge was finally used for ZnMgAl-coated steel to obtain accurate depth profiles, which had previously exhibited serious artifacts when data acquisition was performed without such precautions.
International Nuclear Information System (INIS)
C. Harrington; R. Kelly; K.T. Ebert
2005-01-01
Variations in erosion and deposition over the last fifty years (based on estimates from ¹³⁷Cs profiles) on surfaces (Late Pleistocene to Late Holocene in age) making up the Fortymile Wash alluvial fan south of Yucca Mountain are a function of surface age and of desert pavement development or absence. For purposes of comparing erosion and deposition, the surfaces can be examined as three groups: (1) Late Pleistocene surfaces possess areas of desert pavement development with thin Av or sandy A horizons, formed by the trapping capabilities of the pavements. These zones of deposition are complemented by coppice dune formation on similar parts of the surface. Areas on the surface where no pavement development has occurred are erosional in nature, with 0.0 ± 0.0 cm to 1.5 ± 0.5 cm of erosion occurring primarily by winds blowing across the surface. Overall these surfaces may show either a small net depositional gain or a small erosional loss. (2) Early Holocene surfaces have no well-developed desert pavements, but may have residual gravel deposits in small areas on the surfaces. These surfaces show the most consistent erosional surface areas, on which erosion ranges from 1.0 ± 0.01 cm to 2.0 ± 0.01 cm. Fewer depositional forms are found on surfaces of this age, so there is probably a net loss of 1.5 cm across these surfaces. (3) The Late Holocene surfaces show the greatest variability in erosion and deposition. Overbank deposition during floods covers many edges of these surfaces, and coppice dune formation also creates depositional features. Erosion rates are highly variable and range from 0.0 ± 0.0 cm to a maximum of 2.0 ± 0.01 cm. Erosion occurs because of the lack of protection of the surface. However, the common areas of deposition probably result in a small net depositional gain across these surfaces. Thus, the interchannel surfaces of the Fortymile Wash fan show a variety of erosional styles as well as areas of deposition. The fan, therefore, is a dynamic
Briggs, William M.
2012-01-01
The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
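The definition above is easy to make concrete. The following sketch uses an assumed example (not from the paper): evidence E says a quantity is a waiting time and hence non-negative, yet a normal model assigns positive probability below zero; that mass is the leakage.

```python
import math

# Sketch of "probability leakage": model M puts positive probability on
# values impossible given evidence E. Here (assumed example) E implies
# y >= 0, but M models y as Normal(mu, sigma), leaking mass below zero.
def normal_cdf(x, mu, sigma):
    """Standard closed-form normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu, sigma = 2.0, 1.5                 # made-up fitted parameters
leakage = normal_cdf(0.0, mu, sigma) # P_M(y < 0), impossible given E
print(f"probability leakage: {leakage:.4f}")
```

A model with nonzero leakage cannot be empirically calibrated on E-consistent data, which is the paper's point about ubiquitous regression models.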
International Nuclear Information System (INIS)
Wang, Qi; Gu, Jin; Shen, Jing; Li, Zhen-fu; Jie, Jian-zheng; Wang, Wen-yue; Wang, Jin; Zhang, Zhong-tao; Li, Zhi-xia; Yan, Li
2009-01-01
Surface-enhanced laser desorption and ionization time-of-flight mass spectrometry (SELDI-TOF-MS) analysis of serum samples has been reported to be able to distinguish colorectal cancer (CRC) from normal or control patients. We carried out a validation study of a SELDI-TOF-MS approach with IMAC surface sample processing to identify CRC. A retrospective cohort of 338 serum samples, including 154 CRCs, 67 control cancers and 117 non-cancerous conditions, was profiled using SELDI-TOF-MS. No CRC-specific classifier was found. However, a classifier consisting of two protein peaks separates cancer from non-cancerous conditions with high accuracy. In this study, the SELDI-TOF-MS-based protein expression profiling approach did not perform well in identifying CRC. However, the technique is promising in distinguishing patients with cancer from a non-cancerous population; it may be useful for monitoring recurrence of CRC after treatment
Roncarelli, M.; Ettori, S.; Dolag, K.; Moscardini, L.; Borgani, S.; Murante, G.
2006-12-01
Using a set of hydrodynamical simulations of nine galaxy clusters with masses in the range 1.5 × 10¹⁴ matter of tension between simulated and observed properties, and up to the virial radius and beyond, where present observations are unable to provide any constraints. We have modelled the radial profiles between 0.3R200 and 3R200 with power laws with one index, two indices and a rolling index. The simulated temperature and [0.5-2] keV surface brightness profiles reproduce well the observed behaviours outside the core. The shape of all these profiles in the radial range considered depends mainly on the activity of the gravitational collapse, with no significant difference among models including extra physics. The profiles steepen in the outskirts, with the slope of the power-law fit changing from -2.5 to -3.4 in the gas density, from -0.5 to -1.8 in the gas temperature and from -3.5 to -5.0 in the X-ray soft surface brightness. We predict that the gas density, temperature and [0.5-2] keV surface brightness values at R200 are, on average, 0.05, 0.60 and 0.008 times the measured values at 0.3R200. At 2R200, these values decrease by an order of magnitude in the gas density and surface brightness, and by a factor of 2 in the temperature, putting stringent limits on the detectable properties of the intracluster medium (ICM) in the virial regions.
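Fitting a single power-law index to a radial profile, as done above between 0.3R200 and 3R200, reduces to a linear fit in log-log space. The sketch below uses synthetic data with an assumed slope and normalization, purely to illustrate the fitting step:

```python
import numpy as np

# Illustrative single-index power-law fit to a synthetic radial gas-density
# profile over 0.3-3 R200. The slope (-2.5) and normalization are assumed.
r = np.linspace(0.3, 3.0, 50)          # radius in units of R200
rho = 1e-27 * r ** (-2.5)              # synthetic density, rho ~ r^-2.5

# A power law rho = A * r^s is linear in log-log space: log(rho) = log(A) + s*log(r)
slope, log_norm = np.polyfit(np.log(r), np.log(rho), 1)
print(round(slope, 3))                 # recovers the input index, -2.5
```

The two-index and rolling-index models mentioned in the abstract generalize this by letting the slope change with radius (a broken or smoothly varying power law).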
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
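One classic example of the 1/e phenomenon (chosen here as an assumed illustration, since the article's specific three problems are not listed) is the derangement problem: the probability that a random permutation of n items has no fixed point tends to 1/e ≈ 0.3679. A quick simulation confirms it:

```python
import random

# Monte Carlo estimate of the derangement probability: the chance that a
# random shuffle of n items leaves no item in its original position.
def is_derangement(perm):
    return all(p != i for i, p in enumerate(perm))

random.seed(0)
n, trials = 10, 100_000
hits = 0
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)
    hits += is_derangement(perm)

estimate = hits / trials
print(estimate)   # close to 1/e = 0.36788...
```

Already at n = 10 the exact probability agrees with 1/e to five decimal places, which is why 1/e appears so often in matching and hat-check style problems.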
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Do, T.; Lu, J. R.; Ghez, A. M.; Morris, M. R.; Yelda, S.; Martinez, G. D.; Wright, S. A.; Matthews, K.
2013-02-01
We present new high angular resolution near-infrared spectroscopic observations of the nuclear star cluster surrounding the Milky Way's central supermassive black hole. Using the integral-field spectrograph OSIRIS on Keck II behind the laser-guide-star adaptive optics system, this spectroscopic survey enables us to separate early-type (young, 4-6 Myr) and late-type (old, >1 Gyr) stars with a completeness of 50% down to K' = 15.5 mag, which corresponds to ~10 M☉ for the early-type stars. This work increases the radial extent of reported OSIRIS/Keck measurements by more than a factor of three, from 4'' to 14'' (0.16 to 0.56 pc), along the projected disk of young stars. For our analysis, we implement a new method of completeness correction using a combination of star-planting simulations and Bayesian inference. We assign probabilities for the spectral type of every source detected in deep imaging down to K' = 15.5 mag using information from spectra, simulations, number counts, and the distribution of stars. The inferred radial surface-density profiles, Σ(R) ∝ R^(-Γ), for the young stars and late-type giants are consistent with earlier results (Γ_early = 0.93 ± 0.09, Γ_late = 0.16 ± 0.07). The late-type surface-density profile is approximately flat out to the edge of the survey. While the late-type stellar luminosity function is consistent with the Galactic bulge, the completeness-corrected luminosity function of the early-type stars has significantly more young stars at faint magnitudes compared with previous surveys of similar depth. This luminosity function indicates that the corresponding mass function of the young stars is likely less top-heavy than that inferred from previous surveys.
Quantum probability measures and tomographic probability densities
Amosov, GG; Man'ko, [No Value
2004-01-01
Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Gedye, Craig A; Hussain, Ali; Paterson, Joshua; Smrke, Alannah; Saini, Harleen; Sirskyj, Danylo; Pereira, Keira; Lobo, Nazleen; Stewart, Jocelyn; Go, Christopher; Ho, Jenny; Medrano, Mauricio; Hyatt, Elzbieta; Yuan, Julie; Lauriault, Stevan; Meyer, Mona; Kondratyev, Maria; van den Beucken, Twan; Jewett, Michael; Dirks, Peter; Guidos, Cynthia J; Danska, Jayne; Wang, Jean; Wouters, Bradly; Neel, Benjamin; Rottapel, Robert; Ailles, Laurie E
2014-01-01
Cell surface proteins have a wide range of biological functions, and are often used as lineage-specific markers. Antibodies that recognize cell surface antigens are widely used as research tools, diagnostic markers, and even therapeutic agents. The ability to obtain broad cell surface protein profiles would thus be of great value in a wide range of fields. There are however currently few available methods for high-throughput analysis of large numbers of cell surface proteins. We describe here a high-throughput flow cytometry (HT-FC) platform for rapid analysis of 363 cell surface antigens. Here we demonstrate that HT-FC provides reproducible results, and use the platform to identify cell surface antigens that are influenced by common cell preparation methods. We show that multiple populations within complex samples such as primary tumors can be simultaneously analyzed by co-staining of cells with lineage-specific antibodies, allowing unprecedented depth of analysis of heterogeneous cell populations. Furthermore, standard informatics methods can be used to visualize, cluster and downsample HT-FC data to reveal novel signatures and biomarkers. We show that the cell surface profile provides sufficient molecular information to classify samples from different cancers and tissue types into biologically relevant clusters using unsupervised hierarchical clustering. Finally, we describe the identification of a candidate lineage marker and its subsequent validation. In summary, HT-FC combines the advantages of a high-throughput screen with a detection method that is sensitive, quantitative, highly reproducible, and allows in-depth analysis of heterogeneous samples. The use of commercially available antibodies means that high quality reagents are immediately available for follow-up studies. HT-FC has a wide range of applications, including biomarker discovery, molecular classification of cancers, or identification of novel lineage specific or stem cell markers.
Electromagnetic response of the protective pellicle of Euglenoids: influence of the surface profile
Inchaussandague, Marina E.; Gigli, Miriam L.; Skigin, Diana C.; Tolivia, Analía; Conforti, Visitación
2015-03-01
In a recent paper we investigated, from an electromagnetic point of view, the role played by the pellicle of Euglenoids (unicellular aquatic organisms) in the protection of the cell against UV radiation [14]. By modelling the pellicle as a diffraction grating, we computed the electromagnetic response of different species that exhibit different behaviors against UV radiation. In this previous study, the pellicle profile was approximated by a sinusoidal grating. However, it has been observed in transversal cut images that the profiles are not exactly sinusoidal, and also vary from sample to sample. Since the electromagnetic response depends on the geometry of the grating, reflectance calculations that take into account a more accurate representation of the actual profile could provide more insight into this problem. In this paper we investigate the electromagnetic response of the pellicle of Euglenoids for different grating profiles. The diffraction problem is solved using the Chandezon method, which has demonstrated successful performance for deep gratings of arbitrary profiles. We analyze the influence of the shape, depth and period of the grating on the UV reflectance. We show that the pellicle characteristics are critical parameters in increasing the reflectance, thus reducing the penetration of UV radiation within the cell and therefore minimizing the damage and increasing the survival of these organisms.
Probable maximum flood control
International Nuclear Information System (INIS)
DeGabriele, C.E.; Wu, C.L.
1991-11-01
This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility
Toward a generalized probability theory: conditional probabilities
International Nuclear Information System (INIS)
Cassinelli, G.
1979-01-01
The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)
Cantero, Francisco; Castro-Orgaz, Oscar; Garcia-Marín, Amanda; Ayuso, José Luis; Dey, Subhasish
2015-10-01
Is the energy equation for gradually-varied flow the best approximation for free surface profile computations in river flows? Determination of flood inundation in rivers and natural waterways is based on the hydraulic computation of flow profiles. This is usually done using energy-based gradually-varied flow models, like HEC-RAS, which adopt a vertical division method for discharge prediction in compound channel sections. However, this discharge prediction method is not so accurate in light of the advances made over the last three decades. This paper first presents a study of the impact of discharge prediction on gradually-varied flow computations by comparing thirteen different methods for compound channels, in which both the energy and momentum equations are applied. The discharge, velocity distribution coefficients, specific energy, momentum and flow profiles are determined. After the study of gradually-varied flow predictions, a new theory is developed to produce higher-order energy and momentum equations for rapidly-varied flow in compound channels. These generalized equations make it possible to describe the flow profiles with more generality than gradually-varied flow computations. As an outcome, the results for gradually-varied flow provide realistic conclusions for computations of flow in compound channels, showing that momentum-based models are in general more accurate, whereas the new theory developed for rapidly-varied flow opens a new research direction, so far not investigated in flows through compound channels.
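A gradually-varied flow profile computation of the kind discussed here integrates dy/dx = (S0 - Sf) / (1 - Fr²) along the channel. The minimal sketch below assumes a wide rectangular channel with Manning friction and invented parameter values; it is far simpler than the compound-channel methods compared in the paper, but shows the core computation:

```python
# Gradually-varied flow in a wide rectangular channel (assumed geometry):
# integrate dy/dx = (S0 - Sf) / (1 - Fr^2) upstream from a control depth.
g, n = 9.81, 0.03          # gravity (m/s^2), Manning roughness (assumed)
S0, q = 0.001, 2.0         # bed slope, unit discharge (m^2/s), assumed

def dydx(y):
    Fr2 = q ** 2 / (g * y ** 3)             # Froude number squared
    Sf = (n * q) ** 2 / y ** (10.0 / 3.0)   # friction slope, wide channel
    return (S0 - Sf) / (1.0 - Fr2)

# Explicit Euler march upstream (dx < 0) from a downstream control depth;
# on this mild slope (M1 curve) the depth relaxes toward normal depth.
y, dx = 2.0, -10.0
profile = [y]
for _ in range(100):
    y += dydx(y) * dx
    profile.append(y)
```

Production codes use the standard-step (energy-balance) method with cross-section data rather than this pointwise Euler integration, but the backwater behavior is the same: the computed depths decrease monotonically upstream toward the normal depth.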
International Nuclear Information System (INIS)
Michitsugu Mori; Kenichi Tezuka; Yasushi Takeda
2006-01-01
Flow profile factors (PFs), which adjust measurements to real flow rates, also strongly depend on flow profiles. To determine profile factors for actual power plants, manufacturers of flowmeters usually conduct factory calibration tests under ambient flow conditions. Indeed, flow measurements of high accuracy for reactor feedwater require calibration tests under real conditions, such as liquid conditions and piping layouts. Moreover, as nuclear power plants age, the readings of flowmeters in reactor feedwater systems drift due to changes in flow profiles. These deviations are influenced by changes in the wall roughness of the inner surface of the piping. We have conducted experiments to quantify the effects of flow patterns on the PFs due to pipe roughness and asymmetric flow, and the results have shown the effects of elbows and pipe inner roughness, which strongly affect the creation of the flow patterns. These changes of flow patterns lead to large errors in measurements with transit-time (time-of-flight: TOF) ultrasonic flowmeters. In these experiments, changes of pipe roughness resulted in changes of the PFs with corresponding errors. Therefore, we must take those effects into account in order to measure feedwater flow rates with better accuracy in actual power plants. (authors)
Famiglietti, C.; Fisher, J.; Halverson, G. H.
2017-12-01
This study validates a method of remote sensing near-surface meteorology that vertically interpolates MODIS atmospheric profiles to surface pressure level. The extraction of air temperature and dew point observations at a two-meter reference height from 2001 to 2014 yields global moderate- to fine-resolution near-surface temperature distributions that are compared to geographically and temporally corresponding measurements from 114 ground meteorological stations distributed worldwide. This analysis is the first robust, large-scale validation of the MODIS-derived near-surface air temperature and dew point estimates, both of which serve as key inputs in models of energy, water, and carbon exchange between the land surface and the atmosphere. Results show strong linear correlations between remotely sensed and in-situ near-surface air temperature measurements (R2 = 0.89), as well as between dew point observations (R2 = 0.77). Performance is relatively uniform across climate zones. The extension of mean climate-wise percent errors to the entire remote sensing dataset allows for the determination of MODIS air temperature and dew point uncertainties on a global scale.
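The key operation described above, vertically interpolating an atmospheric profile to the surface pressure level, can be sketched as follows. All pressure levels, temperatures, and the surface pressure below are invented values, not MODIS data:

```python
import numpy as np

# Sketch: interpolate a temperature profile (pressure-level -> temperature)
# to the surface pressure level. All numbers are assumed for illustration.
levels = np.array([1000.0, 925.0, 850.0, 700.0])  # pressure levels (hPa)
temps = np.array([298.0, 293.5, 289.0, 280.0])    # temperature at levels (K)
p_surface = 962.0                                  # surface pressure (hPa)

# np.interp requires increasing sample points, so reverse both arrays
t_surface = np.interp(p_surface, levels[::-1], temps[::-1])
print(round(t_surface, 2))
```

In practice the interpolation is often done in log-pressure rather than linearly in pressure, and dew point is interpolated the same way to recover near-surface humidity.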
Huang, N. E.; Long, S. R.; Bliven, L. F.; Tung, C.-C.
1984-01-01
On the basis of the mapping method developed by Huang et al. (1983), an analytic expression for the non-Gaussian joint probability density function of slope and elevation for nonlinear gravity waves is derived. Various conditional and marginal density functions are also obtained through the joint density function. The analytic results are compared with a series of carefully controlled laboratory observations, and good agreement is noted. Furthermore, the laboratory wind wave field observations indicate that the capillary or capillary-gravity waves may not be the dominant components in determining the total roughness of the wave field. Thus, the analytic results, though derived specifically for the gravity waves, may have more general applications.
International Nuclear Information System (INIS)
Wang, R.
1981-03-01
Due to the complexity of the structural, microstructural and compositional characteristics of spent fuel, basic leaching and dissolution mechanisms were studied with UO₂ matrix material, specifically with single-crystal UO₂, to isolate individual contributory factors. The effects of oxidation and oxidation-dissolution were investigated in different oxidation conditions, such as in air, oxygenated solutions and deionized water containing H₂O₂. In addition, the effects of temperature on dissolution of UO₂ were studied in autoclaves at 75 and 150 °C. Also, oxidation and dissolution measurements were investigated via electrochemical methods to determine if those techniques could be applied to the characterization of leaching and dissolution of spent fuel in a hot cell. Finally, the effects of radiation were explored, since the radiolysis of water may create a localized oxidizing condition at or near the spent fuel-solution interface, even in neutral or reducing conditions as commonly found in deep geological environments. The oxidation and oxidation-dissolution mechanisms for UO₂ are proposed as follows: The UO₂ surface is first oxidized in solution to form a UO₂₊ₓ surface layer several angstroms thick. This oxidized surface has a high dissolution rate, since the UO₂₊ₓ reacts with the dissolved O₂, or H₂O₂, to form uranyl complex ions in a U(VI) state. As the uranyl ions exceed the solubility limits in solution, they become hydrolyzed to form solid deposits and suspended particles of UO₃ hydrates. The thickness and porosity of the deposited UO₃ hydrate surface film depend on temperature, pH and deposition time. The long-term dissolution rate is then determined by the nature of the surface film, such as its porosity, solubility and mechanical properties
1980-11-24
Cloud-code definitions (from Surface Marine Observations, Tape Deck TDF-11): 7 = Stratus fractus or cumulus fractus of bad weather, or both (pannus), usually below altostratus or nimbostratus; 8 = Cumulus and stratocumulus. The term "bad weather" denotes the conditions which generally exist during precipitation and a short time before and after.
Time-kill profiles and cell-surface morphological effects of crude ...
African Journals Online (AJOL)
MK1201 mycelial extract on the viability and cell surface morphology of methicillin-susceptible Staphylococcus aureus (MSSA) and methicillin-resistant Staphylococcus aureus (MRSA). Methods: Time-kill assays were conducted by incubating test ...
Buckland, Catherine; Bailey, Richard; Thomas, David
2017-04-01
Two billion people living in drylands are affected by land degradation. Sediment erosion by wind and water removes fertile soil and destabilises landscapes. Vegetation disturbance is a key driver of dryland erosion, caused by both natural and human forcings: drought, fire, land use, and grazing pressure. A quantified understanding of vegetation cover sensitivities and the resultant surface change under forcing factors is needed if the vegetation and landscape response to future climate change and human pressure are to be better predicted. Using quartz luminescence dating and statistical changepoint analysis (Killick & Eckley, 2014), this study demonstrates the ability to identify step-changes in the depositional age of near-surface sediments. Lx/Tx luminescence profiles coupled with statistical analysis show the use of near-surface sediments in providing a high-resolution record of recent system response and aeolian system thresholds. This research determines how the environment has recorded and retained sedimentary evidence of drought response and land use disturbances over the last two hundred years, across both individual landforms and the wider Nebraska Sandhills. Identifying surface deposition and comparing it with records of climate, fire and land use changes allows us to assess the sensitivity and stability of the surface sediment to a range of forcing factors. Killick, R. and Eckley, I. A. (2014) "changepoint: An R Package for Changepoint Analysis." Journal of Statistical Software, 58, 1-19.
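The changepoint idea used above (implemented in the cited R package) is, at its simplest, finding the split of a series that best separates two segment means. The toy Python analogue below uses invented data, not actual Lx/Tx luminescence profiles:

```python
# Toy single mean-shift changepoint detection: choose the split index that
# minimizes the total within-segment sum of squared errors. Data are invented.
data = [1.0, 1.2, 0.9, 1.1, 1.0, 4.1, 3.9, 4.2, 4.0, 4.1]

def sse(xs):
    """Sum of squared deviations of xs from its own mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

# Brute-force search over all possible split points
best = min(range(1, len(data)), key=lambda k: sse(data[:k]) + sse(data[k:]))
print(best)   # index of the step-change in mean -> 5
```

The R package extends this with efficient search algorithms (e.g. PELT), penalties against overfitting, and multiple changepoints; the objective being minimized is the same segment-cost idea.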
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using bucket, surface seawater intake, and XBT casts from several vessels in a world wide distribution from December 07, 1995...
Rosas, Jorge
2017-09-26
The land surface temperature (LST) represents a critical element in efforts to characterize global surface energy and water fluxes, as well as being an essential climate variable in its own right. Current satellite platforms provide a range of spatial and temporal resolution radiance data from which LST can be determined. One of the most complete records of data comes via the Landsat series of satellites, which provide a continuous sequence that extends back to 1982. However, for much of this time, Landsat thermal data were provided through a single broadband thermal channel, making surface temperature retrieval challenging. To fully exploit the valuable time-series of thermal information that is available from these satellites requires efforts to better describe and understand the accuracy of temperature retrievals. Here, we contribute to these efforts by examining the impact of atmospheric correction on the estimation of LST, using atmospheric profiles derived from a range of in-situ, reanalysis, and satellite data. Radiance data from the thermal infrared (TIR) sensor onboard Landsat 8 were converted to LST using the MODTRAN version 5.2 radiative transfer model, allowing the production of an LST time series based upon 28 Landsat overpasses. LST retrievals were then evaluated against in-situ thermal measurements collected over an arid zone farmland comprising both bare soil and vegetated surface types. Atmospheric profiles derived from AIRS, MOD07, ECMWF, NCEP, and balloon-based radiosonde data were used to drive the MODTRAN simulations. In addition to examining the direct impact of using various profile data on LST retrievals, randomly distributed errors were introduced into a range of forcing variables to better understand retrieval uncertainty. Results indicated differences in LST of up to 1 K for perturbations in emissivity and profile measurements, with the analysis also highlighting the challenges in modeling aerosol optical depth (AOD) over arid lands and
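The first step of a single-channel retrieval like the one described above is inverting the Planck function for top-of-atmosphere brightness temperature. A sketch, assuming the band-10 calibration constants typically distributed with Landsat 8 scene metadata, and with no atmospheric or emissivity correction applied:

```python
import math

# Landsat 8 TIRS band-10 Planck calibration constants as typically
# distributed in scene metadata (treat as illustrative here)
K1 = 774.8853    # W / (m^2 sr um)
K2 = 1321.0789   # K

def brightness_temperature(radiance):
    """Top-of-atmosphere brightness temperature (K) from band-10
    spectral radiance (W / (m^2 sr um)); no atmospheric or
    emissivity correction is applied."""
    return K2 / math.log(K1 / radiance + 1.0)

print(round(brightness_temperature(10.0), 1))
```

A full LST retrieval then corrects this brightness temperature for atmospheric transmittance, path radiance, and surface emissivity, which is where the MODTRAN simulations and atmospheric profiles enter.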
Rosas, Jorge; Houborg, Rasmus; McCabe, Matthew
2017-01-01
The land surface temperature (LST) represents a critical element in efforts to characterize global surface energy and water fluxes, as well as being an essential climate variable in its own right. Current satellite platforms provide a range of spatial and temporal resolution radiance data from which LST can be determined. One of the most complete records of data comes via the Landsat series of satellites, which provide a continuous sequence that extends back to 1982. However, for much of this time, Landsat thermal data were provided through a single broadband thermal channel, making surface temperature retrieval challenging. To fully exploit the valuable time-series of thermal information that is available from these satellites requires efforts to better describe and understand the accuracy of temperature retrievals. Here, we contribute to these efforts by examining the impact of atmospheric correction on the estimation of LST, using atmospheric profiles derived from a range of in-situ, reanalysis, and satellite data. Radiance data from the thermal infrared (TIR) sensor onboard Landsat 8 were converted to LST using the MODTRAN version 5.2 radiative transfer model, allowing the production of an LST time series based upon 28 Landsat overpasses. LST retrievals were then evaluated against in-situ thermal measurements collected over an arid zone farmland comprising both bare soil and vegetated surface types. Atmospheric profiles derived from AIRS, MOD07, ECMWF, NCEP, and balloon-based radiosonde data were used to drive the MODTRAN simulations. In addition to examining the direct impact of using various profile data on LST retrievals, randomly distributed errors were introduced into a range of forcing variables to better understand retrieval uncertainty. Results indicated differences in LST of up to 1 K for perturbations in emissivity and profile measurements, with the analysis also highlighting the challenges in modeling aerosol optical depth (AOD) over arid lands and
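The error-perturbation experiment described above can be caricatured with a toy Monte Carlo: perturb surface emissivity around a nominal value and propagate it through a simplified single-channel correction. The correction form (Artis-Carnahan style) and all numbers here are illustrative stand-ins for the full MODTRAN-based retrieval:

```python
import math, random

RHO = 1.438e-2   # h*c/k_B in m*K
LAM = 10.9e-6    # approximate TIRS band-10 effective wavelength, m

def emissivity_corrected_lst(t_b, eps):
    """Single-channel emissivity correction (Artis-Carnahan form),
    standing in for the full MODTRAN-based retrieval."""
    return t_b / (1.0 + (LAM * t_b / RHO) * math.log(eps))

random.seed(0)
t_b = 300.0                       # brightness temperature, K (hypothetical)
# emissivity perturbed by 0.01 (1 sigma) around a nominal 0.97
samples = [emissivity_corrected_lst(t_b, random.gauss(0.97, 0.01))
           for _ in range(10000)]
mean = sum(samples) / len(samples)
spread = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
# the resulting sub-kelvin spread is consistent in scale with the
# <= 1 K differences reported above
print(round(mean, 1), round(spread, 2))
```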
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
International Nuclear Information System (INIS)
Kumar, A.; Rao, K.S.; Srinivasan, M.
1983-01-01
The Trombay criticality formula (TCF) has been derived by incorporating a number of well-known concepts of criticality physics to enable prediction of changes in critical size or k_eff following alterations in geometrical and physical parameters of uniformly reflected small reactor assemblies characterized by large neutron leakage from the core. The variant parameters considered are size, shape, density and diluent concentration of the core, and density and thickness of the reflector. The effect of these changes (except core size) manifests itself through σ_c, the critical surface mass density of the "corresponding critical core." The quantity σ, the mass-to-surface-area ratio of the core, is essentially a measure of the product ρR extended to nonspherical systems and plays a dominant role in the TCF. The functional dependence of k_eff on σ/σ_c, the system size relative to critical, is expressed in the TCF through two alternative representations, namely the modified Wigner rational form and an exponential form, which is given
Sapriza-Azuri, Gonzalo; Gamazo, Pablo; Razavi, Saman; Wheater, Howard S.
2018-06-01
Arctic and subarctic regions are amongst the most susceptible regions on Earth to global warming and climate change. Understanding and predicting the impact of climate change in these regions require a proper process representation of the interactions between climate, carbon cycle, and hydrology in Earth system models. This study focuses on land surface models (LSMs) that represent the lower boundary condition of general circulation models (GCMs) and regional climate models (RCMs), which simulate climate change evolution at the global and regional scales, respectively. LSMs typically utilize a standard soil configuration with a depth of no more than 4 m, whereas for cold, permafrost regions, field experiments show that attention to deep soil profiles is needed to understand and close the water and energy balances, which are tightly coupled through the phase change. To address this gap, we design and run a series of model experiments with a one-dimensional LSM, called CLASS (Canadian Land Surface Scheme), as embedded in the MESH (Modélisation Environmentale Communautaire - Surface and Hydrology) modelling system, to (1) characterize the effect of soil profile depth under different climate conditions and in the presence of parameter uncertainty; (2) assess the effect of including or excluding the geothermal flux in the LSM at the bottom of the soil column; and (3) develop a methodology for temperature profile initialization in permafrost regions, where the system has an extended memory, by the use of paleo-records and bootstrapping. Our study area is in Norman Wells, Northwest Territories of Canada, where measurements of soil temperature profiles and historical reconstructed climate data are available. Our results demonstrate a dominant role for parameter uncertainty, which is often neglected in LSMs. Considering such high sensitivity to parameter values and dependency on the climate condition, we show that a minimum depth of 20 m is essential to adequately represent
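A minimal sketch of the kind of experiment described above: explicit 1-D heat conduction in a soil column with a fixed surface temperature and a prescribed geothermal flux at the base, rather than the zero-flux bottom common in shallow configurations. The thermal properties are assumed illustrative values, not CLASS/MESH defaults:

```python
import numpy as np

# illustrative soil thermal properties (assumed values, not CLASS defaults)
KAPPA = 1.0e-6   # thermal diffusivity, m^2 s^-1
K_TH = 2.0       # thermal conductivity, W m^-1 K^-1
Q_GEO = 0.06     # geothermal heat flux, W m^-2 (a typical continental value)

def run_column(depth=20.0, dz=0.5, dt=7200.0, t_surf=-3.0, years=30):
    """Explicit finite-difference heat conduction in a 1-D soil column:
    fixed surface temperature, prescribed geothermal flux at the base."""
    n = int(depth / dz) + 1
    T = np.full(n, t_surf)
    r = KAPPA * dt / dz**2              # explicit stability needs r <= 0.5
    steps = int(years * 365.0 * 86400.0 / dt)
    for _ in range(steps):
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + r * (T[2:] - 2.0*T[1:-1] + T[:-2])
        # ghost node carrying the upward geothermal flux through the base
        t_ghost = T[-2] + 2.0 * dz * Q_GEO / K_TH
        Tn[-1] = T[-1] + r * (t_ghost - 2.0*T[-1] + T[-2])
        T = Tn                          # surface node T[0] stays at t_surf
    return T

T = run_column()
# near steady state the column carries the gradient Q_GEO/K_TH = 0.03 K/m,
# so the 20 m base ends up ~0.6 K warmer than the surface
print(round(T[-1] - T[0], 2))
```

With a zero-flux bottom the same column would relax to a uniform profile, which illustrates why the choice of bottom boundary condition matters for deep permafrost temperatures.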
Paatero, Ilkka; Casals, Eudald; Niemi, Rasmus; Özliseli, Ezgi; Rosenholm, Jessica M; Sahlgren, Cecilia
2017-08-21
Mesoporous silica nanoparticles (MSNs) are extensively explored as drug delivery systems, but in-depth understanding of design-toxicity relationships is still scarce. We used zebrafish (Danio rerio) embryos to study toxicity profiles of differently surface-functionalized MSNs. Embryos with the chorion membrane intact, or dechorionated embryos, were incubated or microinjected with amino (NH2-MSNs), polyethyleneimine (PEI-MSNs), succinic acid (SUCC-MSNs) or polyethyleneglycol (PEG-MSNs) functionalized MSNs. Toxicity was assessed by viability and cardiovascular function. NH2-MSNs, SUCC-MSNs and PEG-MSNs were well tolerated; 50 µg/ml PEI-MSNs induced 100% lethality 48 hours post fertilization (hpf). Dechorionated embryos were more sensitive, and 10 µg/ml PEI-MSNs reduced viability to 5% at 96 hpf. Sensitivity to PEG- and SUCC-, but not NH2-MSNs, was also enhanced. Typically, cardiovascular toxicity was evident prior to lethality. Confocal microscopy revealed that PEI-MSNs penetrated into the embryos whereas PEG-, NH2- and SUCC-MSNs remained aggregated on the skin surface. Direct exposure of inner organs by microinjecting NH2-MSNs and PEI-MSNs demonstrated that the particles displayed similar toxicity, indicating that functionalization affects the toxicity profile by influencing penetrance through biological barriers. The data emphasize the need for careful analyses of toxicity mechanisms in relevant models and constitute an important knowledge step towards the development of safer and sustainable nanotherapies.
Zhang, Yueqing; Li, Qifeng; Lu, Yonglong; Jones, Kevin; Sweetman, Andrew J
2016-04-01
Hexabromocyclododecane (HBCDD) is a brominated flame retardant with a wide range of industrial applications, although little is known about its patterns of spatial distribution in soils in relation to industrial emissions. This study has undertaken a large-scale investigation around an industrialized coastal area of China, exploring the concentrations, spatial distribution and diastereoisomer profiles of HBCDD in 188 surface soils from 21 coastal cities in North China. The detection frequency was 100%, and concentrations of total HBCDD in the surface soils ranged from 0.123 to 363 ng g⁻¹ and averaged 7.20 ng g⁻¹, showing its ubiquitous existence at low levels. The spatial distribution of HBCDD exhibited a correlation with the location of known manufacturing facilities in Weifang, suggesting the production of HBCDD as a major emission source. Diastereoisomer profiles varied in different cities. Diastereoisomer compositions in soils were compared with emissions from HBCDD industrial activities, and correlations were found between them, which has potential for source identification. Although the contemporary concentrations of HBCDD in soils from the study were relatively low, HBCDD-containing products (expanded/extruded polystyrene insulation boards) would be a potential source after their service life, and attention needs to be paid to prioritizing large-scale waste management efforts.
Davidson, C M; Peters, N J; Britton, A; Brady, L; Gardiner, P H E; Lewis, B D
2004-01-01
Modern analytical techniques have been applied to investigate the nature of lead pipe corrosion products formed in pH adjusted, orthophosphate-treated, low alkalinity water, under supply conditions. Depth profiling and surface analysis have been carried out on pipe samples obtained from the water distribution system in Glasgow, Scotland, UK. X-ray diffraction spectrometry identified basic lead carbonate, lead oxide and lead phosphate as the principal components. Scanning electron microscopy/energy-dispersive x-ray spectrometry revealed the crystalline structure within the corrosion product and also showed spatial correlations existed between calcium, iron, lead, oxygen and phosphorus. Elemental profiling, conducted by means of secondary ion mass spectrometry (SIMS) and secondary neutrals mass spectrometry (SNMS) indicated that the corrosion product was not uniform with depth. However, no clear stratification was apparent. Indeed, counts obtained for carbonate, phosphate and oxide were well correlated within the depth range probed by SIMS. SNMS showed relationships existed between carbon, calcium, iron, and phosphorus within the bulk of the scale, as well as at the surface. SIMS imaging confirmed the relationship between calcium and lead and suggested there might also be an association between chloride and phosphorus.
Neutron activation analysis to the profile surface sediments from several sites on the Havana Bay
International Nuclear Information System (INIS)
Diaz Riso, O.; Gelen, A.; Lopez, N.; Gonzalez, H.; Manso, M.V.; Graciano, A.M.; Nogueira, C.A.; Beltran, J.; Soto, J.
2003-01-01
Instrumental neutron activation analysis (INAA) technique was employed to analyze the surface sediments from several sites on the Havana Bay, Cuba. Measurements of heavy and trace elements in the sediments are reported. The results show that the concentration of the elements is site dependent. The data suggest that an anthropogenic input into the bay from domestic sewage and industries occurred
Perfect Composition Depth Profiling of Ionic Liquid Surfaces Using High-Resolution RBS/ERDA.
Czech Academy of Sciences Publication Activity Database
Nakajima, K.; Zolboo, E.; Ohashi, T.; Lísal, Martin; Kimura, K.
2016-01-01
Vol. 32, No. 10 (2016), pp. 1089-1094. ISSN 0910-6340. R&D Projects: GA ČR(CZ) GA16-12291S. Institutional support: RVO:67985858. Keywords: surface structure; ionic liquid; hydrogen. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 1.228, year: 2016
Toe clearance and velocity profiles of young and elderly during walking on sloped surfaces
Directory of Open Access Journals (Sweden)
Begg Rezaul K
2010-04-01
Full Text Available Abstract Background Most falls in older adults are reported during locomotion, and tripping has been identified as a major cause of falls. Challenging environments (e.g., walking on slopes) are potential interventions for maintaining balance and gait skills. The aims of this study were: (1) to investigate whether or not the distributions of two important gait variables [minimum toe clearance (MTC) and foot velocity at MTC (VelMTC)] and locomotor control strategies are altered during walking on sloped surfaces, and (2) if altered, whether they are maintained in the two groups (young and elderly female groups). Methods MTC and VelMTC data during walking on a treadmill on sloped surfaces (+3°, 0° and -3°) were analysed for 9 young (Y) and 8 elderly (E) female subjects. Results MTC distributions were found to be positively skewed whereas VelMTC distributions were negatively skewed for both groups on all slopes. Median MTC values increased (Y = 33%, E = 7%) at the negative slope but decreased (Y = 25%, E = 15%) while walking on the positive slope surface compared to their MTC values at the flat surface (0°). Analysis of VelMTC distributions also indicated statistically significantly lower 25th percentile (Q1) values in the elderly at all slopes. Conclusion The young displayed a strong positive correlation between MTC median changes and IQR (interquartile range) changes due to walking on both slopes; however, such correlation was weak in the older adults, suggesting differences in the control strategies being employed to minimize the risk of tripping.
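The distribution descriptors used above (median, IQR, sample skewness) are straightforward to compute; the MTC sample below is synthetic and serves only to illustrate a positively skewed clearance distribution:

```python
import random

def summarize(sample):
    """Median, interquartile range and sample (Fisher) skewness --
    the distribution descriptors used for MTC and VelMTC above."""
    xs = sorted(sample)
    n = len(xs)
    def pct(p):                      # linear-interpolated percentile
        i = p * (n - 1)
        lo = int(i)
        hi = min(lo + 1, n - 1)
        return xs[lo] + (i - lo) * (xs[hi] - xs[lo])
    median, q1, q3 = pct(0.5), pct(0.25), pct(0.75)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return median, q3 - q1, m3 / m2 ** 1.5

random.seed(1)
# synthetic, positively skewed toe-clearance sample (mm)
mtc = [10.0 + random.lognormvariate(1.0, 0.6) for _ in range(2000)]
med, iqr, skew = summarize(mtc)
print(skew > 0)  # -> True: positively skewed, as reported for MTC
```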
Wilde, Markus; Ohno, Satoshi; Ogura, Shohei; Fukutani, Katsuyuki; Matsuzaki, Hiroyuki
2016-03-29
Nuclear reaction analysis (NRA) via the resonant ¹H(¹⁵N,αγ)¹²C reaction is a highly effective method of depth profiling that quantitatively and non-destructively reveals the hydrogen density distribution at surfaces, at interfaces, and in the volume of solid materials with high depth resolution. The technique applies a ¹⁵N ion beam of 6.385 MeV provided by an electrostatic accelerator and specifically detects the ¹H isotope at depths up to about 2 μm from the target surface. Surface H coverages are measured with a sensitivity on the order of ~10¹³ cm⁻² (~1% of a typical atomic monolayer density) and H volume concentrations with a detection limit of ~10¹⁸ cm⁻³ (~100 at. ppm). The near-surface depth resolution is 2-5 nm for surface-normal ¹⁵N ion incidence onto the target and can be enhanced to values below 1 nm for very flat targets by adopting a surface-grazing incidence geometry. The method is versatile and readily applied to any high-vacuum-compatible homogeneous material with a smooth surface (no pores). Electrically conductive targets usually tolerate the ion beam irradiation with negligible degradation. Hydrogen quantitation and correct depth analysis require knowledge of the elementary composition (besides hydrogen) and mass density of the target material. Especially in combination with ultra-high vacuum methods for in-situ target preparation and characterization, ¹H(¹⁵N,αγ)¹²C NRA is ideally suited for hydrogen analysis at atomically controlled surfaces and nanostructured interfaces. We exemplarily demonstrate here the application of ¹⁵N NRA at the MALT Tandem accelerator facility of the University of Tokyo to (1) quantitatively measure the surface coverage and the bulk concentration of hydrogen in the near-surface region of a H₂-exposed Pd(110) single crystal, and (2) determine the depth location and layer density of hydrogen near the interfaces of thin SiO₂ films on Si(100).
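The depth scale of resonant NRA follows from the energy loss of the incident ions: raising the beam energy above the 6.385 MeV resonance places the resonance deeper in the target. A sketch, assuming a constant, purely illustrative stopping power:

```python
def probing_depth(e_beam_mev, e_res_mev=6.385, stopping_kev_per_nm=3.9):
    """Depth (nm) at which 15N ions slowed from the beam energy reach
    the resonance energy; the stopping power is an assumed constant
    for illustration, not a tabulated value for any real target."""
    return (e_beam_mev - e_res_mev) * 1000.0 / stopping_kev_per_nm

# 117 keV above the resonance probes ~30 nm under these assumptions
print(round(probing_depth(6.502), 1))
```

Scanning the beam energy therefore scans depth, and the measured γ yield at each energy maps directly onto the hydrogen concentration at the corresponding depth.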
DEFF Research Database (Denmark)
Hokland, M; Hokland, P; Heron, I
1978-01-01
By means of simple rosette sedimentation methods, two subsets of human peripheral blood lymphocytes have been isolated: (1) (E, Fc)- and (2) (E, Ig)-. The first subset was obtained by centrifuging suspensions of macrophage-depleted PBL in which E and EA rosettes had been allowed to form simultaneously. The dominant marker of these E- Fc- cells was surface Ig, and during 4 days of culture this population did not alter its surface markers. Subset 2 was obtained in two ways following rosette centrifugation with AET-treated SRBC and rabbit anti-human Ig-coated autologous RBC. This 'Null cell...
Energy Technology Data Exchange (ETDEWEB)
Ermolaev, G V; Kovalev, O B [Khristianovich Institute of Theoretical and Applied Mechanics, Siberian Branch of Russian Academy of Sciences, Institutskaya Str 4/1, Novosibirsk, 630090 (Russian Federation)
2009-09-21
A physicomathematical model of cyclic iron combustion in an oxygen flow during oxygen laser cutting of metal sheets is developed. The combustion front is set into motion by focused laser radiation and a heterogeneous oxidation reaction in oxygen. The burning rate is limited by oxygen supply from the gas phase towards the metal surface, and the interface motion depends on the local temperature. A 3D numerical simulation predicts wavy structures on the metal surface; their linear sizes depend on the scanning speed of the laser beam, the thickness of the produced liquid oxide film and the parameters of the oxygen jet flow. Simulation results help in understanding the mechanism of striation formation during oxygen gas-laser cutting of mild steel and are in qualitative agreement with experimental findings.
Chlorhexidine controlled-release profile after EDTA root surface etching: an in vivo study.
Gamal, Ahmed Y; Kumper, Radi M; Sadek, Hesham S; El Destawy, Mahmoud T
2011-05-01
The main objective of the present study was to quantify chlorhexidine (CHX) release after the use of CHX-EDTA root surface treatment as a local-delivery antimicrobial vehicle. Twenty non-smoking patients clinically diagnosed with moderate-to-severe chronic periodontitis were selected to participate in this study. After cause-related therapy, one site in every patient received defect overfill with 2% CHX gel (20 sites). In addition, twenty contralateral sites received defect fill of CHX gel after 3 minutes of root surface etching with 24% EDTA gel (20 sites). Gingival crevicular fluid samples were collected at 1, 3, 7, and 14 days post-therapy. The CHX-EDTA group showed statistically significantly higher levels of CHX than the control group at 1, 3, and 7 days. At 14 days, the CHX-EDTA group showed CHX levels of 0.8 mg/mL. The use of CHX-EDTA root surface treatment as a local-delivery antimicrobial improves CHX substantivity.
Twichell, David C.; Kenyon, Neil H.; Parson, Lindsay M.; McGregor, Bonnie A.
1991-01-01
GLORIA long-range side-scan sonar imagery and 3.5-kHz seismic-reflection profiles depict a series of nine elongate deposits with generally high-backscatter surfaces covering most of the latest fanlobe sequence of the Mississippi Fan in the eastern Gulf of Mexico. The youngest deposit is a "slump" that covers a 250 by 100 km area of the middle and upper fan. The remaining mapped deposits, termed depositional lobes, are long (as much as 200 km) and relatively thin (less than 35 m thick) bodies. Small channels and lineations on the surface of many of these depositional lobes radiate from a single, larger main channel that is the conduit through which sediment has been supplied to these surficial deposits on the fan. The 3.5-kHz profiles show that adjacent depositional lobes overlap one another rather than interfingering, indicating that only one lobe was an active site of deposition at a time. Shifting of the depositional sites appears to be caused by both aggradation and avulsion. The chronology developed from the overlapping relations indicates the oldest of the mapped depositional lobes are on the lowermost fan, and the youngest are further up the fan. Depositional lobes on the lower fan consist of a series of smaller, elongate features with high-backscatter surfaces (5-40 km in length) located at the ends of previously unrecognized small channels. Turbidity currents and/or debris flows, sand flows, or mud flows appear to be the dominant transport process constructing these depositional lobes. Channelized flow is an important mechanism for transporting sediment away from the main channel on this fan, and the resulting facies created by these small flows are laterally discontinuous.
International Nuclear Information System (INIS)
Fraassen, B.C. van
1979-01-01
The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)
The quantum probability calculus
International Nuclear Information System (INIS)
Jauch, J.M.
1976-01-01
The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
Suru, Marius-Gabriel; Paraschiv, Adrian-Liviu; Lohan, Nicoleta Monica; Pricop, Bogdan; Ozkal, Burak; Bujoreanu, Leandru-Gheorghe
2014-07-01
The present work reports the influence of the loading mode provided during training under constant stress, in bending, applied to lamellar specimens of Cu-Zn-Al shape memory alloys (SMAs). During training, the specimens were bent by a load fastened at their free end while being martensitic at room temperature, and they lifted the load by the one-way effect (1WE) during heating up to the austenitic field. On cooling to the martensite field, the lower concave surface of the bent specimens was compressed, and during heating it was elongated, being subjected to a series of tension-compression cycles during heating-cooling, respectively. Conversely, the upper convex surface of the bent specimens was elongated during cooling and compressed during heating, being subjected to compression-tension cycles. Furthermore, 2WE-trained actuators were tested by means of a hydraulic installation where, this time, heating-cooling cycles were performed in oil. Considering that the lower concave surface of the specimens was kept in a compressed state, while the upper convex surface was kept in an elongated state, the study reveals the influence of the two loading modes and environments on the width of martensite plates of specimens trained under various numbers of cycles. For this purpose, Cu-Zn-Al specimens trained under 100-300-500 cycles were prepared and analyzed by atomic force microscopy (AFM) as well as optical and scanning electron microscopy (OM and SEM, respectively). The analysis also included AFM micrographs corroborated with statistical evaluations in order to reveal the effects of loading mode (tension or compression) and environmental conditions on the surface profile characteristics of the martensite plates, revealed by electropolishing.
Probability of satellite collision
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
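A generic short-encounter sketch of the quantity involved: the bivariate Gaussian relative-position uncertainty integrated over the combined hard-body circle in the encounter plane. This is a standard textbook formulation, not the specific parametric model of the paper, and all numbers are hypothetical:

```python
import math

def collision_probability(miss_x, miss_y, sx, sy, radius, n=200):
    """Integrate a bivariate Gaussian miss-distance density over the
    combined hard-body circle (midpoint rule on an x-sliced disk)."""
    total = 0.0
    dx = 2.0 * radius / n
    norm = 1.0 / (2.0 * math.pi * sx * sy)
    for i in range(n):
        x = -radius + (i + 0.5) * dx
        half = math.sqrt(radius * radius - x * x)
        dy = 2.0 * half / n
        for j in range(n):
            y = -half + (j + 0.5) * dy
            total += norm * math.exp(-0.5 * (((x - miss_x) / sx) ** 2
                                             + ((y - miss_y) / sy) ** 2)) * dx * dy
    return total

# 100 m miss distance, 50 m (1-sigma) position uncertainty per axis,
# 10 m combined hard-body radius -- all numbers hypothetical
pc = collision_probability(100.0, 0.0, 50.0, 50.0, 10.0)
print(f"{pc:.2e}")
```

Summing such per-encounter probabilities over the satellite population, weighted by encounter rates, gives a total collision probability of the kind parameterized in the study.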
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended ...
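The gradient property stated above is easy to verify numerically for the simplest ARUM, the multinomial logit, whose CPGF is the log-sum-exp function (the utilities below are hypothetical):

```python
import math

def cpgf_logit(u):
    """CPGF of the multinomial logit ARUM: G(u) = log(sum_i exp(u_i))."""
    m = max(u)
    return m + math.log(sum(math.exp(ui - m) for ui in u))

def grad_cpgf(u, h=1e-6):
    """Numerical gradient of the CPGF via central differences."""
    g = []
    for i in range(len(u)):
        up = list(u); up[i] += h
        dn = list(u); dn[i] -= h
        g.append((cpgf_logit(up) - cpgf_logit(dn)) / (2.0 * h))
    return g

u = [1.0, 0.0, -0.5]                  # systematic utilities (hypothetical)
grad = grad_cpgf(u)
logit = [math.exp(ui - cpgf_logit(u)) for ui in u]
# the CPGF gradient reproduces the logit choice probabilities
print(all(abs(g - p) < 1e-4 for g, p in zip(grad, logit)))  # -> True
```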
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
New twist in the optical schematic of surface slope measuring long trace profiler
Nikitin, Sergey M.; Gevorkyan, Gevork S.; McKinney, Wayne R.; Lacey, Ian; Takacs, Peter Z.; Yashchuk, Valeriy V.
2017-09-01
The advent of fully coherent free-electron lasers and diffraction-limited synchrotron storage-ring sources of x-rays is catalyzing the development of new, ultra-high-accuracy metrology methods. To fully exploit the potential of these sources, metrology needs to be capable of determining the figure of an optical element with sub-nanometer height accuracy. Currently, the two most prevalent slope-measuring instruments used for characterization of x-ray optics are the autocollimator-based nanometer optical measuring device (NOM) and the long trace profiler (LTP) using pencil beam interferometry (PBI). These devices have been consistently improved upon by the x-ray optics metrology community, but appear to be approaching their metrological limits. Here, we revise the traditional optical schematic of the LTP. We show experimentally that, at the level of accuracy desired for metrology with state-of-the-art x-ray optics, the Dove prism in the LTP reference channel appears to be one of the major sources of instrumental error. We therefore suggest returning to the original PBI LTP schematic, with no Dove prism in the reference channel. In this case, the optimal scanning strategies [Yashchuk, Rev. Sci. Instrum. 80, 115101 (2009)] used to suppress instrumental drift error have to be used to suppress a possible drift error associated with laser-beam pointing instability. We demonstrate, experimentally and by numerical simulation, the usefulness of the suggested approach for measurements of x-ray optics in both face-up and face-down orientations.
Surface-seismic imaging for NEHRP soil profile classifications and earthquake hazards in urban areas
Williams, R.A.; Stephenson, W.J.; Odum, J.K.
1998-01-01
We acquired high-resolution seismic-refraction data on the ground surface in selected areas of the San Fernando Valley (SFV) to help explain the earthquake damage patterns and the variation in ground motion caused by the 17 January 1994 magnitude 6.7 Northridge earthquake. We used these data to determine the compressional- and shear-wave velocities (Vp and Vs) at 20 aftershock recording sites to 30-m depth (Vs30 and Vp30). Two other sites, located next to boreholes with downhole Vp and Vs data, show that we imaged very similar seismic-velocity structures in the upper 40 m. Overall, high site response appears to be associated with low Vs in the near surface, but there can be a wide range of site amplifications for a given NEHRP soil type. The data suggest that for the SFV, if the Vs30 is known, we can determine whether the earthquake ground motion will be amplified above a factor of 2 relative to a local rock site.
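The Vs30 used in the abstract above is the travel-time-averaged shear-wave velocity of the top 30 m, and the NEHRP soil class is read directly off it. A minimal sketch of both (the layer thicknesses and velocities below are hypothetical, not the paper's data):

```python
def vs30(thicknesses_m, velocities_ms):
    """Travel-time-averaged shear-wave velocity of the top 30 m:
    Vs30 = 30 / sum(h_i / Vs_i), truncating the layer stack at 30 m depth."""
    remaining = 30.0
    travel_time = 0.0
    for h, vs in zip(thicknesses_m, velocities_ms):
        if remaining <= 0:
            break
        h_used = min(h, remaining)      # only the part above 30 m counts
        travel_time += h_used / vs
        remaining -= h_used
    if remaining > 0:                   # profile shallower than 30 m: extend last layer
        travel_time += remaining / velocities_ms[-1]
    return 30.0 / travel_time

def nehrp_class(v):
    """NEHRP site class from Vs30 (boundaries in m/s)."""
    if v > 1500: return "A"
    if v > 760:  return "B"
    if v > 360:  return "C"
    if v > 180:  return "D"
    return "E"

# Hypothetical two-layer profile: 10 m at 200 m/s over 20 m at 400 m/s.
v = vs30([10.0, 20.0], [200.0, 400.0])   # -> 300 m/s, class D
```

The harmonic (travel-time) average, not the arithmetic mean, is what matters here: a slow near-surface layer dominates the result, which is consistent with the abstract's link between low near-surface Vs and high site response.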
Directory of Open Access Journals (Sweden)
Ming-Hui Yang
2016-01-01
Full Text Available The microenvironment of neuron cells plays a crucial role in regulating neural development and regeneration. Hyaluronic acid (HA) biomaterial has been applied in a wide range of medical and biological fields and plays important roles in neural regeneration. PC12 cells have been reported to be capable of endogenous NGF synthesis and secretion. The purpose of this research was to assess the effect of HA biomaterial combined with PC12 cells conditioned media (PC12 CM) in neural regeneration. Using SH-SY5Y cells as an experimental model, we found that supplementing with PC12 CM enhanced HA function in SH-SY5Y cell proliferation and adhesion. Through RP-nano-UPLC-ESI-MS/MS analyses, we identified increased expression of HSP60 and RanBP2 in SH-SY5Y cells grown on the HA-modified surface with cotreatment of PC12 CM. Moreover, we also identified factors that were secreted from PC12 cells and may promote SH-SY5Y cell proliferation and adhesion. Here, we proposed a biomaterial surface enriched with neurotrophic factors for nerve regeneration application.
Lectin binding profiles of SSEA-4 enriched, pluripotent human embryonic stem cell surfaces
Venable, Alison; Mitalipova, Maisam; Lyons, Ian; Jones, Karen; Shin, Soojung; Pierce, Michael; Stice, Steven
2005-01-01
Background Pluripotent human embryonic stem cells (hESCs) have the potential to form every cell type in the body. These cells must be appropriately characterized prior to differentiation studies or when defining characteristics of the pluripotent state. Some developmentally regulated cell surface antigens identified by monoclonal antibodies in a variety of species and stem cell types have proven to be side chains of membrane glycolipids and glycoproteins. Therefore, to examine hESC surfaces for other potential pluripotent markers, we used a panel of 14 lectins, which were chosen based on their specificity for a variety of carbohydrates and carbohydrate linkages, along with stage specific embryonic antigen-4 (SSEA-4), to determine binding quantitation by flow cytometry and binding localization in adherent colonies by immunocytochemistry. Results Enriching cells for SSEA-4 expression increased the percentage of SSEA-4 positive cells to 98–99%. Using enriched high SSEA-4-expressing hESCs, we then analyzed the binding percentages of selected lectins and found a large variation in binding percentages ranging from 4% to 99% binding. Lycopersicon esculentum (tomato) lectin (TL), Ricinus communis agglutinin (RCA), and Concanavalin A (Con A) bound to SSEA-4 positive regions of hESCs and with similar binding percentages as SSEA-4. In contrast, we found Dolichos biflorus agglutinin (DBA) and Lotus tetragonolobus lectin (LTL) did not bind to hESCs while Phaseolus vulgaris leuco-agglutinin (PHA-L), Vicia villosa agglutinin (VVA), Ulex europaeus agglutinin (UEA), Phaseolus vulgaris erythro-agglutinin (PHA-E), and Maackia amurensis agglutinin (MAA) bound partially to hESCs. These binding percentages correlated well with immunocytochemistry results. Conclusion Our results provide information about types of carbohydrates and carbohydrate linkages found on pluripotent hESC surfaces. We propose that TL, RCA and Con A may be used as markers that are associated with the pluripotent
Determination of line profiles on nano-structured surfaces using EUV and x-ray scattering
Soltwisch, Victor; Wernecke, Jan; Haase, Anton; Probst, Jürgen; Schoengen, Max; Krumrey, Michael; Scholze, Frank; Pomplun, Jan; Burger, Sven
2014-09-01
Non-imaging techniques like X-ray scattering are expected to play an important role in the further development of CD metrology for the semiconductor industry. Grazing Incidence Small Angle X-ray Scattering (GISAXS) provides directly assessable information on structure roughness and long-range periodic perturbations. The disadvantage of the method is the large footprint of the X-ray beam on the sample due to the extremely shallow angle of incidence. This can be overcome by using wavelengths in the extreme ultraviolet (EUV) spectral range, EUV small angle scattering (EUV-SAS), which allows for much steeper angles of incidence but preserves the range of momentum transfer that can be observed. Generally, the potentially higher momentum transfer at shorter wavelengths is counterbalanced by decreasing diffraction efficiency. This results in a practical limit of about 10 nm pitch for which it is possible to observe at least the +/- 1st diffraction orders with reasonable efficiency. At the Physikalisch-Technische Bundesanstalt (PTB), the available photon energy range extends from 50 eV up to 10 keV at two adjacent beamlines. PTB commissioned a new versatile Ellipso-Scatterometer which is capable of measuring 6" square substrates in a clean, hydrocarbon-free environment with full flexibility regarding the direction of the incident light polarization. The reconstruction of line profiles using a geometrical model with six free parameters, based on a finite element method (FEM) Maxwell solver and a particle swarm based least-squares optimization yielded consistent results for EUV-SAS and GISAXS. In this contribution we present scatterometry data for line gratings and consistent reconstruction results of the line geometry for EUV-SAS and GISAXS.
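The particle-swarm least-squares reconstruction mentioned above can be sketched in a few lines. The forward model below is a made-up smooth function standing in for the FEM Maxwell solver, and all swarm constants and parameter names are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward model: "diffraction efficiencies" as a smooth function
# of two line-profile parameters (stand-ins for height and sidewall angle).
def forward(p):
    h, a = p
    return np.array([np.sin(h) + 0.1 * a, np.cos(a) - 0.05 * h, 0.01 * h * a])

measured = forward(np.array([1.2, 0.4]))          # synthetic "measurement"
chi2 = lambda p: float(np.sum((forward(p) - measured) ** 2))

# Minimal global-best particle swarm minimizing the least-squares objective.
n, iters = 30, 300
pos = rng.uniform(0.0, 2.0, size=(n, 2))          # initial swarm in a 2D box
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([chi2(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n, 1))
    # inertia + cognitive (personal best) + social (global best) terms
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([chi2(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()
```

In the real instrument the objective compares measured and simulated diffraction orders over six profile parameters; the swarm structure (track personal bests, pull toward the global best) is the same.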
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...
Directory of Open Access Journals (Sweden)
Kumar Sukhdeo
Full Text Available Colon cancer is a deadly disease affecting millions of people worldwide. Current treatment challenges include management of disease burden as well as improvements in detection and targeting of tumor cells. To identify disease state-specific surface antigen signatures, we combined fluorescent cell barcoding with high-throughput flow cytometric profiling of primary and metastatic colon cancer lines (SW480, SW620, and HCT116. Our multiplexed technique offers improvements over conventional methods by permitting the simultaneous and rapid screening of cancer cells with reduced effort and cost. The method uses a protein-level analysis with commercially available antibodies on live cells with intact epitopes to detect potential tumor-specific targets that can be further investigated for their clinical utility. Multiplexed antibody arrays can easily be applied to other tumor types or pathologies for discovery-based approaches to target identification.
Xu, Hui Fang; Sun, Wen; Han, Xin Feng
2018-06-01
An analytical model of surface potential profiles and transfer characteristics for hetero stacked tunnel field-effect transistors (HS-TFETs) is presented for the first time, where the hetero stack is composed of two materials with different bandgaps; the bandgap of the underlying layer is smaller than that of the upper layer. Under different device parameters (upper layer thickness, underlying layer thickness, and hetero stacked materials) and temperatures, the validity of the model is demonstrated by the agreement of its results with simulation results. Moreover, the results show that HS-TFETs can achieve superior performance, with relatively slow changes of subthreshold swing (SS) over a wide drain current range, a steep average subthreshold swing, high on-state current, and a large on–off state current ratio.
Chao, David F.; McQuillen, J. B.; Sankovic, J. M.; Zhang, Nengli
2009-01-01
As discovered by recent studies, what directly affects wetting and spreading is the curvature in the micro-region rather than the macroscopic contact angle. Measuring the profile of the micro-region has therefore become an important research topic. Recently, catastrophe optics has been applied to this kind of measurement. The optical catastrophe occurring in the far field of waves of a liquid-refracted laser beam carries a wealth of information about the liquid spreading, not only for liquid drops but also for films. When a parallel laser beam passes through a liquid film on a slide glass at the three-phase line (TPL), very interesting optical image patterns occur on a screen far from the film. An analysis based on catastrophe optics discloses and interprets the formation of these optical image patterns. The analysis reveals that the caustic line manifested as the bright-thick line on the screen corresponds to the lowest hierarchy of optical catastrophes, called the fold caustic. This optical catastrophe is produced by the inflexion line on the liquid surface at the liquid foot, which is formed not only in the spreading of drops but also in the spreading of films. The generalized catastrophe optics method enables identification of the edge profiles and determination of the edge foot height of liquid films. Keywords: Crossover region, Inflexion line, Liquid edge foot, Catastrophe optics, Caustic and diffraction
Rocchi, Paolo
2014-01-01
The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.
Iwahana, G.; Wilson, C.; Newman, B. D.; Heikoop, J. M.; Busey, R.
2017-12-01
Wetlands associated with ice-wedge polygons are commonly distributed across the Arctic Coastal Plain of northern Alaska, a region underlain by continuous permafrost. The micro-topography of the ice-wedge polygons controls local hydrology, and this micro-topography can be altered by factors such as surface vegetation, wetness, freeze-thaw cycles, and permafrost degradation/aggradation under climate change. Understanding the status of the wetlands in the near future is important because it determines the biogeochemical cycle, which drives the release of greenhouse gases from the ground. However, the transitional regime of the ice-wedge polygons under the changing climate is not fully understood. In this study, we analyzed the geochemistry of water extracted from frozen soil cores sampled down to about 1 m depth in March 2014 at NGEE-Arctic sites in the Barrow Environmental Observatory. The cores were sampled from troughs/rims/centers of five different low-centered or flat-centered polygons. The frozen cores were divided into 5-10 cm sections for each location and thawed in sealed plastic bags, and the extracted water was stored in vials. Comparison between the geochemistry profiles indicated a connection of soil water in the active layer at different locations in a polygon, while it revealed that distinctly different water has been stored in the permafrost layer at the troughs/rims/centers of some polygons. Profiles of volumetric water content (VWC) showed clear signals of freeze-up desiccation in the middle of saturated active layers as low VWC anomalies at most sampling points. Water in the active layer and near-surface permafrost was classified into four categories: ice wedge / fresh meteoric / transitional / highly fractionated water. The overall results suggest prolonged separation of water in the active layer at the center of low-centered polygons without a lateral connection in the water path in the past.
International Nuclear Information System (INIS)
Abdeen, Mostafa A. M.
2008-01-01
Most of the open water irrigation channels in Egypt suffer from the infestation of aquatic weeds, especially the submerged ones that cause numerous hydraulic problems for the open channels themselves and their water distributaries such as increasing water losses, obstructing water flow, and reducing channels' water distribution efficiencies. Accurate simulation and prediction of flow behavior in such channels is very essential for water distribution decision makers. Artificial neural networks (ANN) have proven to be very successful in the simulation of several physical phenomena, in general, and in the water research field in particular. Therefore, the current study aims to introduce the utilization of ANN in simulating the impact of vegetation in a main open channel, which supplies water to different distributaries, on the water surface profile in this main channel. Specifically, the study presented in the current paper utilizes the ANN technique to develop various models that simulate the impact of different submerged weeds' densities, different flow discharges, and different distributaries operation scheduling on the water surface profile in an experimental main open channel that supplies water to different distributaries. In the investigated experiment, the submerged weeds were simulated as branched flexible elements. The investigated experiment was considered as an example for implementing the same methodology and technique in a real open channel system. The results showed that the ANN technique is very successful in simulating the flow behavior of the aforementioned open channel experiment with the existence of the submerged weeds. In addition, the developed ANN models were capable of predicting the open channel flow behavior in all the submerged weeds' cases that were considered in the ANN development process.
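The kind of ANN described above maps flow conditions to a water surface profile. A minimal one-hidden-layer sketch trained by plain gradient descent follows; the inputs, target function, and network size are synthetic stand-ins, since the paper's experimental data and architecture are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data (hypothetical): two inputs standing in for weed
# density and discharge, one smooth nonlinear "water depth" response.
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (0.5 + 0.3 * X[:, 0] + 0.2 * np.sin(3.0 * X[:, 1]))[:, None]

# One hidden tanh layer, linear output, full-batch gradient descent on MSE.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                      # hidden activations
    pred = h @ W2 + b2
    err = pred - y                                # gradient of squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)            # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

The study's models would replace the synthetic targets with measured water surface profiles and would typically hold out data for validation; the training loop itself is the same shape.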
Chen, Yuan-Liu; Xu, Yanhao; Shimizu, Yuki; Matsukuma, Hiraku; Gao, Wei
2018-06-01
This paper presents a high quality-factor (Q-factor) quartz tuning fork (QTF) with a glass probe attached, used in frequency modulation tapping mode atomic force microscopy (AFM) for the surface profile metrology of micro and nanostructures. Unlike conventionally used QTFs, which have tungsten or platinum probes for tapping mode AFM, and suffer from a low Q-factor influenced by the relatively large mass of the probe, the glass probe, which has a lower density, increases the Q-factor of the QTF probe unit allowing it to obtain better measurement sensitivity. In addition, the process of attaching the probe to the QTF with epoxy resin, which is necessary for tapping mode AFM, is also optimized to further improve the Q-factor of the QTF glass probe. The Q-factor of the optimized QTF glass probe unit is demonstrated to be very close to that of a bare QTF without a probe attached. To verify the effectiveness and the advantages of the optimized QTF glass probe unit, the probe unit is integrated into a home-built tapping mode AFM for conducting surface profile measurements of micro and nanostructures. A blazed grating with fine tool marks of 100 nm, a microprism sheet with a vertical amplitude of 25 µm and a Fresnel lens with a steep slope of 90 degrees are used as measurement specimens. From the measurement results, it is demonstrated that the optimized QTF glass probe unit can achieve higher sensitivity as well as better stability than conventional probes in the measurement of micro and nanostructures.
Directory of Open Access Journals (Sweden)
Xia Qinxiang
2016-01-01
Full Text Available Over thinning is a serious defect influencing the forming quality of the spun workpiece during multi-pass deep drawing spinning. The surface-profile and movement-path of the roller are the key factors influencing the thinning ratio of the wall thickness of the spun workpiece. The influence of the surface-profile and movement-path of the roller on thickness thinning was studied based on numerical simulation and experimental research; four groups of forming experiments were carried out under combinations of different surface-profiles of the roller (R12 and R25-12) and movement-paths of the roller (spinning from the bottom of the blank and spinning from the middle of the blank). The results show that both the surface-profile and movement-path of the roller have a great influence on wall thickness thinning during multi-pass deep drawing spinning; and compared with the movement-path of the roller, the influence of the surface-profile of the roller is more significant. The experimental results conform well to the simulated ones. It indicates that the FEA model established is reasonable and reliable.
Matias-Guiu, Pau; Rodríguez-Bencomo, Juan José; Pérez-Correa, José R; López, Francisco
2018-04-15
Developing new distillation strategies can help the spirits industry to improve quality, safety and process efficiency. Batch stills equipped with a packed column and an internal partial condenser are an innovative experimental system, allowing a fast and flexible management of the rectification. In this study, the impact of four factors (heart-cut volume, head-cut volume, pH and cooling flow rate of the internal partial condenser during the head-cut fraction) on 18 major volatile compounds of Muscat spirits was optimized using response surface methodology and desirability function approaches. Results have shown that high rectification at the beginning of the heart-cut enhances the overall positive aroma compounds of the product, reducing off-flavor compounds. In contrast, optimum levels of heart-cut volume, head-cut volume and pH factors varied depending on the process goal. Finally, three optimal operational conditions (head off-flavors reduction, flowery terpenic enhancement and fruity ester enhancement) were evaluated by chemical and sensory analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
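The desirability-function approach used above combines several responses into a single score to optimize: each response gets an individual desirability in [0, 1], and the composite is their geometric mean. A minimal Derringer-style sketch (the bounds and weights below are illustrative, not the study's):

```python
def d_maximize(y, lo, hi, weight=1.0):
    """Larger-is-better desirability: 0 at or below lo, 1 at or above hi,
    a power-law ramp in between (e.g. for a positive aroma compound)."""
    if y <= lo: return 0.0
    if y >= hi: return 1.0
    return ((y - lo) / (hi - lo)) ** weight

def d_minimize(y, lo, hi, weight=1.0):
    """Smaller-is-better desirability: 1 at or below lo, 0 at or above hi
    (e.g. for an off-flavor compound)."""
    if y <= lo: return 1.0
    if y >= hi: return 0.0
    return ((hi - y) / (hi - lo)) ** weight

def overall(ds):
    """Geometric mean of individual desirabilities; any zero zeroes the
    composite, so no response can be sacrificed completely."""
    p = 1.0
    for d in ds:
        p *= d
    return p ** (1.0 / len(ds))
```

Response surface methodology then searches the factor space (heart-cut volume, head-cut volume, pH, cooling flow rate) for the point maximizing `overall`; the three "optimal operational conditions" in the abstract correspond to different choices of which responses to maximize or minimize.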
International Nuclear Information System (INIS)
Mumtaz, M.W.; Raza, M.A.; Ahmed, Z.; Abbas, M.N.; Hussain, M.
2015-01-01
The physico-chemical characterization of surface water samples collected from nine sampling points of a drain passing through the territory of Hafizabad city, Punjab, Pakistan, was carried out. The water of the drain is used by farmers for irrigation purposes in nearby agricultural fields. Twenty water quality parameters were evaluated in three turns and the results obtained were compared with the National Environmental Quality Standards (NEQS) prescribed limits for municipal and industrial effluents. A highly significant difference (p<0.01) was recorded for the content of phenols, carbonyl compounds, cyanides, dissolved oxygen, biological oxygen demand, total soluble salts, total dissolved salts, nitrates and sulphates, whereas the concentrations of magnesium, potassium, and oil and grease differed significantly (p<0.05) with respect to the sampling points on an average basis. A non-significant difference (p>0.05) was noted for temperature, pH, electrical conductivity, hardness, calcium, sodium, chemical oxygen demand and chloride among water samples from different sampling points. Furthermore, the experimental results of the different water quality parameters studied at the nine sampling points of the drain were interpolated in the ArcGIS 9.3 environment using kriging techniques to obtain calculated values for the remaining locations of the drain. (author)
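Kriging of the sort applied in ArcGIS above can be sketched directly. The ordinary-kriging system below uses an exponential variogram with illustrative parameters (not those fitted in the study); note that with zero nugget it reproduces the measured value exactly at each sampling point:

```python
import numpy as np

def ordinary_kriging(xy, values, targets, sill=1.0, rng_len=500.0, nugget=0.0):
    """Ordinary kriging with an exponential variogram (parameters illustrative).
    xy: (n, 2) sample coordinates; values: (n,) measurements; targets: (m, 2)."""
    def gamma(h):
        return nugget + sill * (1.0 - np.exp(-h / rng_len))
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Kriging system: variogram block plus a row/column of ones enforcing
    # the unbiasedness constraint (weights sum to 1) via a Lagrange multiplier.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    out = []
    for t in targets:
        b = np.ones(n + 1)
        b[:n] = gamma(np.linalg.norm(xy - t, axis=1))
        w = np.linalg.solve(A, b)
        out.append(float(w[:n] @ values))   # weighted sum of sample values
    return np.array(out)
```

A production workflow would first fit the variogram model (sill, range, nugget) to the empirical semivariances of the drain samples before solving this system for each unmeasured location.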
Kuuluvainen, Heino; Poikkimäki, Mikko; Järvinen, Anssi; Kuula, Joel; Irjala, Matti; Dal Maso, Miikka; Keskinen, Jorma; Timonen, Hilkka; Niemi, Jarkko V; Rönkkö, Topi
2018-05-23
The vertical profiles of lung deposited surface area (LDSA) concentration were measured in an urban street canyon in Helsinki, Finland, by using an unmanned aerial system (UAS) as a moving measurement platform. The street canyon can be classified as an avenue canyon with an aspect ratio of 0.45, and the UAS was a multirotor drone especially modified for emission measurements. In the experiments of this study, the drone was equipped with a small diffusion charge sensor capable of measuring the alveolar LDSA concentration of particles. The drone measurements were conducted during two days at the same spatial location at the kerbside of the street canyon by flying vertically from the ground level up to an altitude of 50 m, clearly above the rooftop level (19 m) of the nearest buildings. The drone data were supported by simultaneous measurements and by a two-week period of measurements at nearby locations with various instruments. The results showed that the averaged LDSA concentrations decreased approximately from 60 μm²/cm³ measured close to the ground level to 36-40 μm²/cm³ measured close to the rooftop level of the street canyon, and further to 16-26 μm²/cm³ measured at 50 m. The high-resolution measurement data enabled an accurate analysis of the functional form of the vertical profiles both in the street canyon and above the rooftop level. In both of these regions, exponential fits were used and the parameters obtained from the fits were thoroughly compared to values found in the literature. The results of this study indicated that the role of turbulent mixing caused by traffic was emphasized compared to the street canyon vortex as a driving force of the dispersion. In addition, the vertical profiles above the rooftop level showed an exponential decay similar to the profiles measured inside the street canyon. Copyright © 2018 Elsevier Ltd. All rights reserved.
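An exponential fit of the kind used above, C(z) = C0·exp(−k·z), can be obtained by linearizing with a logarithm and solving a least-squares line. The heights and concentrations below are illustrative numbers that only mimic the reported range (about 60 μm²/cm³ near ground, about 20 at 50 m); they are not the paper's data:

```python
import numpy as np

# Illustrative heights (m) and LDSA concentrations (μm²/cm³).
z = np.array([2.0, 5.0, 10.0, 19.0, 30.0, 50.0])
c = np.array([60.0, 55.0, 48.0, 38.0, 30.0, 21.0])

# Linearize C(z) = C0 * exp(-k z):  ln C = ln C0 - k z, then fit a line.
slope, intercept = np.polyfit(z, np.log(c), 1)
k, c0 = -slope, np.exp(intercept)       # decay constant (1/m) and ground value
fitted = c0 * np.exp(-k * z)
```

A weighted nonlinear fit (e.g. Levenberg-Marquardt on the un-logged model) gives slightly different parameters because the log transform reweights the residuals; for profile comparisons of this kind the linearized fit is a common first pass.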
Billingsley, Patrick
2012-01-01
Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this
International Nuclear Information System (INIS)
Bitsakis, E.I.; Nicolaides, C.A.
1989-01-01
The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs
Gong, Yuanzheng; Seibel, Eric J.
2017-01-01
Rapid development in the performance of sophisticated optical components, digital image sensors, and computer abilities along with decreasing costs has enabled three-dimensional (3-D) optical measurement to replace more traditional methods in manufacturing and quality control. The advantages of 3-D optical measurement, such as noncontact operation, high accuracy, rapid operation, and the ability for automation, are extremely valuable for inline manufacturing. However, most current optical approaches are suited to exterior rather than internal surfaces of machined parts. A 3-D optical measurement approach is proposed based on machine vision for the 3-D profile measurement of tiny complex internal surfaces, such as internally threaded holes. To capture the full topographic extent (peak to valley) of threads, a side-view commercial rigid scope is used to collect images at known camera positions and orientations. A 3-D point cloud is generated with multiview stereo vision using linear motion of the test piece, which is repeated by a rotation to form additional point clouds. Registration of these point clouds into a complete reconstruction uses a proposed automated feature-based 3-D registration algorithm. The resulting 3-D reconstruction is compared with x-ray computed tomography to validate the feasibility of our proposed method for future robotically driven industrial 3-D inspection.
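The rigid-alignment step at the heart of such point-cloud registration can be sketched with the standard SVD-based Kabsch solution; the paper's full feature-based algorithm (which also finds the correspondences) is not reproduced here, only the best-fit rotation and translation for already-corresponded points:

```python
import numpy as np

def kabsch(P, Q):
    """Best-fit rotation R and translation t minimizing ||R p_i + t - q_i||
    over corresponding point sets P, Q of shape (n, 3) (Kabsch algorithm)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Usage: generate a cloud, transform it by a known rotation and translation, and recover them. In a full registration pipeline this solve sits inside an outer loop (feature matching or ICP-style nearest-neighbor assignment) that supplies the correspondences.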
Ebert, Berit; Melle, Christian; Lieckfeldt, Elke; Zöller, Daniela; von Eggeling, Ferdinand; Fisahn, Joachim
2008-08-25
Here, we describe a novel approach for investigating differential protein expression within three epidermal cell types. In particular, 3000 single pavement, basal, and trichome cells from leaves of Arabidopsis thaliana were harvested by glass micro-capillaries. Subsequently, these single cell samples were joined to form pools of 100 individual cells and analyzed using the ProteinChip technology; SELDI: surface-enhanced laser desorption and ionization. As a result, numerous protein signals that were differentially expressed in the three epidermal cell types could be detected. One of these proteins was characterized by tryptic digestion and subsequent identification via tandem quadrupole-time of flight (Q-TOF) mass spectrometry. Downregulation of this sequenced small subunit precursor of ribulose-1,5-bisphosphate carboxylase/oxygenase (RuBisCO) in trichome and basal cells indicates the sink status of these cell types that are located on the surface of A. thaliana source leaves. Based on the obtained protein profiles, we suggest a close functional relationship between basal and trichome cells at the protein level.
Shorack, Galen R
2017-01-01
This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Probability and Statistical Inference
Prosper, Harrison B.
2006-01-01
These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Grimmett, Geoffrey
2014-01-01
Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...
Hemmo, Meir
2012-01-01
What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.
International Nuclear Information System (INIS)
Wang Chuangye; Morgner, Harald
2011-01-01
In the current work, we first reconstructed the molar fraction-depth profiles of cation and anion near the surface of tetrabutylammonium iodide dissolved in formamide by a refined calculation procedure, based on angle-resolved X-ray photoelectron spectroscopy (ARXPS) experiments. In this calculation procedure, both the transmission functions of the core levels and the inelastic mean free paths of the photoelectrons have been taken into account. We have evaluated the partial molar volumes of surfactant and solvent from the densities of such solutions at different bulk concentrations. With those partial molar volumes, the molar concentration-depth profiles of tetrabutylammonium ion and iodide ion were determined. The surface excesses of both surfactant ions were then obtained directly by integrating these depth profiles. The anionic molar concentration-depth profiles and surface excesses have been compared with their counterparts determined by neutral impact ion scattering spectroscopy. The comparisons exhibit good agreement. The capability of determining molar concentration-depth profiles of surfactant ions from core levels with different kinetic energies may extend the applicable range of ARXPS in investigating solution surfaces.
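The last computational step described above, obtaining the surface excess by integrating a concentration-depth profile relative to the bulk value, can be sketched numerically. The profile shape, decay length, and units below are hypothetical stand-ins, not the measured ARXPS data:

```python
import numpy as np

# Hypothetical concentration-depth profile c(z) of a surfactant ion near the
# surface: bulk value plus an enrichment that decays into the solution.
z = np.linspace(0.0, 5.0, 501)           # depth (nm), assumed grid
c_bulk = 0.1                             # assumed bulk concentration (mol/L)
c = c_bulk + 0.9 * np.exp(-z / 0.8)      # assumed enrichment profile

# Surface excess: integral over depth of the deviation from the bulk value,
# evaluated here with an explicit trapezoidal rule.
f = c - c_bulk
surface_excess = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z)))  # (mol/L)·nm
print(round(surface_excess, 4))
```

For this toy profile the integral has the closed form 0.9 · 0.8 · (1 − e^(−6.25)) ≈ 0.7186, so the numerical result can be checked directly.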
Probability in quantum mechanics
Directory of Open Access Journals (Sweden)
J. G. Gilson
1982-01-01
By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
Quantum computing and probability.
Ferry, David K
2009-11-25
Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.
Velocity dispersion profiles of clusters of galaxies
International Nuclear Information System (INIS)
Struble, M.F.
1979-01-01
Velocity dispersion as a function of radius, called σ_ls profiles, is presented for 13 clusters of galaxies having ≥ 30 radial velocities from both published and unpublished lists. A list of probable new members and possible outlying members for these clusters is also given. χ² and Kolmogorov-Smirnov one-sample tests for the goodness of fit of power laws to portions of the profiles indicate two significant structures in some profiles: (1) a local minimum corresponding to the local minimum noted in surface density or surface brightness profiles, and (2) a decrease in σ_ls toward the cores. Both of these features are discussed in terms of a comparison with Wielen's N-body simulations. The σ_ls profiles are placed in a new classification scheme which lends itself to interpreting clusters in a dynamical age sequence. The velocity field of galaxies at large distances from cluster centers is also discussed
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
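The flavor of such intermittent, step-like updating can be conveyed with a toy estimator. This is not the authors' model: it is a sketch in which the running estimate of a stepwise nonstationary Bernoulli parameter jumps to a new value only when a log-likelihood criterion signals a probable change (the threshold and smoothing are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stepwise nonstationary Bernoulli process: p jumps from 0.2 to 0.8.
p_true = np.concatenate([np.full(300, 0.2), np.full(300, 0.8)])
outcomes = rng.random(600) < p_true

def intermittent_estimates(x, threshold=4.0):
    """Step-like estimates: keep the current estimate of p until the
    log-likelihood gain from re-estimating exceeds `threshold`, then step
    to the new estimate and restart the observation window."""
    estimates, start, p_hat = [], 0, 0.5
    eps = 1e-12
    for t in range(len(x)):
        n = t - start + 1
        k = int(np.sum(x[start:t + 1]))
        p_new = (k + 1) / (n + 2)         # smoothed estimate over the window
        ll_old = k * np.log(p_hat + eps) + (n - k) * np.log(1 - p_hat + eps)
        ll_new = k * np.log(p_new + eps) + (n - k) * np.log(1 - p_new + eps)
        if ll_new - ll_old > threshold:   # strong evidence the parameter moved
            p_hat, start = p_new, t
        estimates.append(p_hat)
    return np.array(estimates)

est = intermittent_estimates(outcomes)
# The estimate is piecewise constant: it holds steady, then jumps.
print(round(float(est[250]), 2), round(float(est[590]), 2))
```

The resulting estimate trace reproduces, in caricature, properties (a) and (e) above: it is updated at irregular intervals rather than after every outcome, yet it tracks a substantial change in the hidden parameter quickly.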
Irreversibility and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
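The structural parallel between conditional entropies and conditional probabilities can be checked numerically: the chain rule H(X|Y) = H(X,Y) − H(Y) mirrors log P(x|y) = log P(x,y) − log P(y). A small sketch with an arbitrary joint distribution (the numbers are illustrative only, not from the paper):

```python
import numpy as np

# An arbitrary joint distribution P(x, y) on a 2x2 space (rows: x, cols: y).
P = np.array([[0.3, 0.1],
              [0.2, 0.4]])

def H(p):
    """Shannon entropy in bits: a measure of uniformity of a distribution."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

P_y = P.sum(axis=0)                 # marginal P(y)

# Conditional entropy built directly from conditional probabilities:
# H(X|Y) = sum_y P(y) * H(X | Y=y), with P(x|y) = P(x,y) / P(y).
H_cond = sum(P_y[j] * H(P[:, j] / P_y[j]) for j in range(2))

# Chain rule: H(X|Y) = H(X,Y) - H(Y), the entropy analogue of
# log P(x|y) = log P(x,y) - log P(y).
assert abs(H_cond - (H(P.ravel()) - H(P_y))) < 1e-9
print(round(H_cond, 4))
```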
Isaac, Richard
1995-01-01
The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...
Experimental Probability in Elementary School
Andrew, Lane
2009-01-01
Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
Improving Ranking Using Quantum Probability
Melucci, Massimo
2011-01-01
The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
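For the multinomial logit special case (chosen here purely for illustration; the paper's results cover general RUM), the CPGF is the log-sum-exp function, and differentiating it recovers the familiar logit choice probabilities:

```python
import numpy as np

def cpgf_logit(u):
    """Log-sum-exp of the utilities: the CPGF associated with the
    multinomial logit model (illustrative special case)."""
    m = float(np.max(u))
    return m + float(np.log(np.sum(np.exp(u - m))))

def choice_probs(u):
    """Choice probabilities as the gradient of the CPGF: the softmax."""
    e = np.exp(u - np.max(u))
    return e / e.sum()

u = np.array([1.0, 0.5, -0.2])   # assumed systematic utilities, 3 alternatives
p = choice_probs(u)

# Verify numerically that p is the gradient of the CPGF (central differences).
eps = 1e-6
grad = np.array([(cpgf_logit(u + eps * np.eye(3)[i])
                  - cpgf_logit(u - eps * np.eye(3)[i])) / (2 * eps)
                 for i in range(3)])
assert np.allclose(p, grad, atol=1e-8)
assert abs(p.sum() - 1.0) < 1e-12
print(np.round(p, 3))
```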
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving ...
Riffel, R. A.; Storchi-Bergmann, T.; Riffel, R.; Davies, R.; Bianchin, M.; Diniz, M. R.; Schönell, A. J.; Burtscher, L.; Crenshaw, M.; Fischer, T. C.; Dahmer-Hahn, L. G.; Dametto, N. Z.; Rosario, D.
2018-02-01
We present and characterize a sample of 20 nearby Seyfert galaxies selected for having BAT 14-195 keV luminosities L_X ≥ 10^41.5 erg s^-1, redshift z ≤ 0.015, being accessible for observations with the Gemini Near-Infrared Field Spectrograph (NIFS) and showing extended [O III]λ5007 emission. Our goal is to study Active Galactic Nucleus (AGN) feeding and feedback processes from near-infrared integral-field spectra, which include both ionized (H II) and hot molecular (H2) emission. This sample is complemented by nine other Seyfert galaxies previously observed with NIFS. We show that the host galaxy properties (absolute magnitudes M_B, M_H, central stellar velocity dispersion and axial ratio) show a similar distribution to those of the 69 BAT AGN. For the 20 galaxies already observed, we present surface mass density (Σ) profiles for H II and H2 in their inner ~500 pc, showing that H II emission presents a steeper radial gradient than H2. This can be attributed to the different excitation mechanisms: ionization by AGN radiation for H II and heating by X-rays for H2. The mean surface mass densities are in the range (0.2 ≤ Σ_H II ≤ 35.9) M⊙ pc^-2 and (0.2 ≤ Σ_H2 ≤ 13.9) × 10^-3 M⊙ pc^-2, while the ratios between the H II and H2 masses range between ~200 and 8000. The sample presented here will be used in future papers to map AGN gas excitation and kinematics, providing a census of the mass inflow and outflow rates and power as well as their relation with the AGN luminosity.
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still ...
Introduction to imprecise probabilities
Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M
2014-01-01
In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin
Classic Problems of Probability
Gorroochurn, Prakash
2012-01-01
"A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin
Dutkiewicz, Ewelina P; Chiu, Hsien-Yi; Urban, Pawel L
2015-11-01
Micropatch-arrayed pads (MAPAs) are presented as a facile and sensitive sampling method for spatial profiling of topical agents adsorbed on the surface of skin. MAPAs are 28 × 28 mm pieces of polytetrafluoroethylene containing a plurality of cavities filled with agarose hydrogel. They are affixed onto the skin for 10 min in order to collect drugs applied topically. Polar compounds are absorbed by the hydrogel micropatches. The probes are subsequently scanned by an automated nanospray desorption electrospray ionization mass spectrometry system operated in the tapping dual-polarity mode. When the liquid junction gets into contact with every micropatch, polar compounds absorbed in the hydrogel matrix are desorbed and transferred to the ion source. A 3D-printed interface prevents evaporation of the hydrogel micropatches, assuring good reproducibility and sensitivity. MAPAs have been applied to follow dispersion of topical drugs applied to human skin in vivo and to porcine skin ex vivo, in the form of self-adhesive patches. Spatiotemporal characteristics of the drug dispersion process have been revealed using this non-invasive test. Differences between drug dispersion in vivo and ex vivo could be observed. We envision that MAPAs can be used to investigate spatiotemporal kinetics of various topical agents utilized in medical treatment. Copyright © 2015 John Wiley & Sons, Ltd.
Long-Term Impact of Sediment Deposition and Erosion on Water Surface Profiles in the Ner River
Directory of Open Access Journals (Sweden)
Tomasz Dysarz
2017-02-01
The purpose of the paper is to test forecasting of the sediment transport process, taking into account two main uncertainties involved in sediment transport modeling. These are: the lack of knowledge regarding future flows, and the uncertainty with respect to which sediment transport formula should be chosen for simulations. The river reach chosen for study is the outlet part of the Ner River, located in the central part of Poland. The main characteristic of the river is the presence of an intensive morphodynamic process, increasing flooding frequency. The approach proposed here is based on simulations with a sediment-routing model and assessment of the hydraulic condition changes on the basis of hydrodynamic calculations for the chosen characteristic flows. The data used include Digital Terrain Models (DTMs, cross-section measurements, and hydrological observations from the Dabie gauge station. The sediment and hydrodynamic calculations are performed using program HEC-RAS 5.0. Twenty inflow scenarios are of a 10-year duration and are composed on the basis of historical data. Meyer-Peter and Müller and Engelund-Hansen formulae are applied for the calculation of sediment transport intensity. The methodology presented here seems to be a good tool for the prediction of long-term impacts on water surface profiles caused by sediment deposition and erosion.
Directory of Open Access Journals (Sweden)
M. G. Manoj
2011-03-01
Atmospheric gravity waves, which are a manifestation of the fluctuations in buoyancy of the air parcels, are well known for their direct influence on concentration of atmospheric trace gases and aerosols, and also on oscillations of meteorological variables such as temperature, wind speed, visibility and so on. The present paper reports quasi-periodic oscillations in the lidar backscatter signal strength due to aerosol fluctuations in the nocturnal boundary layer, studied with a high space-time resolution polarimetric micro pulse lidar and concurrent meteorological parameters over a tropical station in India. The results of the spectral analysis of the data, archived on some typical clear-sky conditions during winter months of 2008 and 2009, exhibit a prominent periodicity of 20–40 min in lidar-observed aerosol variability and show close association with those observed in the near-surface temperature and wind at 5% statistical significance. Moreover, the lidar aerosol backscatter signal strength variations at different altitudes, which have been generated from the height-time series of the one-minute interval profiles at 2.4 m vertical resolution, indicate vertical propagation of these waves, exchanging energy between lower and higher height levels. Such oscillations are favoured by the stable atmospheric background condition and peculiar topography of the experimental site. Accurate representation of these buoyancy waves is essential in predicting the sporadic fluctuations of weather in the tropics.
Energy Technology Data Exchange (ETDEWEB)
Seo, Ho Geon; Kim, Myung Hwan; Choi, Sung Ho; Kim, Chung Seok; Jhang, Kyung Young [Hanyang University, Seoul (Korea, Republic of)
2012-08-15
The use of a single-line pulsed laser beam is a well-known noncontact method for generating a directional surface acoustic wave. In this method, different laser beam energy profiles produce different waveforms and frequency characteristics. In this paper, we consider two typical kinds of laser beam energy profiles, Gaussian and square-like, to find out the difference in their frequency characteristics. To achieve this, mathematical models were first proposed for the Gaussian and the square-like laser beam profiles, both of which depend on the laser beam width. To verify the theoretical models, experimental setups with a cylindrical lens and a line-slit mask were designed to produce a line laser beam with a Gaussian and a square-like spatial energy profile, respectively. The frequency responses of the theoretical models showed good agreement with experimental results in terms of the existence of harmonic frequency components and the shift of the first peak frequencies toward lower values.
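The qualitative difference between the two profiles' frequency characteristics can be illustrated with a discrete Fourier transform. The beam width, window, and sampling below are arbitrary, not the experimental values: the Gaussian profile yields a smooth, monotonically decaying spectrum, while the square-like profile yields a sinc-like spectrum with nulls and persistent harmonic sidelobes:

```python
import numpy as np

# Spatial energy profiles of a line laser beam of width w (arbitrary units).
x = np.linspace(-10.0, 10.0, 4096)
w = 2.0
gaussian = np.exp(-0.5 * (x / (w / 2)) ** 2)       # Gaussian profile
square = np.where(np.abs(x) <= w / 2, 1.0, 0.0)    # square-like (top-hat)

def spectrum(profile):
    """Magnitude of the spatial-frequency spectrum, normalized to its peak."""
    s = np.abs(np.fft.rfft(profile))
    return s / s.max()

g_spec = spectrum(gaussian)
s_spec = spectrum(square)

# Gaussian: smooth monotone decay (its Fourier transform is again Gaussian).
assert np.all(np.diff(g_spec[:100]) <= 1e-9)
# Square-like: sinc-shaped spectrum whose sidelobes persist at high frequency.
assert float(s_spec[50:].max()) > 0.01
```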
Uchida, Mai; Faraone, Stephen V; Martelon, MaryKate; Kenworthy, Tara; Woodworth, K Yvonne; Spencer, Thomas; Wozniak, Janet; Biederman, Joseph
2014-01-01
Background: Previous work shows that children with high scores (2 SD, combined score ≥ 210) on the Attention Problems, Aggressive Behavior, and Anxious-Depressed (A-A-A) subscales of the Child Behavior Checklist (CBCL) are more likely than other children to meet criteria for bipolar (BP)-I disorder. However, the utility of this profile as a screening tool has remained unclear. Methods: We compared 140 patients with pediatric BP-I disorder, 83 with attention deficit hyperactivity disorder (ADHD), and 114 control subjects. We defined the CBCL-Severe Dysregulation profile as an aggregate cutoff score of ≥ 210 on the A-A-A scales. Patients were assessed with structured diagnostic interviews and functional measures. Results: Patients with BP-I disorder were significantly more likely than both control subjects (Odds Ratio [OR]: 173.2; 95% Confidence Interval [CI], 21.2 to 1413.8; P < 0.001) and those with ADHD (OR: 14.6; 95% CI, 6.2 to 34.3; P < 0.001) to have a positive CBCL-Severe Dysregulation profile. Receiver Operating Characteristics analyses showed that the area under the curve for this profile comparing children with BP-I disorder against control subjects and those with ADHD was 99% and 85%, respectively. The corresponding positive predictive values for this profile were 99% and 92%, with false positive rates of < 0.2% and 8% for the comparisons with control subjects and patients with ADHD, respectively. Limitations: Non-clinician raters administered structured diagnostic interviews, and the sample was referred and largely Caucasian. Conclusions: The CBCL-Severe Dysregulation profile can be useful as a screen for BP-I disorder in children in clinical practice. PMID:24882182
Counterexamples in probability
Stoyanov, Jordan M
2013-01-01
While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.
Plotnitsky, Arkady
2010-01-01
Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general
Transition probabilities for atoms
International Nuclear Information System (INIS)
Kim, Y.K.
1980-01-01
Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods
Energy Technology Data Exchange (ETDEWEB)
Yan, X.L.; Coetsee, E. [Department of Physics, University of the Free State, P O Box 339, Bloemfontein, ZA9300 (South Africa); Wang, J.Y., E-mail: wangjy@stu.edu.cn [Department of Physics, Shantou University, 243 Daxue Road, Shantou, 515063, Guangdong (China); Swart, H.C., E-mail: swartHC@ufs.ac.za [Department of Physics, University of the Free State, P O Box 339, Bloemfontein, ZA9300 (South Africa); Terblans, J.J., E-mail: terblansjj@ufs.ac.za [Department of Physics, University of the Free State, P O Box 339, Bloemfontein, ZA9300 (South Africa)
2017-07-31
Highlights: • Linear Least Square (LLS) method used to separate Ni and Cu Auger spectra. • The depth-dependent ion-sputtering-induced roughness was quantitatively evaluated. • The depth resolution is better when profiling with a dual-ion beam vs. a single-ion beam. • AES depth profiling with a lower ion energy results in a better depth resolution. - Abstract: The polycrystalline Ni/Cu multilayer thin films consisting of 8 alternating layers of Ni and Cu were deposited on a SiO2 substrate by means of electron beam evaporation in a high vacuum. Concentration-depth profiles of the as-deposited multilayered Ni/Cu thin films were determined with Auger electron spectroscopy (AES) in combination with Ar+ ion sputtering, under various bombardment conditions with the samples being stationary as well as rotating in some cases. The Mixing-Roughness-Information depth (MRI) model used for the fittings of the concentration-depth profiles accounts for the interface broadening of the experimental depth profiling. The interface broadening incorporates the effects of atomic mixing, surface roughness and information depth of the Auger electrons. The roughness values extracted from the MRI model fitting of the depth profiling data agree well with those measured by atomic force microscopy (AFM). The ion-sputtering-induced surface roughness during the depth profiling was accordingly quantitatively evaluated from the fitted MRI parameters with sample rotation and stationary conditions. The depth resolutions of the AES depth profiles were derived directly from the values determined by the fitting parameters in the MRI model.
Negative probability in the framework of combined probability
Burgin, Mark
2013-01-01
Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...
Caneppele, Taciana Marco Ferraz; Jeronymo, Raffaela Di Iorio; Di Nicoló, Rebeca; Araújo, Maria Amélia Máximo de; Soares, Luís Eduardo Silva
2012-01-01
The aim of this study was to investigate the effects of some acidic drinks on dentin erosion, using methods of surface profile (SP) analysis and energy-dispersive X-ray fluorescence spectrometry (EDXRF). One hundred standardized dentin slabs obtained from bovine incisor roots were used. Dentin slabs measuring 5x5 mm were ground flat, polished and half of each specimen surface was protected with nail polish. For 60 min, the dentin surfaces were immersed in 50 mL of 5 different drinks (Gatorade...
Contributions to quantum probability
International Nuclear Information System (INIS)
Fritz, Tobias
2010-01-01
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
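As a toy illustration of the Monte Carlo methods listed under Part VI, a minimal sketch (not taken from the book) of Monte Carlo integration with a standard-error estimate:

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [a, b],
    returning (estimate, standard_error)."""
    rng = random.Random(seed)
    xs = [rng.uniform(a, b) for _ in range(n)]
    ys = [f(x) for x in xs]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    width = b - a
    return width * mean, width * (var / n) ** 0.5

# Integral of x^2 over [0, 1] is 1/3; the estimate should land
# within a few standard errors of that value.
est, se = mc_integrate(lambda x: x * x, 0.0, 1.0)
```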
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Waste Package Misload Probability
International Nuclear Information System (INIS)
Knudsen, J.K.
2001-01-01
The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
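The final step, turning event counts into a probability of occurrence, can be sketched as follows; the counts in the example are hypothetical, not figures from the Framatome ANP report, and the zero-event bound uses the standard "rule of three" approximation:

```python
import math
from statistics import NormalDist

def event_probability(events, total, conf=0.95):
    """Point estimate and a one-sided normal-approximation upper confidence
    bound for a per-movement event probability; falls back on the
    'rule of three' (-ln(1-conf)/n, about 3/n at 95%) when no events
    were observed."""
    p = events / total
    if events == 0:
        return p, -math.log(1 - conf) / total
    z = NormalDist().inv_cdf(conf)
    return p, p + z * (p * (1 - p) / total) ** 0.5

# Hypothetical counts (not from the report): 4 misloads in 20,000 FA moves.
p, upper = event_probability(4, 20_000)
```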
Probability theory and applications
Hsu, Elton P
1999-01-01
This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.
Paradoxes in probability theory
Eckhardt, William
2013-01-01
Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
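A minimal sketch of the kind of calculation the book discusses, an approximate "95 percent interval of measurement uncertainty" from repeated readings; the normal quantile used here is a simplifying assumption (a t quantile would widen the interval slightly for small samples, and the book's extended classical methods are more careful):

```python
from statistics import mean, stdev, NormalDist

def uncertainty_interval(readings, conf=0.95):
    """Mean and half-width of an approximate confidence interval for the
    unknown target value, from repeated readings of the same quantity."""
    n = len(readings)
    m = mean(readings)
    u = stdev(readings) / n ** 0.5          # standard uncertainty of the mean
    z = NormalDist().inv_cdf(0.5 + conf / 2)
    return m, z * u

# Hypothetical repeated readings of the same quantity:
m, half = uncertainty_interval([9.98, 10.02, 10.01, 9.99, 10.00])
```

The result is reported as m ± half, e.g. "10.000 with a 95 percent expanded uncertainty of about 0.014".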
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Retrocausality and conditional probability
International Nuclear Information System (INIS)
Stuart, C.I.J.M.
1989-01-01
Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)
Whittle, Peter
1992-01-01
This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...
International Nuclear Information System (INIS)
Berecz, Tibor; Balogh, Levente; Mészáros, István; Steinbach, Ágoston
2014-01-01
Surface rolling is a cold-working technique used for hardening the surface of steel and ductile cast iron components. This process increases the surface hardness and improves the fatigue properties of components, so it is commonly used to treat railroad car wheel axles. The present paper examines the influence of this surface strengthening technique on the microstructure of the railroad car wheel axle material by hardness tests, optical microscopy (OM), and other novel examination methods, such as scanning electron microscopy (SEM), X-ray line profile analysis (XLPA), non-destructive magnetic evaluation (NDE) and automated electron backscatter diffraction (EBSD). The results show that surface rolling causes an increase in hardness down to a depth of ∼10 mm. It is also shown that the increase in hardness is not due to grain refinement or change in grain morphology; thus it is likely to be caused by an increase in dislocation density
Energy Technology Data Exchange (ETDEWEB)
Berecz, Tibor, E-mail: berecz@eik.bme.hu [Department of Materials Science and Engineering, Budapest University of Technology and Economics, 1111 Budapest, Bertalan Lajos utca 7 (Hungary); Balogh, Levente, E-mail: levente@metal.elte.hu [Department of Materials Physics, Eötvös Loránd University, 1117 Budapest, Pázmány Péter sétány 1/a (Hungary); Mészáros, István, E-mail: meszaros@eik.bme.hu [Department of Materials Science and Engineering, Budapest University of Technology and Economics, 1111 Budapest, Bertalan Lajos utca 7 (Hungary); Steinbach, Ágoston, E-mail: sa984@hszk.bme.hu [Department of Materials Science and Engineering, Budapest University of Technology and Economics, 1111 Budapest, Bertalan Lajos utca 7 (Hungary)
2014-01-13
Surface rolling is a cold-working technique used for hardening the surface of steel and ductile cast iron components. This process increases the surface hardness and improves the fatigue properties of components, so it is commonly used to treat railroad car wheel axles. The present paper examines the influence of this surface strengthening technique on the microstructure of the railroad car wheel axle material by hardness tests, optical microscopy (OM), and other novel examination methods, such as scanning electron microscopy (SEM), X-ray line profile analysis (XLPA), non-destructive magnetic evaluation (NDE) and automated electron backscatter diffraction (EBSD). The results show that surface rolling causes an increase in hardness down to a depth of ∼10 mm. It is also shown that the increase in hardness is not due to grain refinement or change in grain morphology; thus it is likely to be caused by an increase in dislocation density.
Roth, Don J.; Kautz, Harold E.; Abel, Phillip B.; Whalen, Mike F.; Hendricks, J. Lynne; Bodis, James R.
2000-01-01
Surface topography, which significantly affects the performance of many industrial components, is normally measured with diamond-tip profilometry over small areas or with optical scattering methods over larger areas. To develop air-coupled surface profilometry, the NASA Glenn Research Center at Lewis Field initiated a Space Act Agreement with Sonix, Inc., through two Glenn programs, the Advanced High Temperature Engine Materials Program (HITEMP) and COMMTECH. The work resulted in quantitative surface topography profiles obtained using only high-frequency, focused ultrasonic pulses in air. The method is nondestructive, noninvasive, and noncontact, and it does not require light-reflective surfaces. Air surface profiling may be desirable when diamond-tip or laser-based methods are impractical, such as over large areas, when a significant depth range is required, or for curved surfaces. When the configuration is optimized, the method is reasonably rapid and all the quantitative analysis facilities are online, including two- and three-dimensional visualization, extreme value filtering (for faulty data), and leveling.
Probability mapping of contaminants
Energy Technology Data Exchange (ETDEWEB)
Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)
1994-04-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
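The post-processing step described above (turning a stack of equally likely simulated images into a map of exceedance probabilities) can be sketched as follows, with synthetic Gaussian data standing in for the geostatistical simulations:

```python
import random

def exceedance_map(simulations, threshold):
    """Given equally likely simulated contamination maps (each a list of
    per-parcel values), return the per-parcel probability of exceeding
    the threshold: the fraction of simulations above it."""
    n_sims = len(simulations)
    n_parcels = len(simulations[0])
    return [sum(sim[j] > threshold for sim in simulations) / n_sims
            for j in range(n_parcels)]

# Hypothetical stand-in: 500 simulations of 4 parcels whose mean
# contamination levels increase from left to right.
rng = random.Random(1)
sims = [[rng.gauss(mu, 10.0) for mu in (5.0, 20.0, 35.0, 50.0)]
        for _ in range(500)]
prob = exceedance_map(sims, threshold=35.0)  # e.g. a clean-up action level
```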
Probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.
1994-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
Probability of causation approach
International Nuclear Information System (INIS)
Jose, D.E.
1988-01-01
Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice
Generalized Probability Functions
Directory of Open Access Journals (Sweden)
Alexandre Souto Martinez
2009-01-01
Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
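A common one-parameter form consistent with this construction is ln_q(x) = (x^q - 1)/q with inverse exp_q(y) = (1 + q*y)^(1/q); this particular parameterization is stated here as an assumption, since the abstract itself does not give the formula:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm (x**q - 1)/q; q -> 0 recovers ln."""
    return math.log(x) if q == 0 else (x ** q - 1.0) / q

def gen_exp(y, q):
    """Inverse of gen_log: (1 + q*y)**(1/q); q -> 0 recovers exp."""
    return math.exp(y) if q == 0 else (1.0 + q * y) ** (1.0 / q)

# Round trip, and the ordinary-log limit for small q:
x, q = 2.5, 0.3
roundtrip = gen_exp(gen_log(x, q), q)
near_log = gen_log(x, 1e-8)
```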
Directory of Open Access Journals (Sweden)
D. Noone
2013-02-01
Full Text Available The D/H isotope ratio is used to attribute boundary layer humidity changes to the set of contributing fluxes for a case following a snowstorm in which a snow pack of about 10 cm vanished. Profiles of H2O and CO2 mixing ratio, D/H isotope ratio, and several thermodynamic properties were measured from the surface to 300 m every 15 min during four winter days near Boulder, Colorado. Coeval analysis of the D/H ratios and CO2 concentrations finds these two variables to be complementary, with the former being sensitive to daytime surface fluxes and the latter particularly indicative of nocturnal surface sources. Together they capture evidence for strong vertical mixing during the day, weaker mixing by turbulent bursts and low level jets within the nocturnal stable boundary layer during the night, and frost formation in the morning. The profiles are generally not well described with a gradient mixing line analysis because the D/H ratios of the end members (i.e., surface fluxes and the free troposphere) evolve throughout the day, which leads to large uncertainties in the estimate of the D/H ratio of the surface water flux. A mass balance model is constructed for the snow pack and constrained with observations to provide an optimal estimate of the partitioning of the surface water flux into contributions from sublimation, evaporation of melt water in the snow, and evaporation from ponds. Results show that while vapor measurements are important in constraining surface fluxes, measurements of the source reservoirs (soil water, snow pack and standing liquid) offer stronger constraint on the surface water balance. Measurements of surface water are therefore essential in developing observational programs that seek to use isotopic data for flux attribution.
Energy Technology Data Exchange (ETDEWEB)
Khanbabaee, B., E-mail: khanbabaee@physik.uni-siegen.de; Pietsch, U. [Solid State Physics, University of Siegen, D-57068 Siegen (Germany); Facsko, S. [Helmholtz-Zentrum Dresden-Rossendorf, 01314 Dresden (Germany); Doyle, S. [Synchrotron Light Source ANKA, Hermann-von-Helmholtz-Platz 1, D-76344 Eggenstein-Leopoldshafen (Germany)
2014-10-20
In this work, we report on correlations between surface density variations and ion parameters during an ion beam-induced surface patterning process. The near-surface density variations of irradiated Si(100) surfaces were investigated after off-normal irradiation with 5 keV Fe ions at different fluences. In order to reduce the x-ray probing depth to a thickness below 5 nm, extremely asymmetrical x-ray diffraction with varying wavelength was applied, exploiting x-ray refraction at the air-sample interface. Depth profiling was achieved by measuring x-ray rocking curves as a function of varying wavelength, providing incidence angles down to 0°. The density variation was extracted from the deviations from the kinematical Bragg angle at grazing incidence angles due to refraction of the x-ray beam at the air-sample interface. Simulations based on the dynamical theory of x-ray diffraction revealed that, while the net near-surface density decreases with increasing ion fluence, accompanied by surface patterning, there is a certain threshold of ion fluence for surface density modulation. Our finding suggests that the surface density variation can be relevant to the mechanism of pattern formation.
International Nuclear Information System (INIS)
Koide, Yoshihiko; Ishida, Shin-ichi; Sakasai, Akira
1991-03-01
A particular role of integer-mode rational surfaces in the formation of peaked T_i(r) and V_t(r) profiles is observed. In the case of the JT-60 hot-ion regime, the plasma spontaneously changes its peaking region from the inside of the q=2 surface to that of the broader q=3 surface. (author)
Takebe, Jun; Ito, Shigeki; Miura, Shingo; Miyata, Kyohei; Ishibashi, Kanji
2012-01-01
A method of coating commercially pure titanium (cpTi) implants with a highly crystalline, thin hydroxyapatite (HA) layer using spark-discharge anodic oxidation followed by hydrothermal treatment (spark-discharged anodic oxidation treatment; SA-treated cpTi) has been reported for use in clinical dentistry. We hypothesized that a thin HA layer with high crystallinity and a nanostructured anodic titanium oxide film on such SA-treated cpTi implant surfaces might be a crucial function of their surface-specific potential energy. To test this, we analyzed anodic oxide (AO) cpTi and SA-treated cpTi disks by SEM and AFM. The contact angle and surface free energy of each disk surface were measured using FAMAS software. High-magnification SEM and AFM revealed the nanotopographic structure of the anodic titanium oxide film on SA-treated cpTi; however, this was not observed on the AO cpTi surface. The contact angle and surface free energy measurements were also significantly different between AO cpTi and SA-treated cpTi surfaces (Tukey's, P<0.05). These data indicate that the change in physicochemical properties of an anodic titanium oxide film with HA crystals on an SA-treated cpTi surface may play a key role in the phenomenon of osteoconduction during the process of osseointegration. Copyright © 2011 Elsevier B.V. All rights reserved.
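Surface free energy from contact angles can be sketched with the Owens-Wendt two-liquid model; this is one common approach, not necessarily the model implemented in the FAMAS software, and the probe-liquid components are standard literature reference values:

```python
import math

# Owens-Wendt: gamma_l*(1 + cos(theta)) = 2*(sqrt(gd_s*gd_l) + sqrt(gp_s*gp_l)).
# Literature reference values in mN/m as (dispersive, polar) components:
WATER = (21.8, 51.0)
DIIODOMETHANE = (50.8, 0.0)

def owens_wendt(theta_water_deg, theta_diiodo_deg):
    """Solve the two Owens-Wendt equations for the solid's dispersive and
    polar surface-energy components; returns (gd_s, gp_s, total)."""
    def rhs(theta_deg, parts):
        gd_l, gp_l = parts
        return (gd_l + gp_l) * (1 + math.cos(math.radians(theta_deg))) / 2
    # Two linear equations in x = sqrt(gd_s), y = sqrt(gp_s):
    #   sqrt(gd_l)*x + sqrt(gp_l)*y = rhs
    a1, b1, c1 = math.sqrt(WATER[0]), math.sqrt(WATER[1]), rhs(theta_water_deg, WATER)
    a2, b2, c2 = math.sqrt(DIIODOMETHANE[0]), math.sqrt(DIIODOMETHANE[1]), rhs(theta_diiodo_deg, DIIODOMETHANE)
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    gd_s, gp_s = x * x, y * y
    return gd_s, gp_s, gd_s + gp_s

# Hypothetical contact angles (degrees) for water and diiodomethane:
gd, gp, total = owens_wendt(60.0, 40.0)
```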
Probability of brittle failure
Kim, A.; Bosnyak, C. P.; Chudnovsky, A.
1991-01-01
A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol-extended diglycidyl ether, cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement-controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect) and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of the defects which control the fracture process. Triangular-shaped ripples (deltoids) are formed on the fracture surface during slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.
Ignition probabilities for Compact Ignition Tokamak designs
International Nuclear Information System (INIS)
Stotler, D.P.; Goldston, R.J.
1989-09-01
A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
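The "probability of ignition" idea (sample the uncertain physics parameters, count the fraction of samples that ignite) can be sketched with a deliberately simplified criterion; every number below is illustrative, not a CIT design value, and the single lognormal confinement factor stands in for the code's full set of sampled parameters:

```python
import random

def ignition_probability(tau_required=3.0, h_sigma=0.25, n=50_000, seed=0):
    """Toy Monte Carlo 'probability of ignition': sample a lognormal
    confinement enhancement factor H and count the fraction of samples
    whose confinement time H * tau_scaling meets the requirement."""
    rng = random.Random(seed)
    tau_scaling = 2.8   # hypothetical scaling-law prediction (s)
    hits = 0
    for _ in range(n):
        h = rng.lognormvariate(0.0, h_sigma)
        if h * tau_scaling >= tau_required:
            hits += 1
    return hits / n

p_ignite = ignition_probability()
```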
Probability and rational choice
Directory of Open Access Journals (Sweden)
David Botting
2014-05-01
Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative, and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
Wu, Di; Wang, Yi-long; Liu, Wei-jian; Chen, Yuan-chen; Fu, Xiao-fang; Tao, Shu; Liu, Wen-xin
2016-02-15
In this study, paired surface soil and mature wheat grain samples were collected in the cornfields near the large Handan Steel Manufacturer; the total concentrations and compositional profiles of the parent PAHs were measured, and the spatial distribution characteristics and correlation with total organic carbon (TOC) fractions in soil were determined. Accordingly, a preliminary source identification was performed, and the association between PAHs in surface soil and wheat grain was briefly discussed. The median concentration of total PAHs in surface soils from the cornfields of Handan was 398.9 ng·g⁻¹ (range: 123.4-1626.4 ng·g⁻¹), and around 18% and 10% of all the studied soil samples exceeded the corresponding quality criteria for total PAHs and B[a]P in soils, respectively. The MMW and HMW species were the main components in the compositional profiles of surface soils. Based on the specific isomeric ratios of PAH species, coal/biomass combustion and transportation fuel (tail gas) were the dominant mixed sources of local PAH emissions. The fractions of surface soil TOC had significant positive correlations with the total PAHs and also with the individual components with different ring numbers. In addition, the median concentration of total PAHs in wheat grains collected in the cornfields near the Handan Steel Manufacturer was 27.0 ng·g⁻¹ (range: 19.0-34.0 ng·g⁻¹). The levels in wheat grains were not high, and lower than the related hygienic standards for food proposed by the EU and China. The LMW and MMW PAHs with 2 to 4 rings occupied a larger proportion, more than 84% of the total PAHs, which differed substantially from the component profiles in surface soils. This suggests that the local sources of PAHs in wheat grains may originate not only from surface soil via root absorption and internal transport, but also from ambient air through dry and wet deposition on the leaf surface (stomata).
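The "specific isomeric ratios" used for source identification can be illustrated with the fluoranthene/(fluoranthene + pyrene) diagnostic ratio; the cut-off values are conventions commonly quoted in the PAH source-apportionment literature, not values from this study:

```python
def flt_pyr_source(flt, pyr):
    """Classify a likely PAH source from the fluoranthene/(fluoranthene +
    pyrene) diagnostic ratio, using commonly quoted literature thresholds:
    < 0.4 petrogenic, 0.4-0.5 petroleum combustion, > 0.5 coal/biomass."""
    r = flt / (flt + pyr)
    if r < 0.4:
        return r, "petrogenic (e.g. unburned petroleum)"
    if r <= 0.5:
        return r, "petroleum combustion (e.g. vehicle exhaust)"
    return r, "coal/biomass combustion"

# Hypothetical concentrations (ng/g), not measurements from this study:
ratio, source = flt_pyr_source(30.0, 22.0)
```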
López-Fernández, Jorge; Gallardo, Leonor; Fernández-Luna, Álvaro; Villacañas, Victor; García-Unanue, Jorge; Sánchez-Sánchez, Javier
2017-06-22
The aim of this research was to evaluate the influence of game surface and pitch size on the movement profile of female soccer players during small-sided games (SSGs) of 4 v 4. Sixteen women played three different 4-a-side formats (400 m, 600 m and 800 m) on three surfaces (ground [GR], artificial turf [AT] and natural grass [NG]). Time-motion variables were assessed through GPS devices (SPI Pro X, GPSports, Australia). GR had the worst outputs on most variables. NG achieved higher results than AT in terms of total distance [SSG 400 (+37.000 m; p=0.006); SSG 600 (+59.989 m)]. Despite women's performance being higher on AT than on GR, the NG surface still showed the highest outcomes in the most intense SSG. Moreover, although performance increases on bigger pitches, if the size is too large the outputs could be reduced.
COVAL, Compound Probability Distribution for Function of Probability Distribution
International Nuclear Information System (INIS)
Astolfi, M.; Elbaz, J.
1979-01-01
1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
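A sampling-based counterpart of COVAL's numerical transformation, applied to its stated use case (reliability of a structure subject to random loads); the normal load and strength models and their parameters are illustrative assumptions, and for normals the exact answer is available as a check:

```python
import random
from statistics import NormalDist

def failure_probability_mc(mu_s, sd_s, mu_l, sd_l, n=200_000, seed=0):
    """Monte Carlo estimate of P(strength < load) for independent normal
    strength S and load L; a sampling analogue of computing the
    distribution of the function M = S - L and evaluating P(M < 0)."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(mu_s, sd_s) < rng.gauss(mu_l, sd_l)
                for _ in range(n))
    return fails / n

# For normals, M = S - L is normal, so the exact failure probability is
# Phi(-(mu_s - mu_l) / sqrt(sd_s**2 + sd_l**2)), a handy cross-check.
p_mc = failure_probability_mc(mu_s=50.0, sd_s=5.0, mu_l=35.0, sd_l=4.0)
p_exact = NormalDist().cdf(-(50.0 - 35.0) / (5.0 ** 2 + 4.0 ** 2) ** 0.5)
```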
DEFF Research Database (Denmark)
Maibach, Julia; Younesi, Reza; Schwarzburger, Nele
2014-01-01
The formation of surface and interface layers at the electrodes is highly important for the performance and stability of lithium ion batteries. To unravel the surface composition of electrode materials, photoelectron spectroscopy (PES) is highly suitable as it probes chemical surface and interface properties with high surface sensitivity. Additionally, by using synchrotron-generated hard x-rays as the excitation source, larger probing depths compared to in-house PES can be achieved. Therefore, the combination of in-house soft x-ray photoelectron spectroscopy and hard x-ray photoelectron spectroscopy...
Weekes, Michael P; Antrobus, Robin; Talbot, Suzanne; Hör, Simon; Simecek, Nikol; Smith, Duncan L; Bloor, Stuart; Randow, Felix; Lehner, Paul J
2012-03-02
The endoplasmic reticulum chaperone gp96 is required for the cell surface expression of a narrow range of proteins, including toll-like receptors (TLRs) and integrins. To identify a more comprehensive repertoire of proteins whose cell surface expression is dependent on gp96, we developed plasma membrane profiling (PMP), a technique that combines SILAC labeling with selective cell surface aminooxy-biotinylation. This approach allowed us to compare the relative abundance of plasma membrane (PM) proteins on gp96-deficient versus gp96-reconstituted murine pre-B cells. Analysis of unfractionated tryptic peptides initially identified 113 PM proteins, which extended to 706 PM proteins using peptide prefractionation. We confirmed a requirement for gp96 in the cell surface expression of certain TLRs and integrins and found a marked decrease in cell surface expression of four members of the extended LDL receptor family (LDLR, LRP6, Sorl1 and LRP8) in the absence of gp96. Other novel gp96 client proteins included CD180/Ly86, important in the B-cell response to lipopolysaccharide. We highlight common structural motifs in these client proteins that may be recognized by gp96, including the beta-propeller and leucine-rich repeat. This study therefore identifies the extended LDL receptor family as an important new family of proteins whose cell surface expression is regulated by gp96.
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
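The crux of the debate is which conditioning scenario is meant. A quick Monte Carlo sketch (an illustration, not from the article) shows how the two readings give different answers:

```python
import random

rng = random.Random(0)
# Simulate two-child families with equally likely boy/girl children.
families = [(rng.choice("BG"), rng.choice("BG")) for _ in range(200_000)]

# Reading 1: we learn "at least one of the two children is a girl".
at_least_one_girl = [f for f in families if "G" in f]
p_both_1 = sum(f == ("G", "G") for f in at_least_one_girl) / len(at_least_one_girl)

# Reading 2: we learn "the first (a specific) child is a girl".
first_is_girl = [f for f in families if f[0] == "G"]
p_both_2 = sum(f == ("G", "G") for f in first_is_girl) / len(first_is_girl)

# p_both_1 converges to 1/3, p_both_2 to 1/2 -- the ambiguity in the
# underlying scenario, not the arithmetic, drives the controversy.
```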
Data descriptions are provided at the following URLs:
GADEP continuous PM2.5 mass concentration data: https://aqs.epa.gov/aqsweb/documents/data_mart_welcome.html and https://www3.epa.gov/ttn/amtic/files/ambient/pm25/qa/QA-Handbook-Vol-II.pdf
VIIRS Day Night Band SDR (SVDNB): http://www.class.ngdc.noaa.gov/saa/products/search?datatype_family=VIIRS_SDR
MODIS Terra Level 2 water vapor profiles (infrared algorithm for atmospheric profiles for both day and night, MOD07_L2): http://modis-atmos.gsfc.nasa.gov/MOD07_L2/index.html
NWS surface meteorological data: https://www.ncdc.noaa.gov/isd
This dataset is associated with the following publication: Wang, J., C. Aegerter, and J. Szykman. Potential Application of VIIRS Day/Night Band for Monitoring Nighttime Surface PM2.5 Air Quality From Space. Atmospheric Environment, Elsevier Science Ltd, New York, NY, USA, 124: 55-63, (2016).
Introduction to probability with R
Baclawski, Kenneth
2008-01-01
Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable...
DEFF Research Database (Denmark)
Bottoli, Federico; Christiansen, Thomas Lundin; Winther, Grethe
2016-01-01
The present work deals with the evaluation of residual stress profiles in expanded austenite by applying grazing-incidence X-ray diffraction (GI-XRD) combined with successive sublayer removal. Annealed and deformed (εeq = 0.5) samples of stable stainless steel EN 1.4369 were nitrided or nitrocarburized. The residual stress profiles resulting from the thermochemical low-temperature surface treatment were measured. The results indicate high residual compressive stresses of several GPa in the nitrided region, while lower compressive stresses are produced in the carburized case. Plastic deformation in the steel prior to thermochemical treatment has a hardly measurable influence on the nitrogen-rich zone, while it has a measurable effect on the stresses and depth of the carbon-rich zone.
Directory of Open Access Journals (Sweden)
Edvaldo Leal de Moraes
2009-10-01
This study aimed to characterize donors according to gender, age group and cause of brain death; to quantify donors with hypernatremia, hyperpotassemia and hypopotassemia; and to identify which organs were most used in transplantations. This quantitative, descriptive, exploratory and retrospective study was performed at the Organ Procurement Organization of the University of São Paulo Medical School Hospital das Clínicas. Data from the medical records of 187 potential donors were analyzed. Cerebrovascular accidents represented 53.48% of all brain death causes, sodium and potassium disorders occurred in 82.36% of cases, and 45.46% of the potential donors were between 41 and 60 years old. The results evidenced that natural causes of death exceeded traumatic deaths, and that most donors presented sodium and potassium alterations, likely associated with inappropriate maintenance.
Ross, Sheldon
2014-01-01
A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.
Rad, Farshid Mashayekhy; Leck, Caroline; Ilag, Leopold L; Nilsson, Ulrika
2018-03-09
Fatty acids are enriched in the ocean surface microlayer (SML) and have consequently been detected worldwide in sea spray aerosols. In the search for a relationship between the properties of atmospheric aerosol and its ability to form cloud condensation nuclei and to promote cloud droplet formation over remote marine areas, the role of surface-active fatty acids sourced from the SML is of interest. Here, a fast method is presented for profiling major fatty acids in SML samples collected in the high Arctic (89°N, 1°W) in the summer of 2001. UHPLC/travelling-wave ion mobility spectrometry (TWIMS)/time-of-flight (TOF) mass spectrometry (MS) for profiling was evaluated and compared with UHPLC/TOFMS. No sample preparation except evaporation and centrifugation was necessary prior to analysis. TOFMS data on accurate mass, isotopic ratios and fragmentation patterns enabled identification of the fatty acids. The TWIMS dimension added selectivity by extensive reduction of the noise level, and the entire UHPLC/TWIMS/TOFMS method provided fast profiling of the acids, ranging from C8 to C24. Hexadecanoic and octadecanoic acids yielded the highest signals among the fatty acids detected in a high Arctic SML sample, followed by the unsaturated octadecenoic and octadecadienoic acids. The predominance of signal from even-numbered carbon chains indicates a mainly biogenic origin of the detected fatty acids. This study presents a fast alternative method for screening and profiling of fatty acids, which has the advantage of not requiring complicated sample preparation, thus limiting the loss of analytes. Almost no manual handling, together with the very small sample volumes needed, is beneficial for the determination of trace amounts and should open up the field of applications to include atmospheric aerosol and fog.
International Nuclear Information System (INIS)
Zhang Yuzhong; Deng Shuxing; Liu Yanan; Shen Guofeng; Li Xiqing; Cao Jun; Wang Xilong; Reid, Brian; Tao Shu
2011-01-01
Air-soil exchange is an important process governing the fate of polycyclic aromatic hydrocarbons (PAHs). A novel passive air sampler was designed and tested for measuring the vertical concentration profile of four low-molecular-weight gaseous-phase PAHs (PAH-LMW4) in near-surface air. Air at heights from 5 to 520 mm above the ground was sampled by polyurethane foam disks held in downward-facing cartridges. The samplers were tested at three sites: A, an extremely contaminated site; B, a site near A; and C, a background site on a university campus. Vertical concentration gradients were revealed for PAH-LMW4 within a thin layer close to the soil surface at all three sites. PAH concentrations either decreased (site A) or increased (sites B and C) with height, suggesting either deposition to or evaporation from soils. The sampler is a useful tool for investigating air-soil exchange of gaseous-phase semi-volatile organic chemicals. Research highlights: design, field test and calibration of the novel passive air sampler PAS-V-I; vertical concentration gradients of PAH-LMW4 within a thin layer close to soil; comparison of PAS-V-I measurements with the fugacity approach; potential applications of PAS-V-I and further modifications.
Hauchecorne, Alain; Keckhut, Philippe; Mariscal, Jean-François; d'Almeida, Eric; Dahoo, Pierre-Richard; Porteneuve, Jacques
2016-06-01
A concept for an innovative rotational Raman lidar with daylight measurement capability is proposed to measure the vertical profile of temperature from the ground to the middle stratosphere. The optical filtering is performed with a Fabry-Pérot interferometer whose line spacing equals that of the Raman spectrum. Detection uses a linear PMT array operated in photon-counting mode. We plan to build a prototype and test it at the Haute-Provence Observatory lidar facility, to achieve a time resolution permitting the observation of small-scale atmospheric processes, such as gravity waves, that play a role in troposphere-stratosphere interaction. If successful, this project will open the possibility of a Raman space lidar for global observation of atmospheric temperature profiles.
DEFF Research Database (Denmark)
Goutam, Shovon; Timmermans, Jean-Marc; Omar, Noshin
2014-01-01
This experimental work attempts to determine the surface temperature evolution of large (20 Ah rated capacity) commercial lithium-ion pouch cells for use as rechargeable energy storage in plug-in hybrid electric vehicles and electric vehicles. The cathode of the cells is nickel...
Sanz, J M; Saiz, J M; González, F; Moreno, F
2011-07-20
In this research, the polar decomposition (PD) method is applied to experimental Mueller matrices (MMs) measured on two-dimensional microstructured surfaces. Polarization information is expressed through a set of parameters with an easier physical interpretation. It is shown that, by evaluating the first derivative of the retardation parameter δ, a clear indication of the presence of defects either built on or dug into the scattering flat surface (a silicon wafer in our case) can be obtained. Although the rule of thumb thus obtained is established through PD, it can be easily implemented in conventional surface polarimetry. These results constitute an example of the capabilities of the PD approach to MM analysis and show a direct application in surface characterization. © 2011 Optical Society of America
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from several vessels in a world wide distribution from February 24,...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from multiple vessels in a world wide distribution from June 29, 1994 to...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from several vessels in a world wide distribution from March 1, 1996 to...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts in a world wide distribution by several vessels from August 11, 1996 to...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from several vessels in a world wide distribution from September 30,...
National Oceanic and Atmospheric Administration, Department of Commerce — Temperature profile data were collected using surface seawater intake, bucket, and XBT casts from several vessels in a world wide distribution from September 19,...
A brief introduction to probability.
Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio
2018-02-01
The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the distribution most relevant to statistical analysis.
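As a concrete illustration of a probability distribution (a generic sketch, not from the paper), the normal cumulative distribution function can be evaluated with nothing more than the standard error function:

```python
from math import erf, sqrt

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for X ~ N(mu, sigma^2), expressed via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

# Probability mass within one standard deviation of the mean (~68.3%).
within_1sd = normal_cdf(1) - normal_cdf(-1)

# P(X <= 180) under a hypothetical N(170, 10^2) model, e.g. heights in cm.
p_le_180 = normal_cdf(180, mu=170, sigma=10)  # = Phi(1) ~ 0.841
```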
Propensity, Probability, and Quantum Theory
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well-established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM, but the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
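The Law of Large Numbers invoked above is easy to illustrate empirically (a generic sketch, unrelated to the propensity formalism itself): the empirical frequency of an event approaches its probability as trials accumulate.

```python
import random

def frequency_gap(p=0.3, n=100_000, seed=42):
    """Run n Bernoulli(p) trials and return |empirical frequency - p|;
    the Law of Large Numbers says this gap shrinks as n grows."""
    rng = random.Random(seed)
    hits = sum(rng.random() < p for _ in range(n))
    return abs(hits / n - p)

gap_small = frequency_gap(n=100)      # noisy estimate from few trials
gap_large = frequency_gap(n=100_000)  # gap shrinks roughly like 1/sqrt(n)
```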
Prediction and probability in sciences
International Nuclear Information System (INIS)
Klein, E.; Sacquin, Y.
1998-01-01
This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)
Applied probability and stochastic processes
Sumita, Ushio
1999-01-01
Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...
Yamani, Laura Navika; Yano, Yoshihiko; Utsumi, Takako; Wasityastuti, Widya; Rinonce, Hanggoro Tri; Widasari, Dewiyani Indah; Juniastuti; Lusida, Maria Inge; Soetjipto; Hayashi, Yoshitake
2017-11-22
Mutations in the reverse transcriptase (RT) region of the hepatitis B virus (HBV) genome are an important factor in low therapeutic effectiveness. Nonetheless, the prevalence of these mutations in HBV strains isolated previously in Indonesia has not been systematically examined. Therefore, in this study, we investigated the profile of mutations in the RT region and the associations of these mutations with amino acid changes in the surface protein of the virus in treatment-naïve Indonesian HBV carriers. Overall, 96 sequences of full-length Indonesian HBV genomes (genotype B, n = 54; genotype C, n = 42) were retrieved from the National Center for Biotechnology Information. Naturally occurring primary and/or compensatory drug resistance mutations were found in 6/54 (11.1%) genotype B strains and in 1/42 (2.4%) genotype C strains. Potential nucleos(t)ide analog resistance mutations and/or pretreatment mutations were more frequent in both genotypes, and more common in genotype C than in genotype B strains. The A-B interdomain region of the RT gene was more frequently mutated in genotype C than in genotype B (3.51 ± 2.53 vs. 1.08 ± 1.52, P < 0.001). Knowledge of the mutational profiles of the RT gene and changes in the surface protein may help clinicians select the most appropriate antiviral drug and vaccination or HBV immunoglobulin regimen for the management of HBV infection in Indonesia.
Energy Technology Data Exchange (ETDEWEB)
Kimori, Yoshitaka [Division of Biomolecular Imaging, Institute of Medical Science, The University of Tokyo, Minato-ku, Tokyo 108-8639 (Japan); Department of Bioscience and Bioinformatics, Kyushu Institute of Technology, Iizuka, Fukuoka 820-8502 (Japan)]; Oguchi, Yosuke [Department of Electric Engineering, Kogakuin University, Hachioji, Tokyo 192-0015 (Japan)]; Ichise, Norihiko [Department of Visual Communication, Komazawa Women's University, Inagi, Tokyo 206-8511 (Japan)]; Baba, Norio [Department of Electric Engineering, Kogakuin University, Hachioji, Tokyo 192-0015 (Japan)]; Katayama, Eisaku [Division of Biomolecular Imaging, Institute of Medical Science, The University of Tokyo, Minato-ku, Tokyo 108-8639 (Japan)]. E-mail: ekatayam@ims.u-tokyo.ac.jp
2007-01-15
Quick-freeze deep-etch replica electron microscopy gives high-contrast snapshots of individual protein molecules under physiological conditions in vitro or in situ. The images show delicate internal patterns, possibly reflecting the rotary-shadowed surface profile of the molecule. As a step toward building a new system for the 'structural analysis of single molecules', we propose a procedure to quantitatively characterize the structural properties of individual molecules, e.g. the conformational type and precise view angle of the molecules, when the crystallographic structure of the target molecule is available. This paper presents a framework to determine the observed face of a protein molecule by analyzing the surface profile of individual molecules visualized in freeze-replica specimens. A comprehensive set of rotary-shadowed views of the protein molecule was artificially generated from the available atomic coordinates using light-rendering software. Exploiting a new mathematical morphology-based image filter, characteristic features were extracted from each image and stored as templates. Similar features were extracted from the true replica image, and the most likely projection angle and conformation of the observed particle were determined by quantitative comparison with the set of archived images. The performance and robustness of the procedure were examined with the myosin head structure in a defined configuration for actual application.
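The matching step of the proposed procedure — comparing features extracted from an observed particle against an archive of rendered views — can be sketched with normalized cross-correlation. The feature vectors and view angles below are made up for illustration; the paper's actual features come from a mathematical-morphology image filter.

```python
import math

def correlate(a, b):
    """Normalized cross-correlation between two equal-length feature vectors."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_view(observed, template_bank):
    """Return the view angle whose archived feature template best
    matches the features extracted from the observed particle."""
    return max(template_bank, key=lambda angle: correlate(observed, template_bank[angle]))

# Hypothetical feature templates for three projection angles (degrees).
bank = {0: [1, 4, 9, 4, 1], 45: [1, 2, 3, 2, 1], 90: [9, 4, 1, 4, 9]}
angle = best_view([1.1, 3.9, 9.2, 4.0, 0.9], bank)  # closest to the 0-degree view
```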
Institute of Scientific and Technical Information of China (English)
秦大河; 任贾文
1992-01-01
Along a 5986 km route across the Antarctic ice sheet from west to east, 106 snow pits with depths ranging from 1.0 to 2.0 m were dug by the first author of this paper, the Chinese member of the 1990 International Trans-Antarctic Expedition. The basic physical characteristics of the surface layer of the ice sheet on a large scale were obtained through observations of snow profiles at these snow pits. The sastrugi shapes and major-axis azimuths were also observed or measured along the way. Analysis of these observational data shows that in West Antarctica meltwater infiltration-congelation is evident and the annual precipitation is larger than in East Antarctica, which implies that the climate in West Antarctica is warmer, more humid and more strongly influenced by the Southern Ocean than that in East Antarctica. Radiation ice glazes frequently found in snow profiles indicate that, even in East Antarctica under very low temperatures, surface "melting" occurs in summer due to long-time solar radiation.
Birch, Stacy L.
2016-01-01
The purpose of the present study was to identify and characterize surface and phonological subgroups of readers among college students with a prior diagnosis of developmental reading disability (RD). Using a speeded naming task derived from Castles and Coltheart's subtyping study, we identified subgroups of readers from among college students with…
Poisson Processes in Free Probability
An, Guimei; Gao, Mingchu
2015-01-01
We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...
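Free Poisson distributions can be observed numerically: by the Marchenko-Pastur theorem, the eigenvalue distribution of a Wishart-type random matrix converges to the free Poisson law. The sketch below is a standard illustration under that identification, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(0)
p, lam = 400, 2.0                       # matrix size and free Poisson rate
n = int(p * lam)
X = rng.standard_normal((p, n))
eig = np.linalg.eigvalsh(X @ X.T / p)   # Wishart-type spectrum, n/p = lam

mean_eig = eig.mean()                   # free Poisson(lam) has mean lam
edge_lo = (1 - np.sqrt(lam)) ** 2       # theoretical support edges of the
edge_hi = (1 + np.sqrt(lam)) ** 2       # free Poisson / Marchenko-Pastur law
```

As p grows, the histogram of `eig` fills out the free Poisson(λ) density supported on [(1 − √λ)², (1 + √λ)²].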
Sreelash, K.; Sekhar, M.; Ruiz, L.; Tomer, S. K.; Guérif, M.; Buis, S.; Durand, P.; Gascuel-Odoux, C.
2012-08-01
Summary: Estimation of soil parameters by inverse modeling using observations of either surface soil moisture or crop variables has been successfully attempted in many studies, but difficulties in estimating root-zone properties arise when heterogeneous layered soils are considered. The objective of this study was to explore the potential of combining observations of surface soil moisture and crop variables (leaf area index (LAI) and above-ground biomass) for estimating soil parameters (water holding capacity and soil depth) in a two-layered soil system using inversion of the crop model STICS. This was performed with the GLUE method on a synthetic data set covering varying soil types and on a data set from a field experiment carried out in two maize plots in South India. The main results were: (i) the combination of surface soil moisture and above-ground biomass provided consistently good estimates, with small uncertainty, of soil properties for the two soil layers over a wide range of soil parameter values, in both the synthetic and the field experiment; (ii) above-ground biomass gave relatively better estimates and lower uncertainty than LAI when combined with surface soil moisture, especially for estimating soil depth; (iii) surface soil moisture data, either alone or combined with crop variables, provided a very good estimate of the water holding capacity of the upper soil layer with very small uncertainty, whereas surface soil moisture alone gave very poor estimates of the properties of the deeper layer; and (iv) crop variables alone (either above-ground biomass or LAI) provided reasonable estimates of the deeper-layer properties, depending on the soil type, but poor estimates of the first-layer properties. The robustness of combining observations of surface soil moisture and above-ground biomass for estimating two-layer soil properties, which was demonstrated using both synthetic and field experiments in this study, needs now to
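The GLUE (Generalized Likelihood Uncertainty Estimation) inversion used above can be sketched in a few lines: sample parameter sets from a prior, score each against observations with an informal likelihood, and retain the best-scoring "behavioral" sets. The one-parameter exponential "soil" model below is a made-up toy, not STICS:

```python
import math
import random

def glue(observed, simulate, prior_sampler, n=5000, keep_frac=0.1, seed=0):
    """Minimal GLUE-style sketch: sample parameters from a prior, score
    each set by squared error against observations, and keep the
    best-scoring ('behavioral') sets to characterize uncertainty."""
    rng = random.Random(seed)
    scored = []
    for _ in range(n):
        theta = prior_sampler(rng)
        err = sum((simulate(theta, x) - y) ** 2 for x, y in observed)
        scored.append((err, theta))
    scored.sort(key=lambda pair: pair[0])
    return [theta for _, theta in scored[: int(n * keep_frac)]]

# Toy model: moisture decays exponentially with depth x at rate theta.
# Synthetic observations generated with the "true" rate 0.5.
observed = [(x, math.exp(-0.5 * x)) for x in (0.1, 0.5, 1.0, 2.0)]
behavioral = glue(observed,
                  simulate=lambda th, x: math.exp(-th * x),
                  prior_sampler=lambda rng: rng.uniform(0.0, 2.0))
theta_hat = sum(behavioral) / len(behavioral)  # behavioral-set mean, near 0.5
```

The spread of `behavioral` plays the role of the parameter uncertainty band reported in GLUE studies.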
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
International Nuclear Information System (INIS)
Kameda, Yutaka; Kimura, Kumiko; Miyazaki, Motonobu
2011-01-01
Sun-blocking agents, including eight UV filters (UVF) and 10 UV light stabilizers (UVLS), were measured in water and sediment collected from 22 rivers, four sewage treatment plant effluents (STPE) and three lakes in Japan. Total sun-blocking agent levels ranged from N.D. to 4928 ng/L in surface water and from 2.0 to 3422 μg/kg dry wt in sediment. Benzyl salicylate, benzophenone-3, 2-ethylhexyl-4-methoxycinnamate (EHMC) and octyl salicylate were dominant in surface water receiving wastewater effluents and in STPE, whereas UV-328, benzophenone and EHMC were dominant in other surface waters except at background sites. Three UVF and nine UVLS were observed in all sediments, with similar composition patterns in which UV-328 and UV-234 were the most prevalent compounds. Homosalate, octocrylene, UV-326, UV-327, UV-328 and UV-234 were significantly correlated with Galaxolide in sediments. Concentrations of UV-327 and UV-328 were also strongly correlated with those of UV-326 in sediment. Highlights: total sun-blocking agent levels ranged from N.D. to 4928 ng/L in surface water from 29 sampling sites; the maximum concentration of total sun-blocking agents in sediment was 3422 μg/kg dry wt; residential wastewaters and STPE were considered potential sources of UVLS in rivers and lakes; most sun-blocking agents in sediment were significantly correlated with HHCB; UV-326 was strongly correlated with both UV-327 and UV-328 in all sediments. Occurrence of eight UV filters and 10 UV light stabilizers in surface water and sediment was investigated, and their compositions in water and sediment were characterized.
Quinete, Natalia; Wu, Qian; Zhang, Tao; Yun, Se Hun; Moreira, Isabel; Kannan, Kurunthachalam
2009-10-01
Despite the concern over widespread distribution of perfluorinated compounds (PFCs) even in sparsely populated regions of the world, few studies have reported their occurrence in South America. In this study, PFCs were measured in Rio de Janeiro State in southeast Brazil: in drinking water from various districts in the State, in river water and tucuxi dolphins from the Paraiba do Sul River, several species of fish from the State, and mussels from Guanabara Bay. Liver, kidney, and muscle from fishes were analyzed to enable an understanding of the tissue distribution of PFCs. PFOS, PFOA, and PFHxS were detected in all drinking water samples in concentration ranges of 0.58-6.70, 0.35-2.82, and 0.15-1.00 ng L(-1), respectively. The profiles of PFCs in drinking water from Brazil (with PFOS concentrations comparable to or higher than those of PFOA) were different from the profiles that have been reported for other countries. In fish, concentrations of PFOS were, in general, higher in liver than in muscle. Concentrations of PFOA in livers of fish were similar to or lower than fish muscle tissue concentrations. PFOS and PFOA were found in brown mussels from Guanabara Bay. Bioconcentration factors (BCFs) of PFOA calculated for mussels were higher than the BCFs calculated for fishes. Elevated concentrations of PFUnDA (mean: 109 ± 17.4 ng g(-1) wet weight) were found in mussels from certain locations within Guanabara Bay. Although PFCs were detected in all types of samples analyzed, the concentrations were generally lower than the concentrations reported for Japan and the USA.
Directory of Open Access Journals (Sweden)
Aleksander VORON’KO
2008-01-01
Full Text Available In the mid-1990s, the wear of wheel flanges and of the lateral rail surfaces on railways in the CIS countries sharply worsened, a problem that came to be known as the «rail plague». On some lines, lateral rail wear reached 1 mm per 10⁶ tons carried, exceeding the normative wear rate several times over, while the mileage of wheel pairs between re-profilings due to flange undercut was reduced by a factor of 3-5 [5]. The article considers several ways of reducing wheel and rail wear, and proposes a technique for modeling the change of wear parameters.
International Nuclear Information System (INIS)
Fujii, A; Hayashi, S; Fujii, S; Yanagi, K
2014-01-01
This paper deals with the functional performance of optical surface texture measuring instruments on the market. It is well known that their height response curves against certain referential geometries are not always identical to each other, so a more precise study of the optical instruments' characteristics is greatly needed. Firstly, we developed a new simulation tool using a finite-difference time-domain technique, which enables the prediction of the height response curve against fundamental surface geometry in the case of the confocal laser scanning microscope. Secondly, by utilizing this new simulation tool, measurement results, including outliers, were compared with the analytical simulation results. The comparison showed consistency, which indicates that the necessary conditions of surface measurement standards for verifying instrument performance can be established. Consequently, we suggest that the maximum measurable slope angle must be added to the evaluation subjects as a significant metrological characteristic of measuring instruments, along with the lateral period limit. Finally, we propose a procedure to determine the lateral period limit in an ISO standard. (paper)
Luna, Gian Marco; Corinaldesi, Cinzia; Rastelli, Eugenio; Danovaro, Roberto
2013-10-01
We investigated the patterns and drivers of bacterial α- and β-diversity, along with viral and prokaryotic abundance and carbon production rates, in marine surface and subsurface sediments (down to 1 m depth) in two habitats: vegetated sediments (seagrass meadow) and non-vegetated sediments. Prokaryotic abundance and production decreased with depth in the sediment, but cell-specific production rates and the virus-to-prokaryote ratio increased, highlighting unexpectedly high activity in the subsurface. The highest diversity was observed in vegetated sediments. Bacterial β-diversity between sediment horizons was high, and only a small number of taxa were shared between surface and subsurface layers. Viruses contributed significantly to explaining the α- and β-diversity patterns. Despite potential limitations due to the use of fingerprinting techniques only, this study indicates that the coastal subsurface hosts highly active and diversified bacterial assemblages, that subsurface cells are more active than expected, and that viruses promote β-diversity and stimulate bacterial metabolism in subsurface layers. The limited number of taxa shared between habitats, and between surface and subsurface sediment horizons, suggests that future investigations of the shallow subsurface will provide insights into the census of bacterial diversity and the comprehension of the patterns and drivers of prokaryotic diversity in marine ecosystems. © 2013 John Wiley & Sons Ltd and Society for Applied Microbiology.
Probability inequalities for decomposition integrals
Czech Academy of Sciences Publication Activity Database
Agahi, H.; Mesiar, Radko
2017-01-01
Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Invariant probabilities of transition functions
Zaharopol, Radu
2014-01-01
The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...
Introduction to probability with Mathematica
Hastings, Kevin J
2009-01-01
Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...
Linear positivity and virtual probability
International Nuclear Information System (INIS)
Hartle, James B.
2004-01-01
We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
Directory of Open Access Journals (Sweden)
Matyunina Lilya V
2009-12-01
Full Text Available Abstract Background Accumulating evidence suggests that somatic stem cells undergo mutagenic transformation into cancer initiating cells. The serous subtype of ovarian adenocarcinoma in humans has been hypothesized to arise from at least two possible classes of progenitor cells: the ovarian surface epithelia (OSE and/or an as yet undefined class of progenitor cells residing in the distal end of the fallopian tube. Methods Comparative gene expression profiling analyses were carried out on OSE removed from the surface of normal human ovaries and ovarian cancer epithelial cells (CEPI isolated by laser capture micro-dissection (LCM from human serous papillary ovarian adenocarcinomas. The results of the gene expression analyses were randomly confirmed in paraffin embedded tissues from ovarian adenocarcinoma of serous subtype and non-neoplastic ovarian tissues using immunohistochemistry. Differentially expressed genes were analyzed using gene ontology, molecular pathway, and gene set enrichment analysis algorithms. Results Consistent with multipotent capacity, genes in pathways previously associated with adult stem cell maintenance are highly expressed in ovarian surface epithelia and are not expressed or expressed at very low levels in serous ovarian adenocarcinoma. Among the over 2000 genes that are significantly differentially expressed, a number of pathways and novel pathway interactions are identified that may contribute to ovarian adenocarcinoma development. Conclusions Our results are consistent with the hypothesis that human ovarian surface epithelia are multipotent and capable of serving as the origin of ovarian adenocarcinoma. While our findings do not rule out the possibility that ovarian cancers may also arise from other sources, they are inconsistent with claims that ovarian surface epithelia cannot serve as the origin of ovarian cancer initiating cells.
Energy Technology Data Exchange (ETDEWEB)
Cefalas, A.C., E-mail: ccefalas@eie.gr [National Hellenic Research Foundation, Theoretical and Physical Chemistry Institute, 48 Vassileos Constantinou Avenue, Athens 11635 (Greece); Kollia, Z.; Spyropoulos-Antonakakis, N.; Gavriil, V. [National Hellenic Research Foundation, Theoretical and Physical Chemistry Institute, 48 Vassileos Constantinou Avenue, Athens 11635 (Greece); Christofilos, D.; Kourouklis, G. [Physics Division, School of Technology, Aristotle University of Thessaloniki, Thessaloniki 54124 (Greece); Semashko, V.V.; Pavlov, V. [Kazan Federal University, Institute of Physics, 18 Kremljovskaja str., Kazan 420008 (Russian Federation); Sarantopoulou, E. [National Hellenic Research Foundation, Theoretical and Physical Chemistry Institute, 48 Vassileos Constantinou Avenue, Athens 11635 (Greece); Kazan Federal University, Institute of Physics, 18 Kremljovskaja str., Kazan 420008 (Russian Federation)
2017-02-28
Highlights: • The work links the surface morphology of amorphous semiconductors with both their electric-thermal properties and current stability at the nanoscale (<1 μm). • A high correlation is measured between the surface morphological spatial gradient and the conductive electron energy spatial gradient or thermal gradient. • Unidirectional current stability is associated with asymmetric nanodomains along nanosize conductive paths. • Bidirectional current stability is inherent to either long conductive paths or nanosize conductive paths along symmetric nanodomains. • Conclusion: Surface design improves current stability across nanoelectronic junctions. - Abstract: A link between the morphological characteristics and the electric properties of amorphous layers is established by means of atomic, conductive, electrostatic force and thermal scanning microscopy. Using an amorphous Ta₂O₅ (a-Ta₂O₅) semiconductive layer, it is found that surface profile gradients (morphological gradients) are highly correlated with both the energy gradient of electrons trapped in interactive Coulombic sites and the thermal gradient along conductive paths; thermal and electric properties are thus correlated with surface morphology at the nanoscale. Furthermore, morphological and electron energy gradients along opposite conductive paths of electrons intrinsically impose a current stability anisotropy. For either long conductive paths (L > 1 μm) or paths along symmetric nanodomains, current stability is demonstrated for both positive and negative currents i. On the contrary, for short conductive paths along non-symmetric nanodomains, the set of independent variables (L, i) is spanned by two current stability/instability loci: one locus specifies a stable state for negative currents, while the other describes a stable state for positive currents.
Syed, Ahmed Rashid
Among the great physical challenges faced by current front-end semiconductor equipment manufacturers is the accurate and repeatable surface temperature measurement of wafers during various fabrication steps. Close monitoring of temperature is essential in that it ensures that desirable device characteristics are reliably reproduced across wafer lots. Nowhere is the need to control temperature more pronounced than during Rapid Thermal Processing (RTP), which involves temperature ramp rates in excess of 200°C/s. This dissertation presents an elegant and practical approach to the wafer surface temperature estimation problem, in the context of RTP, by deploying hardware that acquires the necessary data while preserving the integrity and purity of the wafer. In contrast to the widely used wafer-contacting (and hence contaminating) methods, such as bonded thermocouples, or environment-sensitive schemes, such as light pipes and infrared pyrometry, the proposed research explores the concept of utilizing Lamb (acoustic) waves to detect changes in wafer surface temperature during RTP. Acoustic waves are transmitted to the wafer via an array of quartz rods that normally props the wafer inside an RTP chamber. These waves are generated using piezoelectric transducers affixed to the bases of the quartz rods. The group velocity of Lamb waves traversing the wafer surface decreases monotonically with rising wafer temperature. The correspondence between the phase delay of the received Lamb waves and the ambient temperature, along all direct paths between sending and receiving transducers, yields a pseudo real-time thermal image of the wafer. Although the custom-built hardware setup implements the above "proof-of-concept" scheme by transceiving acoustic signals at a single frequency, the real-world application will seek to enhance the data acquisition rate (>1000 temperature measurements per second) by sending and receiving Lamb waves at multiple frequencies (by
Energy Technology Data Exchange (ETDEWEB)
Malak, M.; Marty, F.; Bourouina, T. [Universite Paris-Est, Laboratoire ESYCOM, ESIEE Paris, Cite Descartes, 2 Boulevard Blaise Pascal, 93162 Noisy-le-Grand Cedex (France); Nouira, H.; Vailleau, G. [Laboratoire National de Metrologie et d' Essais, 1 rue Gaston Boissier, 75724 Paris Cedex 15 (France)
2013-04-08
A miniature Michelson interferometer is analyzed theoretically and experimentally. The fabricated micro-interferometer is incorporated at the tip of a monolithic silicon probe to achieve contactless distance measurements and surface profilometry. For infrared operation, two approaches are studied, based on the use of monochromatic light and wavelength sweep, respectively. A theoretical model is devised to depict the system characteristics taking into account Gaussian beam divergence and light spot size. Furthermore, preliminary results using visible light demonstrate operation of the probe as a visible light spectrometer, despite silicon absorbance, thanks to the micrometer thickness involved in the beam splitter.
Asker, Dalal
2018-09-30
Carotenoids are valuable natural colorants that exhibit numerous health-promoting properties and are thus widely used in the food, feed, pharmaceutical and nutraceutical industries. In this study, we isolated and identified novel microbial sources that produce high-value carotenoids using high-throughput screening (HTS). A library of 701 pigmented microbial strains, including marine bacteria and red yeasts, was constructed. Carotenoid profiling using HPLC-DAD-MS methods showed 88 marine bacterial strains with potential for the production of high-value carotenoids, including astaxanthin (28 strains), zeaxanthin (21 strains), lutein (1 strain) and canthaxanthin (2 strains). A comprehensive 16S rRNA gene-based phylogenetic analysis revealed that these strains can be classified into 30 species belonging to five bacterial classes (Flavobacteriia, α-Proteobacteria, γ-Proteobacteria, Actinobacteria and Bacilli). Importantly, we discovered novel producers of zeaxanthin and lutein, and a high diversity in both the carotenoids and the producing microbial strains, which are promising and highly selective biotechnological sources of high-value carotenoids. Copyright © 2018 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Suma George Mulamattathil
2014-01-01
Full Text Available The aim of this study was to isolate and identify environmental bacteria from various raw water sources as well as from the drinking water distribution system in Mafikeng, South Africa, and to determine their antibiotic resistance profiles. Water samples from five different sites (raw and drinking water) were analysed for the presence of faecal indicator bacteria as well as Aeromonas and Pseudomonas species. Faecal and total coliforms were detected in summer in the treated water samples from the Modimola dam and in the mixed water samples, with Pseudomonas spp. being the most prevalent organism. The most prevalent multiple antibiotic resistance phenotype observed was KF-AP-C-E-OT-K-TM-A. All organisms tested were resistant to erythromycin, trimethoprim, and amoxicillin. All isolates were susceptible to ciprofloxacin, and faecal coliforms and Pseudomonas spp. were also susceptible to neomycin and streptomycin. Cluster analysis based on inhibition zone diameter data suggests that the isolates had similar chemical exposure histories. Isolates were identified using gyrB, toxA, ecfX, aerA, and hylH gene fragments, of which the gyrB, ecfX, and hylH fragments were amplified. These results demonstrate that (i) the drinking water from Mafikeng contains various bacterial species and at times faecal and total coliforms, and (ii) the various bacteria are resistant to various classes of antibiotics.
Directory of Open Access Journals (Sweden)
Gomez-Mancilla Baltazar
2006-04-01
Full Text Available Abstract Cerebrospinal fluid (CSF) potentially carries an archive of peptides and small proteins relevant to pathological processes in the central nervous system (CNS) and surrounding brain tissue. Proteomics is especially well suited to the discovery of biomarkers of diagnostic potential in CSF for early diagnosis and discrimination of several neurodegenerative diseases. ProteinChip surface-enhanced laser-desorption/ionization time-of-flight mass spectrometry (SELDI-TOF-MS) is one such approach, offering a unique platform for high-throughput profiling of peptides and small proteins in CSF. In this study, we evaluated methodologies for the retention of CSF proteins; across the measured m/z range we found a high degree of overlap between the tested array surfaces. The combination of CM10 and IMAC30 arrays was sufficient to represent between 80% and 90% of all assigned peaks when using either sinapinic acid or α-cyano-4-hydroxycinnamic acid as the energy-absorbing matrix. Moreover, arrays processed with SPA consistently showed better peak resolution and a higher peak number across all surfaces within the measured mass range. We intend to use CM10 and IMAC30 arrays prepared in sinapinic acid as a fast and cost-effective approach to drive decisions on sample selection prior to more in-depth discovery of diagnostic biomarkers in CSF using alternative but complementary proteomic strategies.
Park, Ji-Won; Jeong, Hyobin; Kang, Byeongsoo; Kim, Su Jin; Park, Sang Yoon; Kang, Sokbom; Kim, Hark Kyun; Choi, Joon Sig; Hwang, Daehee; Lee, Tae Geol
2015-06-05
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) emerges as a promising tool to identify the ions (small molecules) indicative of disease states from the surface of patient tissues. In TOF-SIMS analysis, an enhanced ionization of surface molecules is critical to increase the number of detected ions. Several methods have been developed to enhance ionization capability. However, how these methods improve identification of disease-related ions has not been systematically explored. Here, we present a multi-dimensional SIMS (MD-SIMS) that combines conventional TOF-SIMS and metal-assisted SIMS (MetA-SIMS). Using this approach, we analyzed cancer and adjacent normal tissues first by TOF-SIMS and subsequently by MetA-SIMS. In total, TOF- and MetA-SIMS detected 632 and 959 ions, respectively. Among them, 426 were commonly detected by both methods, while 206 and 533 were detected uniquely by TOF- and MetA-SIMS, respectively. Of the 426 commonly detected ions, 250 increased in their intensities by MetA-SIMS, whereas 176 decreased. The integrated analysis of the ions detected by the two methods resulted in an increased number of discriminatory ions leading to an enhanced separation between cancer and normal tissues. Therefore, the results show that MD-SIMS can be a useful approach to provide a comprehensive list of discriminatory ions indicative of disease states.
Directory of Open Access Journals (Sweden)
Philip M. Giffard
2017-06-01
Full Text Available Background The microbiome of built environment surfaces is impacted by the presence of humans. In this study, we tested the hypothesis that analysis of surface swabs from clinic toilet/bathroom facilities yields results correlated with sexually transmitted infection (STI) notifications from the corresponding human populations. We extended a previously reported study in which surfaces in toilet/bathroom facilities in primary health clinics in the Australian Northern Territory (NT) were swabbed and then tested for nucleic acid from the STI agents Chlamydia trachomatis, Neisseria gonorrhoeae and Trichomonas vaginalis. This was in the context of assessing the potential for such nucleic acid to contaminate specimens collected in such facilities. STIs are notifiable in the NT, thus allowing comparison of swab and notification data. Methods An assumption in the design was that while absolute built-environment loads of STI nucleic acids will be a function of patient traffic density and facility cleaning protocols, the relative loads of STI nucleic acids from different species will be largely unaffected by these processes. Another assumption was that the proportion of swabs testing positive for STIs provides a measure of surface contamination. Accordingly, "STI profiles" were calculated: the proportions that each of the three STIs of interest contributed to the summed STI-positive swabs or notifications. Three comparisons were performed, using swab data from clinics in remote Indigenous communities, clinics in small-medium towns, and a single urban sexual health clinic. These data were compared with time- and place-matched STI notifications. Results There were significant correlations between swab and notification data for both the remote Indigenous and the regional data. For the remote Indigenous clinics the p values ranged from 0.041 to 0.0089, depending on data transformation and p value inference method. Further, the swab data appeared to strongly indicate
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
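The core idea of the abstract — that a consistent nonparametric learner's class-membership estimates can be read directly as individual probabilities — can be sketched as follows. This is an illustrative sketch in Python/scikit-learn, not the authors' R code; the synthetic data, forest parameters, and logistic ground truth are all assumptions made for the example.

```python
# Sketch of a "probability machine": a random forest's predict_proba
# is used as a consistent estimator of P(y=1 | x) for a binary response.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 3))
# Assumed ground truth: conditional probability logistic in feature 0 only.
p_true = 1.0 / (1.0 + np.exp(-2.0 * x[:, 0]))
y = rng.binomial(1, p_true)

x_tr, x_te, y_tr, y_te, p_tr, p_te = train_test_split(
    x, y, p_true, random_state=0)

# Large leaves smooth the vote fractions toward conditional probabilities.
rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20,
                            random_state=0)
rf.fit(x_tr, y_tr)
p_hat = rf.predict_proba(x_te)[:, 1]   # estimated individual probabilities

# Compare against the known truth (possible only with synthetic data).
mae = np.abs(p_hat - p_te).mean()
print(f"mean absolute error of probability estimates: {mae:.3f}")
```

Because the truth is synthetic here, the estimates can be scored directly; with real data one would instead check calibration, as the paper does on the appendicitis and Pima Indians data sets.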
Probable Inference and Quantum Mechanics
International Nuclear Information System (INIS)
Grandy, W. T. Jr.
2009-01-01
In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.
Failure probability under parameter uncertainty.
Gerrard, R; Tsanakas, A
2011-05-01
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
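The effect described above — a threshold estimated from data fails more often than its nominal level — can be checked with a small Monte Carlo sketch. This is an assumed log-normal setup for illustration, not the paper's derivation; the sample size, nominal level, and trial count are arbitrary choices.

```python
# Monte Carlo sketch: losses are log-normal; the decision-maker sets the
# threshold at the plug-in estimate of the 99th percentile from a small
# sample. Averaged over repeated samples, the realized failure frequency
# exceeds the nominal 1% because of parameter uncertainty. For this
# location-scale (on the log scale) family, the realized exceedance
# probability is a Student-t tail probability, independent of mu and sigma,
# matching the article's parameter-free claim.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, sigma = 0.0, 1.0        # true parameters, unknown to the decision-maker
eps = 0.01                  # nominal failure probability
n, trials = 20, 20000       # small data set -> noticeable parameter error
z = norm.ppf(1 - eps)

failures = 0
for _ in range(trials):
    sample = rng.lognormal(mu, sigma, size=n)
    m = np.log(sample).mean()
    s = np.log(sample).std(ddof=1)
    threshold = np.exp(m + z * s)        # plug-in quantile estimate
    loss = rng.lognormal(mu, sigma)      # next loss from the true law
    failures += loss > threshold

freq = failures / trials
print(f"nominal: {eps:.3f}, realized failure frequency: {freq:.4f}")
```

Rerunning with different true (mu, sigma) leaves the realized frequency essentially unchanged, which is the exact-calculability property the abstract emphasizes.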
International Nuclear Information System (INIS)
Pilyugin, L. S.; Grebel, E. K.; Zinchenko, I. A.; Kniazev, A. Y.
2014-01-01
The relations between oxygen abundance and disk surface brightness (OH–SB relation) in the infrared W1 band are examined for nearby late-type galaxies. The oxygen abundances were presented in Paper I. The photometric characteristics of the disks are inferred here using photometric maps from the literature through bulge-disk decomposition. We find evidence that the OH–SB relation is not unique but depends on the galactocentric distance r (taken as a fraction of the optical radius R25) and on the properties of a galaxy: the disk scale length h and the morphological T-type. We suggest a general, four-dimensional OH–SB relation with the values r, h, and T as parameters. The parametric OH–SB relation reproduces the observed data better than a simple, one-parameter relation; the deviations resulting when using our parametric relation are smaller by a factor of ∼1.4 than those of the simple relation. The influence of the parameters on the OH–SB relation varies with galactocentric distance. The influence of the T-type on the OH–SB relation is negligible at the centers of galaxies and increases with galactocentric distance. In contrast, the influence of the disk scale length on the OH–SB relation is at a maximum at the centers of galaxies and decreases with galactocentric distance, disappearing at the optical edges of galaxies. Two-dimensional relations can be used to reproduce the observed data at the optical edges of the disks and at the centers of the disks. The disk scale length should be used as the second parameter in the OH–SB relation at the center of the disk, while the morphological T-type should be used as the second parameter in the relation at the optical edge of the disk. The relations between oxygen abundance and disk surface brightness in the optical B and infrared K bands at the center of the disk and at the optical edge of the disk are also considered. The general properties of the abundance–surface brightness relations are similar for the three
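The kind of multi-parameter relation described above can be fitted by ordinary least squares. The sketch below assumes a simple linear form OH = c0 + c1·SB + c2·r + c3·h + c4·T on synthetic data; the paper's actual functional form, units, and coefficients are not reproduced here, and the chosen ranges are illustrative only.

```python
# Illustrative least-squares fit of a parametric abundance-surface
# brightness relation with r, h, and T as extra parameters.
import numpy as np

rng = np.random.default_rng(2)
n = 300
SB = rng.uniform(18, 24, n)          # surface brightness (assumed scale)
r = rng.uniform(0, 1, n)             # galactocentric distance / R25
h = rng.uniform(1, 6, n)             # disk scale length (assumed kpc range)
T = rng.integers(1, 9, n).astype(float)  # morphological T-type

# Assumed "true" coefficients used only to generate the synthetic data.
true_c = np.array([12.0, -0.15, -0.3, 0.05, -0.02])
X = np.column_stack([np.ones(n), SB, r, h, T])
OH = X @ true_c + rng.normal(0, 0.02, n)   # 12 + log(O/H) with scatter

coef, *_ = np.linalg.lstsq(X, OH, rcond=None)
resid = OH - X @ coef
print("fitted coefficients:", np.round(coef, 3))
print("rms deviation:", resid.std())
```

Comparing the rms deviation of this four-parameter fit against a fit using SB alone is the same kind of comparison the paper makes when quoting the factor of ∼1.4 improvement.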
Probability with applications and R
Dobrow, Robert P
2013-01-01
An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c
A philosophical essay on probabilities
Laplace, Marquis de
1996-01-01
A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application
Imanaka, Hiroyuki; Tanaka, Soukichi; Feng, Bin; Imamura, Koreyoshi; Nakanishi, Kazuhiro
2010-03-01
We cultivated a filamentous fungus, Aspergillus oryzae IAM 2706, by three different cultivation methods, i.e., shaking-flask culture (SFC), agar-plate culture (APC), and membrane-surface liquid culture (MSLC), to elucidate how its behavior differs among cultivation methods under the same media, by measuring the growth, secretion of proteases and alpha-amylase, secreted protein level, and gene transcriptional profile by DNA microarray analysis. The protease activities detected in MSLC and APC were much higher than that in SFC, using both modified Czapek-Dox (mCD) and dextrin-peptone-yeast extract (DPY) media. The alpha-amylase activity was detected in MSLC and APC to a much larger extent than in SFC when DPY medium was used. On the basis of SDS-PAGE analyses and N-terminal amino acid sequences, 6 proteins were identified in the supernatants of the culture broths using DPY medium, among which oryzin (alkaline protease) and alpha-amylase were detected to a much higher extent for APC and MSLC than for SFC, while only oryzin was detected in mCD medium, in accordance with the activity measurements. A microarray analysis of the fungi cultivated by SFC, APC, and MSLC using mCD medium was carried out to elucidate the differences in the gene transcriptional profiles among the cultivation methods. The gene transcriptional profile obtained for the MSLC sample showed a similar tendency to the APC sample, while it was quite different from that for the SFC sample. Most of the genes specifically transcribed in the MSLC sample versus the SFC sample with a 10-fold up-regulation or higher were unknown or predicted proteins. However, transcription of the oryzin gene was only slightly up-regulated in the MSLC sample, and that of the alpha-amylase gene was slightly down-regulated. Copyright 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
С. А. Игнатьев
2016-08-01
The paper addresses the issue of creating an environment favorable for life in megacities by planting vegetation on rooftops. It also provides information about rooftop greening practices adopted in other countries. The issues of 'green roof' construction in the climatic conditions of Saint Petersburg and the impact of roof vegetation on the urban ecosystem are examined. A qualitative and quantitative vegetation composition has been proposed for the roof under research, and a 3D model of this roof reflecting its geometric properties has been developed. The structure of the roof covering and the qualitative composition of the substrate are presented. The effect of rooftop geometry on the substrate temperature is explored. The annual substrate temperature and moisture content in different parts of the roof have been analyzed. Results of thermal imaging monitoring and insolation modelling for different parts of the green roof surface are presented.
Matuła, Rafał; Lewińska, Paulina
2018-01-01
This paper revolves around a newly designed and constructed system that can make 2D seismic measurements in natural, subsoil conditions, and the role of land surveying in obtaining accurate results and linking them to 3D surface maps. A new type of land streamer, designed for shallow subsurface exploration, is described in this paper. In land seismic data acquisition, a vehicle tows a line of seismic cable lying on a construction called a streamer. The measurements of points and shots are taken while the line is stationary, arbitrarily placed on the seismic profile. The presented land streamer consists of 24 innovative gimballed 10 Hz geophones. It eliminates the need for hand 'planting' of geophones, reducing time and costs. With the use of current survey techniques, all data obtained with this instrument are transferred into 2D and 3D maps. This process is becoming more automatic.
Kim, Kyung Lock; Park, Kyeng Min; Murray, James; Kim, Kimoon; Ryu, Sung Ho
2018-05-23
Combinatorial post-translational modifications (PTMs), which can serve as dynamic "molecular barcodes", have been proposed to regulate distinct protein functions. However, studies of combinatorial PTMs on single protein molecules have been hindered by a lack of suitable analytical methods. Here, we describe erasable single-molecule blotting (eSiMBlot) for combinatorial PTM profiling. This assay is performed in a highly multiplexed manner and leverages the benefits of covalent protein immobilization, cyclic probing with different antibodies, and single-molecule fluorescence imaging. In particular, facile and efficient covalent immobilization on a surface using Cu-free click chemistry permits multiple rounds (>10) of antibody erasing/reprobing without loss of antigenicity. Moreover, cumulative detection of coregistered multiple data sets for immobilized single-epitope molecules, such as the HA peptide, can be used to increase the antibody detection rate. Finally, eSiMBlot enables direct visualization and quantitative profiling of combinatorial PTM codes at the single-molecule level, as we demonstrate by revealing novel phospho-codes of the ligand-induced epidermal growth factor receptor. Thus, eSiMBlot provides an unprecedentedly simple, rapid, and versatile platform for analyzing the vast number of combinatorial PTMs in biological pathways.
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Free probability and random matrices
Mingo, James A
2017-01-01
This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.
Introduction to probability and measure
Parthasarathy, K R
2005-01-01
According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.
Chandran, Parwathy; Riviere, Jim E; Monteiro-Riviere, Nancy A
2017-05-01
This study investigated the role of nanoparticle size and surface chemistry on biocorona composition and its effect on uptake, toxicity and cellular responses in human umbilical vein endothelial cells (HUVEC), employing 40 and 80 nm gold nanoparticles (AuNP) with branched polyethyleneimine (BPEI), lipoic acid (LA) and polyethylene glycol (PEG) coatings. Proteomic analysis identified 59 hard corona proteins among the various AuNP, revealing largely surface chemistry-dependent signature adsorbomes exhibiting human serum albumin (HSA) abundance. Size distribution analysis revealed the relative instability and aggregation inducing potential of bare and corona-bound BPEI-AuNP, over LA- and PEG-AuNP. Circular dichroism analysis showed surface chemistry-dependent conformational changes of proteins binding to AuNP. Time-dependent uptake of bare, plasma corona (PC) and HSA corona-bound AuNP (HSA-AuNP) showed significant reduction in uptake with PC formation. Cell viability studies demonstrated dose-dependent toxicity of BPEI-AuNP. Transcriptional profiling studies revealed 126 genes, from 13 biological pathways, to be differentially regulated by 40 nm bare and PC-bound BPEI-AuNP (PC-BPEI-AuNP). Furthermore, PC formation relieved the toxicity of cationic BPEI-AuNP by modulating expression of genes involved in DNA damage and repair, heat shock response, mitochondrial energy metabolism, oxidative stress and antioxidant response, and ER stress and unfolded protein response cascades, which were aberrantly expressed in bare BPEI-AuNP-treated cells. NP surface chemistry is shown to play the dominant role over size in determining the biocorona composition, which in turn modulates cell uptake, and biological responses, consequently defining the potential safety and efficacy of nanoformulations.
Joint probabilities and quantum cognition
International Nuclear Information System (INIS)
Acacio de Barros, J.
2012-01-01
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
Joint probabilities and quantum cognition
Energy Technology Data Exchange (ETDEWEB)
Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)
2012-12-18
In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
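A minimal, hypothetical illustration of the last point (our own toy example, not the paper's neural-oscillator model): three pairwise perfectly anti-correlated ±1-valued random variables are contextual in exactly this sense, because the constraint E[XY] = E[YZ] = E[XZ] = −1 cannot be satisfied by any joint distribution. Every outcome obeys xy·yz·xz = (xyz)² = +1, so the three products can never all be −1 simultaneously:

```python
from itertools import product

# Search for outcomes (x, y, z) that any joint distribution with
# E[XY] = E[YZ] = E[XZ] = -1 would have to be supported on.
# No such outcome exists, since x*y * y*z * x*z = (x*y*z)**2 = +1.
feasible = [(x, y, z) for x, y, z in product([-1, 1], repeat=3)
            if x * y == -1 and y * z == -1 and x * z == -1]
print(feasible)  # -> [] : empty support, hence no joint distribution exists
```

The empty search result is the whole argument: marginal and pairwise statistics can be perfectly well defined while no single joint probability distribution reproduces them.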
Default probabilities and default correlations
Erlenmaier, Ulrich; Gersbach, Hans
2001-01-01
Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher default correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
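The Merton mechanism underlying these results can be sketched in a few lines: the firm defaults when its asset value falls below the face value of its debt at maturity, and under geometric Brownian motion the default probability is Φ(−d₂). A minimal sketch, with all parameter values being illustrative assumptions rather than figures from the paper:

```python
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_default_probability(V0, D, mu, sigma, T):
    """P(V_T < D) when firm value V follows geometric Brownian motion:
    d2 = [ln(V0/D) + (mu - sigma^2/2) T] / (sigma sqrt(T)),  PD = Phi(-d2)."""
    d2 = (log(V0 / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return norm_cdf(-d2)

# Illustrative firm: assets 100, debt 70, 5% drift, 25% volatility, 1 year.
p = merton_default_probability(V0=100.0, D=70.0, mu=0.05, sigma=0.25, T=1.0)
print(round(p, 4))
```

Raising the asset volatility (or the leverage D/V0) raises the default probability, which is the lever through which higher default probabilities and higher default correlations move together in the paper's analysis.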
The Probabilities of Unique Events
2012-08-30
Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the
Probability Matching, Fast and Slow
Koehler, Derek J.; James, Greta
2014-01-01
A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...
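The expected-accuracy gap at the heart of this debate can be stated in one line. On a binary prediction task where one outcome occurs with probability p > 1/2, always guessing the more likely outcome ("maximizing") yields accuracy p, whereas guessing each outcome in proportion to its frequency ("probability matching") yields only p² + (1 − p)². A minimal sketch of the comparison:

```python
def matching_accuracy(p):
    """Expected accuracy when guesses mirror the outcome frequencies."""
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    """Expected accuracy when always guessing the more likely outcome."""
    return max(p, 1 - p)

# With p = 0.7, matching gives 0.49 + 0.09 = 0.58, maximizing gives 0.70.
p = 0.7
print(round(matching_accuracy(p), 2), maximizing_accuracy(p))
```

Matching is strictly worse than maximizing whenever p ≠ 1/2, which is why it is the standard exhibit in the "smart versus dumb" argument the abstract describes.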
Moghadas, Davood
2013-01-01
We theoretically investigated the effect of vapor flow on the drying front that develops in soils when water evaporates from the soil surface, and its effect on GPR data. The results suggest integrating the full-wave GPR model with a coupled water, vapor, and heat flow model to accurately estimate the soil hydraulic properties. We investigated the effects of a drying front that emerges below an evaporating soil surface on far-field ground-penetrating radar (GPR) data. First, we performed an analysis of the width of the drying front in soils with 12 different textures by using an analytical model. Then, we numerically simulated vertical soil moisture profiles that develop during evaporation for the soil textures. We performed the simulations using a Richards flow model that considers only liquid water flow and a model that considers coupled water, vapor, and heat flows. The GPR signals were then generated from the simulated soil water content profiles, taking into account the frequency dependency of apparent electrical conductivity and dielectric permittivity. The analytical approach indicated that the width of the drying front at the end of Stage I of evaporation was larger in silty soils than in other soil textures and smaller in sandy soils. We also demonstrated that the analytical estimate of the width of the drying front can be considered a proxy for the impact that a drying front could have on far-field GPR data. The numerical simulations led to the conclusion that vapor transport in soil resulted in S-shaped soil moisture profiles, which clearly influenced the GPR data. As a result, vapor flow needs to be considered when GPR data are interpreted in a coupled inversion approach. Moreover, the impact of vapor flow on the GPR data was larger for silty than for sandy soils. These effects on the GPR data provide promising perspectives regarding the use of radars for evaporation monitoring. © Soil Science Society of America 5585 Guilford Rd., Madison, WI
Probably not future prediction using probability and statistical inference
Dworsky, Lawrence N
2008-01-01
An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...
Wang, H. F.; Fratta, D.; Lancelle, C.; Ak, E. Ms; Lord, N. E.
2017-12-01
Monitoring traffic is important for many technical reasons. It allows for better design of future roads and assessment of the state of current roads. The number, size, weight, and speed of vehicles control the deterioration rate. Also, real-time information supplies data to intelligent information systems to help control traffic. Recently there have been studies looking at monitoring traffic seismically, as vibrations from traffic are not sensitive to weather and poor visibility. Furthermore, traffic noise can be used to image the S-wave velocity distribution in the near surface by capturing and interpreting Rayleigh and Love waves (Nakata, 2016; Zeng et al. 2016). The capability of DAS for high spatial sampling (1 m), temporal sampling (up to 10 kHz), and distributed sensing (tens of kilometers) allows for a closer look at traffic as it passes and at how the speed of a vehicle may change over the length of the array. The potential and difficulties of using DAS for these objectives were studied using two DAS arrays: one at Garner Valley in Southern California (a 700-meter array adjacent to CA Highway 74) and another at Brady Hot Springs, Nevada (an 8700-meter array adjacent to Interstate 80). These studies experimentally evaluated the use of DAS data for monitoring traffic and assessing the use of traffic vibration as non-localized sources for seismic imaging. DAS arrays should also be resilient to issues with lighting conditions that are problematic for video monitoring, and they may be sensitive to the weight of a vehicle. This study along a major interstate provides a basis for examining DAS' potential and limitations as a key component of intelligent highway systems.
Zhang, Yuzhong; Deng, Shuxing; Liu, Yanan; Shen, Guofeng; Li, Xiqing; Cao, Jun; Wang, Xilong; Reid, Brian; Tao, Shu
2011-03-01
Air-soil exchange is an important process governing the fate of polycyclic aromatic hydrocarbons (PAHs). A novel passive air sampler was designed and tested for measuring the vertical concentration profile of 4 low molecular weight PAHs in gaseous phase (PAH(LMW4)) in near soil surface air. Air at various heights from 5 to 520 mm above the ground was sampled by polyurethane foam disks held in down-faced cartridges. The samplers were tested at three sites: A: an extremely contaminated site, B: a site near A, and C: a background site on a university campus. Vertical concentration gradients were revealed for PAH(LMW4) within a thin layer close to soil surface at the three sites. PAH concentrations either decreased (Site A) or increased (Sites B and C) with height, suggesting either deposition to or evaporation from soils. The sampler is a useful tool for investigating air-soil exchange of gaseous phase semi-volatile organic chemicals. Copyright © 2010 Elsevier Ltd. All rights reserved.
Foundations of quantization for probability distributions
Graf, Siegfried
2000-01-01
Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
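As a concrete sketch of what a quantizer looks like in the simplest setting (a sample-based Lloyd iteration — our illustrative choice, not the book's asymptotic machinery): for the uniform distribution on [0, 1], the mean-squared-error-optimal n-level codebook places codepoints at (2i − 1)/(2n).

```python
import random

def lloyd_quantizer(samples, n, iters=50):
    """Approximate an n-level MSE-optimal quantizer by Lloyd iteration."""
    srt = sorted(samples)
    # Initialize codepoints at evenly spaced sample quantiles.
    codebook = [srt[(2 * i + 1) * len(srt) // (2 * n)] for i in range(n)]
    for _ in range(iters):
        # Assign each sample to its nearest codepoint...
        cells = [[] for _ in range(n)]
        for x in samples:
            i = min(range(n), key=lambda j: (x - codebook[j]) ** 2)
            cells[i].append(x)
        # ...then move each codepoint to the centroid of its cell.
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return codebook

random.seed(0)
uniform_samples = [random.random() for _ in range(4000)]
codebook = lloyd_quantizer(uniform_samples, 4)
# The optimal 4-level quantizer for Uniform[0, 1] has codepoints at
# (2i - 1)/8 = 0.125, 0.375, 0.625, 0.875; the iteration lands nearby.
print([round(c, 2) for c in codebook])
```

The asymptotics of the quantization error as n grows — the book's actual subject — describe how fast such codebooks drive the mean squared error to zero for absolutely continuous and singular distributions.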
Conditional probabilities in Ponzano-Regge minisuperspace
International Nuclear Information System (INIS)
Petryk, Roman; Schleich, Kristin
2003-01-01
We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes
International Nuclear Information System (INIS)
Glass, Lisa; Ferrarese, Laura; Cote, Patrick; Blakeslee, John P.; Chen, Chin-Wei; Jordan, Andres; Infante, Leopoldo; Peng, Eric; Mei, Simona; Tonry, John L.; West, Michael J.
2011-01-01
Although early observations with the Hubble Space Telescope (HST) pointed to a sharp dichotomy among early-type galaxies in terms of the logarithmic slope γ' of their central surface brightness profiles, several studies in the past few years have called this finding into question. In particular, recent imaging surveys of 143 early-type galaxies belonging to the Virgo and Fornax Clusters using the Advanced Camera for Surveys (ACS) on board HST have not found a dichotomy in γ', but instead a systematic progression from central luminosity deficit to excess relative to the inward extrapolation of the best-fitting global Sersic model. Given that earlier studies also found that the dichotomy persisted when analyzing the deprojected density profile slopes, we investigate the distribution of the three-dimensional luminosity density profiles of the ACS Virgo and Fornax Cluster Survey galaxies. Having fitted the surface brightness profiles with modified Sersic models, we then deproject the galaxies using an Abel integral and measure the inner slopes γ_3D of the resulting luminosity density profiles at various fractions of the effective radius R_e. We find no evidence of a dichotomy, but rather, a continuous variation in the central luminosity profiles as a function of galaxy magnitude. We introduce a parameter, Δ_3D, that measures the central deviation of the deprojected luminosity profiles from the global Sersic fit, showing that this parameter varies smoothly and systematically along the luminosity function.
Normal probability plots with confidence.
Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang
2015-01-01
Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
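A simulation-based sketch of the underlying idea (pointwise intervals only — the paper's construction yields genuinely *simultaneous* 1-α intervals, which are wider): simulate the order statistics of many standard-normal samples of the same size and take per-rank quantiles as the band for each plotted point.

```python
import random
from statistics import NormalDist

def pointwise_bands(n, reps=2000, alpha=0.05, seed=1):
    """Per-rank (1 - alpha) intervals for the order statistics of a
    standard-normal sample of size n, estimated by Monte Carlo."""
    random.seed(seed)
    by_rank = [[] for _ in range(n)]
    for _ in range(reps):
        for i, x in enumerate(sorted(random.gauss(0, 1) for _ in range(n))):
            by_rank[i].append(x)
    lo = int(reps * alpha / 2)
    hi = int(reps * (1 - alpha / 2)) - 1
    return [(s[lo], s[hi]) for s in (sorted(r) for r in by_rank)]

# For a truly normal population, the theoretical plotting positions
# Phi^{-1}((i + 0.5)/n) -- the "straight line" of the plot -- should sit
# inside the simulated bands.
n = 20
bands = pointwise_bands(n)
positions = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
print(all(lo < p < hi for (lo, hi), p in zip(bands, positions)))
```

Because these bands are pointwise, the chance that *all* n points fall inside them is below 1-α; widening them to achieve simultaneous coverage is exactly the problem the paper solves.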
Directory of Open Access Journals (Sweden)
Octavio Novaro
2012-01-01
We review ab initio studies based on quantum mechanics of the most important reaction mechanisms leading to the C–H, Si–H, and Ge–H bond breaking of methane, silane, and germane, respectively, by a metal atom in the lowest states in Cs symmetry: X (second excited state, first excited state, and ground state) + YH4 → H3XYH → H + XYH3 and XH + YH3, with X = Au, Zn, Cd, Hg, Al, and Ga, and Y = C, Si, and Ge. Important issues considered here are (a) the role that the occupation of the d-, s-, or p-shells of the metal atom plays in the interactions with a methane, silane, or germane molecule, (b) the role of either singlet or doublet excited states of the metals on the reaction barriers, and (c) the role of transition probabilities for different families of metals reacting with these gases, using the H–X–Y angle as a reaction coordinate. The breaking of the Y–H bond of YH4 is useful in the production of amorphous hydrogenated films, necessary in several fields of industry.
Weber, R. C.; Dimech, J. L.; Phillips, D.; Molaro, J.; Schmerr, N. C.
2017-12-01
Apollo 17's Lunar Seismic Profiling Experiment's (LSPE) primary objective was to constrain the near-surface velocity structure at the landing site using active sources detected by a 100 m-wide triangular geophone array. The experiment was later operated in "listening mode," and early studies of these data revealed the presence of thermal moonquakes - short-duration seismic events associated with terminator crossings. However, the full data set has never been systematically analyzed for natural seismic signal content. In this study, we analyze 8 months of continuous LSPE data using an automated event detection technique that has previously been applied successfully to the Apollo 16 Passive Seismic Experiment data. We detected 50,000 thermal moonquakes from three distinct event templates, representing impulsive, intermediate, and emergent onset of seismic energy, which we interpret as reflecting their relative distance from the array. Impulsive events occur largely at sunrise, possibly representing the thermal "pinging" of the nearby lunar lander, while emergent events occur at sunset, possibly representing cracking or slumping in more distant surface rocks and regolith. Preliminary application of an iterative event location algorithm to a subset of the impulsive waveforms supports this interpretation. We also perform 3D modeling of the lunar surface to explore the relative contribution of the lander, known rocks, and surrounding topography to the thermal state of the regolith in the vicinity of the Apollo 17 landing site over the course of the lunar diurnal cycle. Further development of both this model and the event location algorithm may permit definitive discrimination between different types of local diurnal events, e.g., lander noise, thermally-induced rock breakdown, or fault creep on the nearby Lee-Lincoln scarp. These results could place important constraints on both the contribution of seismicity to regolith production, and the age of young lobate scarps.
Femtosecond laser-induced surface wettability modification of polystyrene surface
Wang, Bing; Wang, XinCai; Zheng, HongYu; Lam, YeeCheong
2016-12-01
In this paper, we demonstrated a simple method to create either a hydrophilic or a hydrophobic surface. With femtosecond laser irradiation at different laser parameters, the water contact angle (WCA) on a polystyrene surface can be modified to either 12.7° or 156.2° from its original WCA of 88.2°. With properly spaced micro-pits created, the surface became hydrophilic, probably owing to the spread of the water droplets into the micro-pits, whereas with properly spaced micro-grooves created, the surface became rough and more hydrophobic. We investigated the effect of laser parameters on WCAs and analyzed the laser-treated surface roughness, profiles, and chemical bonds by surface profilometry, scanning electron microscopy (SEM), and X-ray photoelectron spectroscopy (XPS). For the laser-treated surface with low roughness, the polar (such as C—O, C=O, and O—C=O bonds) and non-polar (such as C—C or C—H bonds) groups were found to be responsible for the wettability changes, whereas for a rough surface, the surface roughness or surface topography played a more significant role in the changes of the surface WCA. The mechanisms involved in the laser surface wettability modification process were discussed.
Crosson, William L.; Laymon, Charles A.; Inguva, Ramarao; Schamschula, Marius; Caulfield, John
1998-01-01
' soil moisture under such conditions and even more difficult to apply such a value. Because of the non-linear relationships between near-surface soil moisture and other variables of interest, such as surface energy fluxes and runoff, mean soil moisture has little applicability at such large scales. It is for these reasons that the use of remote sensing in conjunction with a hydrologic model appears to be of benefit in capturing the complete spatial and temporal structure of soil moisture. This paper is Part I of a four-part series describing a method for intermittently assimilating remotely-sensed soil moisture information to improve performance of a distributed land surface hydrology model. The method, summarized in section II, involves the following components, each of which is detailed in the indicated section of the paper or subsequent papers in this series: Forward radiative transfer model methods (section II and Part IV); Use of a Kalman filter to assimilate remotely-sensed soil moisture estimates with the model profile (section II and Part IV); Application of a soil hydrology model to capture the continuous evolution of the soil moisture profile within and below the root zone (section III); Statistical aggregation techniques (section IV and Part II); Disaggregation techniques using a neural network approach (section IV and Part III); and Maximum likelihood and Bayesian algorithms for inversely solving for the soil moisture profile in the upper few cm (Part IV).
Probability theory a foundational course
Pakshirajan, R P
2013-01-01
This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.
VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS
Directory of Open Access Journals (Sweden)
Smirnov Vladimir Alexandrovich
2012-10-01
Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are determined so that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
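The core calculation described above, the probability that a zero-mean Gaussian displacement stays within a criterion, reduces to the error function. This is a minimal sketch; the RMS response and criterion values are illustrative assumptions, not the article's data.

```python
import math

# P(|x| < limit) for a zero-mean Gaussian displacement x with standard
# deviation sigma; the complement is the exceedance probability evaluated
# over the damping / natural-frequency parameter space in such analyses.

def prob_within_criterion(sigma, limit):
    """Probability that |x| < limit for x ~ N(0, sigma^2)."""
    return math.erf(limit / (sigma * math.sqrt(2.0)))

# illustrative: RMS response of 0.4 um against a 1.0 um vibration criterion
p_exceed = 1.0 - prob_within_criterion(sigma=0.4, limit=1.0)
print(f"{p_exceed:.4f}")
```

Sweeping `sigma` over the system-parameter space and requiring `p_exceed` below a target (0.04 in the article) is one way such an optimization could be framed.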
Approximation methods in probability theory
Čekanavičius, Vydas
2016-01-01
This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
Knowledge typology for imprecise probabilities.
Energy Technology Data Exchange (ETDEWEB)
Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)
2002-01-01
When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2011-01-01
A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d
Statistical probability tables CALENDF program
International Nuclear Information System (INIS)
Ribon, P.
1989-01-01
The probability tables serve two purposes: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.
Probability, statistics, and queueing theory
Allen, Arnold O
1990-01-01
This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit
Probability and Statistics: 5 Questions
DEFF Research Database (Denmark)
Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell.
Monte Carlo methods to calculate impact probabilities
Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.
2014-09-01
Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward
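The Monte Carlo "super-sizing" idea mentioned above can be illustrated with a toy calculation: count trajectories crossing the target's enlarged (Hill-sphere) cross-section, then score as impacts only those passing within the physical collision radius. The real method samples orbital elements; here impact points are drawn uniformly over the enlarged disc, so this is a sketch of the counting logic only, with made-up radii.

```python
import random

# Toy Monte Carlo: fraction of crossings of an enlarged disc of radius
# r_hill that also fall inside the collision disc of radius r_coll.
# For uniform sampling the expected fraction is (r_coll / r_hill)**2.

def impact_fraction(r_coll, r_hill, n_samples, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        # uniform point in the enlarged disc via rejection sampling
        while True:
            x = rng.uniform(-r_hill, r_hill)
            y = rng.uniform(-r_hill, r_hill)
            if x * x + y * y <= r_hill * r_hill:
                break
        if x * x + y * y <= r_coll * r_coll:
            hits += 1
    return hits / n_samples

est = impact_fraction(r_coll=0.1, r_hill=1.0, n_samples=200_000)
print(est)  # should be close to (0.1 / 1.0)**2 = 0.01
```

The advantage of this style of estimator, as the paper notes for the full orbital version, is that it involves no singular denominators: the estimate degrades gracefully (larger variance) rather than diverging near special geometries.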
International Nuclear Information System (INIS)
Chen, She-Jun; Feng, An-Hong; He, Ming-Jing; Chen, Man-Ying; Luo, Xiao-Jun; Mai, Bi-Xian
2013-01-01
Polybrominated diphenyl ethers (PBDEs) and alternative flame retardants were measured in surface sediments collected during 2009–2010 from the Pearl River Delta, southern China (a large manufacturing base for electronics/electrical products), to evaluate the influence of China's RoHS directive (adopted in 2006) on their environmental occurrence. The concentrations in sediments from different water systems ranged from 3.67 to 2520 ng/g (average of 17.1–588 ng/g) for PBDEs and from 0.22 to 5270 ng/g (average of 11.3–454 ng/g) for the alternative retardants. Although the PBDE levels have decreased significantly compared with those in sediments collected in 2002 in this region, the levels of alternative decabromodiphenyl ethane (DBDPE) have exceeded those of BDE209 (two predominant halogenated flame retardants (HFRs) in China) in the majority of sediments. This finding suggests a different contaminant pattern of HFRs in current sediments due to the replacement of the deca-BDE mixture with DBDPE in this region. In addition, sediment concentrations of discontinued PBDEs in the rural area are clearly elevated due to e-waste dismantling. The congener profiles of PBDEs in the current sediments (with more abundant lower-brominated congeners) differed substantially from those in 2002 and from the technical products, suggesting that biological or photolytic debromination of PBDEs may have occurred in the environment. - Highlights: ► PBDE levels in sediments have decreased substantially since China's RoHS directive. ► Contamination of novel DBDPE has exceeded that of deca-BDE in the PRD sediments. ► The congener profiles of PBDEs in the sediments have changed significantly. ► Significant biological or photolytic degradation of PBDEs may occur in the environment
Dynamic SEP event probability forecasts
Kahler, S. W.; Ling, A.
2015-10-01
The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
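The decay idea above can be sketched simply: as time passes after a flare with no SEP onset, rescale the initial probability by the fraction of historical flare-to-onset delay times still "in play". The delay list below is synthetic and the function is an assumption of mine; the paper derives its actual algorithm from the 1976-2014 NOAA event list.

```python
# Dynamic SEP probability sketch: P_d(t) = p0 * P(delay > t | event occurs),
# with the survival probability estimated from historical onset delays.

def dynamic_probability(p0, delays_hours, t_elapsed):
    """Decrease the initial event probability p0 as t_elapsed hours pass
    with no onset, using the empirical survival fraction of delays."""
    surviving = sum(1 for d in delays_hours if d > t_elapsed)
    return p0 * surviving / len(delays_hours)

delays = [2, 3, 4, 6, 8, 12, 18, 24, 36, 48]  # hypothetical onset delays (h)
for t in (0, 6, 24):
    print(t, dynamic_probability(0.5, delays, t))
```

With this toy list, a 50% forecast decays to 30% after 6 h and 10% after 24 h of no onset, which is the qualitative behavior a time-critical operator needs.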
Conditional Independence in Applied Probability.
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
Stretching Probability Explorations with Geoboards
Wheeler, Ann; Champion, Joe
2016-01-01
Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…
GPS: Geometry, Probability, and Statistics
Field, Mike
2012-01-01
It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…
Swedish earthquakes and acceleration probabilities
International Nuclear Information System (INIS)
Slunga, R.
1979-03-01
A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. The most severe Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we obtain an epicentral acceleration of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
2015-01-01
A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor...
Risk estimation using probability machines
2014-01-01
Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", shares the properties of the statistical machine from which it is derived. PMID:24581306
Probability and statistics: A reminder
International Nuclear Information System (INIS)
Clement, B.
2013-01-01
The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)
Nash equilibrium with lower probabilities
DEFF Research Database (Denmark)
Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1998-01-01
We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...
On probability-possibility transformations
Klir, George J.; Parviz, Behzad
1992-01-01
Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
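One standard transformation of the kind compared in such studies, often attributed to Dubois and Prade, can be stated concretely: with outcomes ordered by decreasing probability, each outcome's possibility is the tail sum of probabilities from its own rank onward. The sketch below is an assumed baseline, not necessarily one of the specific transformations in this paper.

```python
# Probability-to-possibility transformation: for p sorted so that
# p1 >= p2 >= ..., set pi_i = p_i + p_(i+1) + ... (tail sum). The most
# probable outcome gets possibility 1. Assumes p sums to 1 and has no ties.

def prob_to_poss(p):
    order = sorted(range(len(p)), key=lambda i: -p[i])
    poss = [0.0] * len(p)
    tail = 1.0
    for i in order:
        poss[i] = tail
        tail -= p[i]
    return poss

print(prob_to_poss([0.5, 0.3, 0.2]))  # -> [1.0, 0.5, 0.2] up to float rounding
```

This transformation preserves the ordering of outcomes and satisfies the consistency condition that possibility dominates probability, two of the second-order properties such comparisons examine.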
Spectral backward radiation profile
International Nuclear Information System (INIS)
Kwon, Sung Duck; Lee, Keun Hyun; Kim, Bo Ra; Yoon, Suk Soo
2004-01-01
The ultrasonic backward radiation profile is frequency-dependent when the incident region has a depth gradient of acoustical properties or multiple layers. Until now, we have measured the profiles at the principal frequencies of the transducers used, so that it was not easy to understand the change of the frequency components and spectrum of backward radiation from the profile. We tried to measure the spectral backward radiation profiles using the DFP (digital filter package) of a LeCroy DSO. The very big changes in the shape and pattern of the spectral backward radiation profiles lead to the conclusion that this new approach could be a very effective tool to evaluate the frequency-dependent surface area.
Johnson, K. S.; Plant, J. N.; Sakamoto, C.; Coletti, L. J.; Sarmiento, J. L.; Riser, S.; Talley, L. D.
2016-12-01
Sixty profiling floats with ISUS and SUNA nitrate sensors have been deployed in the Southern Ocean (south of 30 degrees S) as part of the SOCCOM (Southern Ocean Carbon and Climate Observations and Modeling) program and earlier efforts. These floats have produced detailed records of the annual cycle of nitrate concentration throughout the region from the surface to depths near 2000 m. In surface waters, there are clear cycles in nitrate concentration that result from uptake of nitrate during austral spring and summer. These changes in nitrate concentration were used to compute the annual net community production over this region. NCP was computed using a simplified version of the approach detailed by Plant et al. (2016, Global Biogeochemical Cycles, 30, 859-879, DOI: 10.1002/2015GB005349). At the time the abstract was written 41 complete annual cycles were available from floats deployed before the austral summer of 2015/2016. After filtering the data to remove floats that crossed distinct frontal boundaries, floats with other anomalies, and floats in sub-tropical waters, 23 cycles were available. A preliminary assessment of the data yields an NCP of 2.8 +/- 0.95 (1 SD) mol C/m2/y after integrating to 100 m depth and converting nitrate uptake to carbon using the Redfield ratio. This preliminary assessment ignores vertical transport across the nitracline and is, therefore, a minimum estimate. The number of cycles available for analysis will increase rapidly, as 32 of the floats were deployed in the austral summer of 2015/2016 and have not yet been analyzed.
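The conversion at the heart of the estimate above, from a nitrate drawdown integrated over the upper 100 m to annual net community production in carbon units, is simple arithmetic via the Redfield C:N ratio of 106:16. The drawdown value below is illustrative, not SOCCOM data.

```python
# Nitrate-based NCP sketch: integrate the spring-summer nitrate drawdown
# over the upper 100 m and convert to carbon with the Redfield C:N ratio.

REDFIELD_C_TO_N = 106.0 / 16.0  # = 6.625 mol C per mol N

def ncp_from_nitrate(delta_no3_mmol_m3, depth_m):
    """Annual NCP (mol C / m2 / yr) from a uniform nitrate drawdown
    delta_no3 (mmol N / m3) over the integration depth (m)."""
    mol_n_per_m2 = delta_no3_mmol_m3 * depth_m / 1000.0  # mmol -> mol
    return mol_n_per_m2 * REDFIELD_C_TO_N

print(ncp_from_nitrate(delta_no3_mmol_m3=4.0, depth_m=100.0))
```

A hypothetical drawdown of 4 mmol/m3 over 100 m gives about 2.65 mol C/m2/yr, the same order as the 2.8 mol C/m2/yr preliminary estimate quoted above; as the abstract notes, ignoring vertical nitrate transport makes any such figure a minimum.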
VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES
International Nuclear Information System (INIS)
G.A. Valentine; F.V. Perry; S. Dartevelle
2005-01-01
Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision
Energy Technology Data Exchange (ETDEWEB)
Ari Palczewski, Rongli Geng, Grigory Eremeev
2011-07-01
We designed and built two prototype high-resolution thermometry arrays (0.6-0.55 mm spatial resolution [1.1-1.2 mm separation]) out of Allen Bradley 90-120 ohm 1/8 watt resistors to measure surface temperature profiles on SRF cavities. One array was designed to be physically flexible and conform to any location on an SRF cavity; the other was modeled after the common G-10/Stycast 2850 thermometer and designed to fit on the equator of an ILC (TESLA 1.3 GHz) SRF cavity. We will discuss the advantages and disadvantages of each array and their construction. In addition, we will present a case study of the arrays' performance on a real SRF cavity, TB9NR001. TB9NR001 presented a unique opportunity to test the performance of each array, as it contained a dual (4 mm separation) cat-eye defect which conventional methods such as OST (Oscillating Superleak second-sound Transducers) and full-coverage thermometry mapping were unable to distinguish between. We will discuss the new arrays' ability to distinguish between the two defects and their preheating performance.
Large deviations and idempotent probability
Puhalskii, Anatolii
2001-01-01
In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
André C. R. Martins
2006-11-01
In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Probability matching and strategy availability.
Koehler, Derek J; James, Greta
2010-09-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
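The accuracy gap between the two strategies discussed above is easy to make concrete: in a binary prediction task where one outcome occurs with probability p, matching guesses each outcome at its base rate while maximizing always picks the more likely outcome. This worked example uses standard expected-accuracy formulas, not the experiments' data.

```python
# Expected accuracy of probability matching vs. maximizing when one of two
# outcomes occurs with probability p on each trial.

def matching_accuracy(p):
    """Guess outcome A with probability p, B with 1-p."""
    return p * p + (1.0 - p) * (1.0 - p)

def maximizing_accuracy(p):
    """Always guess the more likely outcome."""
    return max(p, 1.0 - p)

p = 0.75
print(matching_accuracy(p), maximizing_accuracy(p))  # 0.625 vs 0.75
```

At p = 0.75 the matcher is right 62.5% of the time and the maximizer 75%, which is why maximizing is the "superior alternative strategy" that participants fail to generate spontaneously.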
Probability as a Physical Motive
Directory of Open Access Journals (Sweden)
Peter Martin
2007-04-01
Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.
Logic, Probability, and Human Reasoning
2015-01-01
accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions ... [3-6] and they underlie mathematics, science, and technology [7-10]. Plato claimed that emotions upset reasoning. However, individuals in the grip ... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation
Probability Measures on Groups IX
1989-01-01
The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.
Probability matching and strategy availability
J. Koehler, Derek; Koehler, Derek J.; James, Greta
2010-01-01
Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...
California Natural Resource Agency — Beaches are commonly characterized by cross-shore surveys. The resulting profiles represent the elevation of the beach surface and nearshore seabed from the back of...
[Biometric bases: basic concepts of probability calculation].
Dinya, E
1998-04-26
The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.
Impact probabilities of meteoroid streams with artificial satellites: An assessment
International Nuclear Information System (INIS)
Foschini, L.; Cevolani, G.
1997-01-01
Impact probabilities of artificial satellites with meteoroid streams were calculated using data collected with the CNR forward scatter (FS) bistatic radar over the Bologna-Lecce baseline (about 700 km). Results show that impact probabilities are 2 times higher than previously calculated values. Nevertheless, although catastrophic impacts are still rare even in the case of meteor storm conditions, it is expected that high meteoroid fluxes can erode satellite surfaces and weaken their external structures.
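The flux-based reasoning behind such assessments can be sketched as follows: the expected number of hits is flux times exposed area times exposure time, and the probability of at least one impact follows from Poisson statistics. The flux, area, and duration below are illustrative assumptions, not the radar-derived values of the paper.

```python
import math

# Impact probability for a satellite exposed to a meteoroid flux, treating
# arrivals as a Poisson process: P(>=1 hit) = 1 - exp(-flux * area * time).

def impact_probability(flux_per_m2_per_s, area_m2, duration_s):
    expected_hits = flux_per_m2_per_s * area_m2 * duration_s
    return 1.0 - math.exp(-expected_hits)

# hypothetical storm-level flux on a 10 m^2 satellite over one hour
p = impact_probability(flux_per_m2_per_s=1e-6, area_m2=10.0, duration_s=3600.0)
print(f"{p:.6f}")
```

Even when this probability is small per satellite, the same expected-hits quantity governs the cumulative surface erosion the abstract warns about, since erosion scales with the total number of (mostly non-catastrophic) impacts.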
Probability for Weather and Climate
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of
Probability, Statistics, and Stochastic Processes
Olofsson, Peter
2012-01-01
This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
Sensitivity analysis using probability bounding
International Nuclear Information System (INIS)
Ferson, Scott; Troy Tucker, W.
2006-01-01
Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
Lectures on probability and statistics
International Nuclear Information System (INIS)
Yost, G.P.
1984-09-01
These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
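The a priori dice calculation these lectures open with can be made concrete with a short sketch; the function name and example are ours, not from the lecture notes:

```python
from fractions import Fraction
from itertools import product

def sum_probability(target, n_dice=2, sides=6):
    """A priori probability that `n_dice` fair dice sum to `target`,
    found by enumerating all equally likely outcomes."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    favorable = sum(1 for o in outcomes if sum(o) == target)
    return Fraction(favorable, len(outcomes))

# Two fair dice: 6 of the 36 equally likely outcomes sum to 7.
print(sum_probability(7))  # -> 1/6
```

The inverse, statistical problem the notes describe (inferring the dice's fairness from observed rolls) is, as the author says, much harder; the forward calculation above only needs the a priori symmetry assumption.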
Event Discrimination Using Seismoacoustic Catalog Probabilities
Albert, S.; Arrowsmith, S.; Bowman, D.; Downey, N.; Koch, C.
2017-12-01
Presented here are three seismoacoustic catalogs from various years and locations throughout Utah and New Mexico. To create these catalogs, we combine seismic and acoustic events detected and located using different algorithms. Seismoacoustic events are formed based on similarity of origin time and location. Following seismoacoustic fusion, the data is compared against ground truth events. Each catalog contains events originating from both natural and anthropogenic sources. By creating these seismoacoustic catalogs, we show that the fusion of seismic and acoustic data leads to a better understanding of the nature of individual events. The probability of an event being a surface blast given its presence in each seismoacoustic catalog is quantified. We use these probabilities to discriminate between events from natural and anthropogenic sources. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.
Probability theory a comprehensive course
Klenke, Achim
2014-01-01
This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...
Excluding joint probabilities from quantum theory
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.
Probability theory and mathematical statistics for engineers
Pugachev, V S
1984-01-01
Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector
Introduction to probability theory with contemporary applications
Helms, Lester L
2010-01-01
This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process
Litaker, J R; Pan, J; Cheung, Y; Zhang, D K; Liu, Y; Wong, S C; Wan, T S; Tsao, S W
1998-11-01
Senescence is a specific physiological stage of cells characterized by a long population doubling time. It accounts for the inability of normal somatic cells to undergo indefinite cell division. As the number of population doublings increases, cell cycle regulatory mechanisms come into play and signal cells to exit the cell cycle and become senescent. Senescence has been implicated in the aging process and may function as a tumor suppressor mechanism in human cells. The ability to measure the degree of cellular senescence is important in understanding the biological processes regulating cell aging and immortalization. Senescent cells express an enzyme, senescence-associated beta-galactosidase (SA-beta-Gal), that can be detected by histochemical staining. Cells immortalized by viral oncogenes often enter a stage of crisis at the early phase of immortalization. The cells at crisis have a long population doubling time. Cells at the crisis stage resemble senescent cells, and the expression of SA-beta-Gal may be used to monitor the process of immortalization. In this study the expression profile of SA-beta-Gal was examined in human ovarian surface epithelial cells (HOSE 6-3) undergoing immortalization by the human papillomavirus oncogenes E6 and E7 (HPV E6 and E7). Our results showed a low percentage (12.0%) of HOSE 6-3 cells expressing SA-beta-Gal activity at the pre-crisis stage. The percentage of HOSE 6-3 cells expressing SA-beta-Gal activity was highest (39.2%) at the crisis stage. When HOSE 6-3 cells achieved immortalized status there was a sharp decrease in cells (1.3%) expressing SA-beta-Gal activity. In addition, an inverse relationship between the expression of SA-beta-Gal activity and telomerase activity was noted in cells undergoing immortalization. The results confirm that SA-beta-Gal is a good marker for monitoring the population of cells undergoing senescence at different stages of immortalization and that telomerase activation is a characteristic feature of post-crisis cells.
K-forbidden transition probabilities
International Nuclear Information System (INIS)
Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki
2000-01-01
Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A≈180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)
Direct probability mapping of contaminants
International Nuclear Information System (INIS)
Rautman, C.A.
1993-01-01
Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
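The post-processing step described above, computing cell by cell the fraction of equally likely realizations that exceed a contamination threshold, can be sketched as follows; the toy realizations and threshold are hypothetical, not Fernald data:

```python
def exceedance_map(realizations, threshold):
    """Fraction of equally likely realizations in which each cell
    exceeds `threshold` -- a direct probability-of-contamination map."""
    n = len(realizations)
    rows, cols = len(realizations[0]), len(realizations[0][0])
    return [[sum(r[i][j] > threshold for r in realizations) / n
             for j in range(cols)] for i in range(rows)]

# Three toy 2x2 realizations of a contaminant level (hypothetical units)
sims = [
    [[10, 80], [40, 90]],
    [[20, 75], [35, 95]],
    [[15, 30], [45, 85]],
]
prob = exceedance_map(sims, threshold=50)
print(prob)  # -> [[0.0, 0.6666666666666666], [0.0, 1.0]]
```

In practice the stack would contain hundreds of geostatistical simulations rather than three, but the map is computed the same way: each cell's value is simply the proportion of realizations exceeding the regulatory level.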
Front Probability, NOAA GOES Imager, 0.05 degrees, Western Hemisphere, EXPERIMENTAL
National Oceanic and Atmospheric Administration, Department of Commerce — The data indicates the probability of oceanic sea surface temperature fronts off the California coast. They were created using remote sensing sea surface temperature...
Psychophysics of the probability weighting function
Takahashi, Taiki
2011-03-01
A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (with 0&lt;α&lt;1 and fixed points w(0)=0, w(1/e)=1/e, w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
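As an illustration, Prelec's function w(p) = exp(-(-ln p)^alpha) can be implemented directly. The alpha values below are illustrative; the fixed points w(0)=0, w(1/e)=1/e, w(1)=1 hold for any alpha > 0:

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) probability weighting function
    w(p) = exp(-(-ln p)**alpha) for 0 <= p <= 1, alpha > 0."""
    if p == 0:
        return 0.0
    return math.exp(-(-math.log(p)) ** alpha)

# 1/e is a fixed point for every alpha, since (-ln(1/e))**alpha = 1.
for a in (0.3, 0.65, 1.0):
    print(round(prelec_w(math.exp(-1), a), 6))  # -> 0.367879 each time
```

For alpha < 1 the curve is inverse-S shaped (small probabilities overweighted, large ones underweighted), which is the behavioral regularity prospect theory models.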
Modeling the probability distribution of peak discharge for infiltrating hillslopes
Baiamonte, Giorgio; Singh, Vijay P.
2017-07-01
Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small for any probability. For a set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed, aimed at testing whether runoff coefficient tables for the rational method can be prepared in advance, was carried out, together with a comparison between peak discharges obtained with the GABS model and those measured in an experimental flume for a loamy-sand soil.
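For reference, the Green-Ampt component of such a coupling rests on the implicit equation F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)) for cumulative infiltration under ponded conditions. A minimal fixed-point solver, with illustrative parameter values rather than the ones calibrated in the study, might look like:

```python
import math

def green_ampt_F(t, K=1.0, psi=10.0, dtheta=0.3, tol=1e-10):
    """Cumulative infiltration F(t) from the implicit Green-Ampt equation
    F = K*t + s*ln(1 + F/s) with s = psi*dtheta, solved by fixed-point
    iteration. Illustrative units: K [cm/h] hydraulic conductivity,
    psi [cm] wetting-front suction, dtheta [-] moisture deficit, t [h]."""
    s = psi * dtheta
    F = K * t if K * t > 0 else 1e-6  # initial guess
    while True:
        # g(F) = K*t + s*ln(1 + F/s) has |g'(F)| < 1, so this contracts.
        F_new = K * t + s * math.log(1.0 + F / s)
        if abs(F_new - F) < tol:
            return F_new
        F = F_new
```

The infiltration rate is then f = K*(1 + s/F), and Hortonian runoff is generated whenever the design rainfall intensity from the IDF relation exceeds f.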
Gas prices: realities and probabilities
International Nuclear Information System (INIS)
Broadfoot, M.
2000-01-01
An assessment of price trends suggests continuing rise in 2001, with some easing of upward price movement in 2002 and 2003. Storage levels as of Nov. 1, 2000 are expected to be at 2.77 Tcf, but if the winter of 2000/2001 proves to be more severe than usual, inventory levels could sink as low as 500 Bcf by April 1, 2001. With increasing demand for natural gas for non-utility electric power generation the major challenge will be to achieve significant supply growth, which means increased developmental drilling and inventory draw-downs, as well as more exploratory drilling in deepwater and frontier regions. Absence of a significant supply response by next summer will affect both growth in demand and in price levels, and the increased demand for electric generation in the summer will create a flatter consumption profile, erasing the traditional summer/winter spread in consumption, further intensifying price volatility. Managing price fluctuations is the second biggest challenge (after potential supply problems) facing the industry
THE BLACK HOLE FORMATION PROBABILITY
Energy Technology Data Exchange (ETDEWEB)
Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)
2015-02-01
A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
Probable leaching mechanisms for spent fuel
International Nuclear Information System (INIS)
Wang, R.; Katayama, Y.B.
1981-01-01
At the Pacific Northwest Laboratory, researchers in the Waste/Rock Interaction Technology Program are studying spent fuel as a possible waste form for the Office of Nuclear Waste Isolation. This paper presents probable leaching mechanisms for spent fuel and discusses current progress in identifying and understanding the leaching process. During the past year, experiments were begun to study the complex leaching mechanism of spent fuel. The initial work in this investigation was done with UO2, which provided the most information possible on the behavior of the spent-fuel matrix without encountering the very high radiation levels associated with spent fuel. Both single-crystal and polycrystalline UO2 samples were used for this study, and techniques applicable to remote experimentation in a hot cell are being developed. The effects of radiation are being studied in terms of radiolysis of water and surface activation of the UO2. Dissolution behavior and kinetics of UO2 were also investigated by electrochemical measurement techniques. These data will be correlated with those acquired when spent fuel is tested in a hot cell. Oxidation effects represent a major area of concern in evaluating the stability of spent fuel. Dissolution of UO2 is greatly increased in an oxidizing solution because the dissolution is then controlled by the formation of hexavalent uranium. In solutions containing very low oxygen levels (i.e., reducing solutions), oxidation-induced dissolution may be possible via a previously oxidized surface, through exposure to air during storage, or by local oxidants such as O2 and H2O2 produced from radiolysis of water and radiation-activated UO2 surfaces. The effects of oxidation not only increase the dissolution rate, but could lead to the disintegration of spent fuel into fine fragments
Foundations of the theory of probability
Kolmogorov, AN
2018-01-01
This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.
The Probability Distribution for a Biased Spinner
Foster, Colin
2012-01-01
This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
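The connection the article draws between probability and other areas of mathematics comes down to proportionality between a sector's arc and its probability; a minimal sketch (the sector sizes are our own example, not from the article):

```python
from fractions import Fraction

def spinner_probabilities(arc_degrees):
    """Probability of each side of a biased spinner, proportional to
    the arc (in degrees) allotted to that side."""
    total = sum(arc_degrees.values())
    return {side: Fraction(arc, total) for side, arc in arc_degrees.items()}

# A four-sided spinner whose sectors are deliberately unequal
probs = spinner_probabilities({1: 180, 2: 90, 3: 60, 4: 30})
print(probs[1], sum(probs.values()))  # -> 1/2 1
```

Using exact fractions keeps the link to angle measure visible: side 1 occupies 180/360 of the circle, so its probability is exactly 1/2.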
Conditional Probability Modulates Visual Search Efficiency
Directory of Open Access Journals (Sweden)
Bryan eCort
2013-10-01
We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
Analytic Neutrino Oscillation Probabilities in Matter: Revisited
Energy Technology Data Exchange (ETDEWEB)
Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT
2018-01-02
We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
Void probability scaling in hadron nucleus interactions
International Nuclear Information System (INIS)
Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima
2002-01-01
Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.
Pre-Service Teachers' Conceptions of Probability
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
Using Playing Cards to Differentiate Probability Interpretations
López Puga, Jorge
2014-01-01
The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
Dependent Human Error Probability Assessment
International Nuclear Information System (INIS)
Simic, Z.; Mikulicic, V.; Vukovic, I.
2006-01-01
This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies within the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. Some CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependence resulting in a CDF increase of > 5E-07 would be considered a potential candidate for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
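The NUREG/CR-1278 dependence levels mentioned above map an independent HEP p to a conditional HEP for the second of two dependent actions via the standard THERP equations. A sketch (the function name is ours; the formulas are as given in the handbook):

```python
def conditional_hep(p, level):
    """Conditional human error probability for the second of two
    dependent actions, given failure of the first, using the THERP
    dependence equations of NUREG/CR-1278 (p = independent HEP)."""
    formulas = {
        "ZD": lambda p: p,                  # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "HD": lambda p: (1 + p) / 2,        # high dependence
        "CD": lambda p: 1.0,                # complete dependence
    }
    return formulas[level](p)

# An action with independent HEP of 1e-3 under increasing dependence:
for lvl in ("ZD", "LD", "MD", "HD", "CD"):
    print(lvl, conditional_hep(1e-3, lvl))
```

Note how quickly dependence dominates: even "low" dependence raises a 1e-3 HEP to roughly 0.05, which is why the paper screens candidate action pairs against a CDF-increase threshold before hard-coding dependence into the baseline model.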
KMC Simulation of Surface Growth of Semiconductors
International Nuclear Information System (INIS)
Esen, M.
2004-01-01
In this work we have studied the growth and equilibration of semiconductor surfaces consisting of monoatomic steps separated by flat terraces using the kinetic Monte Carlo method. Atomistic processes such as diffusion on terraces, attachment/detachment of particles to/from step edges, attachment of particles from an upper terrace to a bounding step, and diffusion of particles along step edges are considered. A rate equation for each of these processes is written and the overall transition probabilities are calculated, with the processes ordered to make the distinction between slow and fast processes tractable. The interaction of steps is also included in the calculation of the rate equations. The growth of such a surface is simulated when there is a particle flux to the surface. The roughness of the surface and its dependence on both temperature and kinetic parameters such as the edge diffusion barrier are investigated. The formation of islands on terraces is prohibited, and the distribution of step numbers and sizes is investigated as a function of temperature and the appropriate kinetic parameters. In the absence of a flux to the surface, the equilibration of the surface is investigated, paying particular attention to the top of the profile, when the initial surface is a periodic profile of parallel monoatomic steps separated by terraces. It is observed that during equilibration of the profile the topmost step disintegrates quickly, leading to many islands on the top of the profile due to the collision and annihilation of step edges of opposite sign. The islands then quickly disintegrate due to the line-tension effect, and this scenario repeats itself until the surface completely flattens.
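The core of such a kinetic Monte Carlo scheme is selecting one atomistic process with probability proportional to its rate and advancing the clock by an exponentially distributed increment. A minimal rejection-free step, with hypothetical rates rather than the barriers used in the study, might look like:

```python
import math
import random
from bisect import bisect_left
from itertools import accumulate

def kmc_step(rates, rng):
    """One rejection-free kinetic Monte Carlo step: choose process i with
    probability rates[i]/sum(rates), then advance time by
    dt = -ln(u)/sum(rates) with u uniform on (0, 1)."""
    cumulative = list(accumulate(rates))
    total = cumulative[-1]
    i = bisect_left(cumulative, rng.random() * total)
    dt = -math.log(rng.random()) / total
    return i, dt

# Hypothetical rates: terrace diffusion, edge diffusion, detachment
rates = [100.0, 10.0, 0.1]
probs = [r / sum(rates) for r in rates]
print([round(p, 4) for p in probs])  # -> [0.9083, 0.0908, 0.0009]
```

The wide spread of rates is exactly the slow/fast separation the abstract refers to: the fast terrace-diffusion channel is chosen almost every step, while rare detachment events still occur with the correct relative frequency.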
Fundamentals of applied probability and random processes
Ibe, Oliver
2014-01-01
The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics.
Probability of Failure in Random Vibration
DEFF Research Database (Denmark)
Nielsen, Søren R.K.; Sørensen, John Dalsgaard
1988-01-01
Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
An Objective Theory of Probability (Routledge Revivals)
Gillies, Donald
2012-01-01
This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma
Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Directory of Open Access Journals (Sweden)
Juliana Bueno-Soler
2016-09-01
This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
Predicting non-square 2D dice probabilities
Pender, G. A. T.; Uhrin, M.
2014-07-01
The prediction of the final state probabilities of a general cuboid randomly thrown onto a surface is a problem that naturally arises in the minds of men and women familiar with regular cubic dice and the basic concepts of probability. Indeed, it was considered by Newton in 1664 (Newton 1967 The Mathematical Papers of Isaac Newton vol I (Cambridge: Cambridge University Press) pp 60-1). In this paper we make progress on the 2D problem (which can be realized in 3D by considering a long cuboid, or alternatively a rectangular cross-sectioned dreidel). For the two-dimensional case we suggest that the ratio of the probabilities of landing on each of the two sides is given by \frac{\sqrt{k^2+l^2}-k}{\sqrt{k^2+l^2}-l}\cdot\frac{\arctan(l/k)}{\arctan(k/l)}, where k and l are the lengths of the two sides. We test this theory both experimentally and computationally, and find good agreement between our theory and the experimental and computational results. Our theory is known, from its derivation, to be an approximation for particularly bouncy or 'grippy' surfaces where the die rolls through many revolutions before settling. On real surfaces we would expect (and we observe) that the true probability ratio for a 2D die is somewhat closer to unity than predicted by our theory. This problem may also have wider relevance in the testing of physics engines.
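The suggested probability ratio is straightforward to evaluate numerically; a small sketch of the formula as stated, with side labels following the text's k and l:

```python
import math

# Probability ratio for a 2D rectangular die with side lengths k and l,
# per the formula quoted in the text.
def side_ratio(k, l):
    h = math.hypot(k, l)  # diagonal length sqrt(k^2 + l^2)
    return ((h - k) / (h - l)) * (math.atan(l / k) / math.atan(k / l))

# a square cross-section must be fair
print(side_ratio(1.0, 1.0))  # -> 1.0
```

Note that swapping k and l inverts the ratio, as it must for a relabeling of the two sides.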
Spiga, D
2018-01-01
X-ray mirrors with high focusing performance are commonly used in different sectors of science, such as X-ray astronomy, medical imaging and synchrotron/free-electron laser beamlines. While deformations of the mirror profile may cause degradation of the focus sharpness, a deliberate deformation of the mirror can be made to endow the focus with a desired size and distribution, via piezo actuators. The resulting profile can be characterized with suitable metrology tools and correlated with the expected optical quality via a wavefront propagation code or, sometimes, predicted using geometric optics. In the latter case, and for the special class of profile deformations with monotonically increasing derivative, i.e. concave upwards, the point spread function (PSF) can even be predicted analytically. Moreover, under these assumptions, the relation can also be reversed: from the desired PSF the required profile deformation can be computed analytically, avoiding the use of trial-and-error search codes. However, the computation has so far been limited to geometric optics, which entailed some limitations: for example, mirror diffraction effects and the size of the coherent X-ray source were not considered. In this paper, the beam-shaping formalism is reviewed in the framework of physical optics, in the limit of small light wavelengths and in the case of Gaussian intensity wavefronts. Some examples of shaped profiles are also shown, aiming at turning a Gaussian intensity distribution into a top-hat one, and the shaping performance is checked by computing the at-wavelength PSF by means of the WISE code.
Hladíková, Radka
2010-01-01
Title: Data Profiling Author: Radka Hladíková Department: Department of Software Engineering Supervisor: Ing. Vladimír Kyjonka Supervisor's e-mail address: Abstract: This thesis focuses on problems of data quality and data profiling. This work analyses and summarizes problems of data quality, data defects, the data quality process, data quality assessment and data profiling. The main topic is data profiling as a process of researching data available in existing...
CSIR Research Space (South Africa)
Jafta, CJ
2011-07-01
Previous experimental investigations have only shown, without explanation, that the pre-exponential factor (D0), in the diffusion coefficient of Sb segregating in Cu, is dependent on the surface orientation of a crystal. In this study, the surface...
Computer Profiling Based Model for Investigation
Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh
2011-01-01
This paper examines computer profiling, which is used for computer forensic analysis, and proposes and elaborates on a novel model for use in computer profiling: the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...
Karolinska psychodynamic profile (KAPP)
DEFF Research Database (Denmark)
Mathiesen, Birgit Bork; Søgaard, Ulf
2006-01-01
Psychological test methods, assessment, Karolinska Psychodynamic Profile (KAPP), psychodynamic profile.
Probability concepts in quality risk management.
Claycamp, H Gregg
2012-01-01
Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and marketing to product discontinuation. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.
Contaminant concentrations are reported for surface water, sediment, seagrass, mangroves, Florida Crown conch, blue crabs and fish collected during 2010-2011 from the mangrove fringe along eastern Tampa Bay. Concentrations of trace metals, chlorinated pesticides, atrazine, total ...
Transition probability spaces in loop quantum gravity
Guo, Xiao-Kan
2018-03-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Towards a Categorical Account of Conditional Probability
Directory of Open Access Journals (Sweden)
Robert Furber
2015-11-01
This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
UT Biomedical Informatics Lab (BMIL) probability wheel
Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.
A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints, and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
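The maximum entropy assignment under an average-value constraint can be sketched generically: the weights are exponential in the constrained quantity, with the Lagrange multiplier found by bisection. This is an illustrative solver under that standard setup, not the paper's specific quantum construction:

```python
import math

# Maximum-entropy assignment over discrete outcomes subject to a fixed
# mean: p_i is proportional to exp(-beta * x_i), with beta found by
# bisection so that the constrained mean is met. A generic illustrative
# solver, not the paper's specific quantum construction.
def maxent_probs(values, target_mean):
    def mean_for(beta):
        weights = [math.exp(-beta * x) for x in values]
        z = sum(weights)
        return sum(w * x for w, x in zip(weights, values)) / z

    lo, hi = -50.0, 50.0  # bracket for the Lagrange multiplier
    for _ in range(100):  # bisection: mean_for is decreasing in beta
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2.0
    weights = [math.exp(-beta * x) for x in values]
    z = sum(weights)
    return [w / z for w in weights]

# four levels with values 0..3, constrained to mean occupation 1.0
probs = maxent_probs([0, 1, 2, 3], target_mean=1.0)
```

With equally spaced values this recovers the familiar Boltzmann-like exponential weighting, which is the sense in which quantum and classical distributions are related by the same assignment rule.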
Fundamentals of applied probability and random processes
Ibe, Oliver
2005-01-01
This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Highlights: a good and solid introduction to probability theory and stochastic processes; logically organized, with material presented in a clear manner; a comprehensive choice of topics within the area of probability; and ample homework problems organized into chapter sections.
Striatal activity is modulated by target probability.
Hon, Nicholas
2017-06-14
Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.
Defining Probability in Sex Offender Risk Assessment.
Elwood, Richard W
2016-12-01
There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
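The Bayesian view applied to the single case amounts to updating a base rate (the prior) by the likelihood ratio associated with an individual's actuarial score. A hypothetical numerical sketch; the base rate and likelihood ratio are invented for illustration, not taken from any actuarial scale:

```python
# Bayesian single-case update: combine a group base rate (the prior)
# with the likelihood ratio attached to an individual's actuarial score
# band. All numbers here are invented for illustration.
def posterior_risk(base_rate, likelihood_ratio):
    prior_odds = base_rate / (1.0 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# base rate of 10% and a score band carrying a likelihood ratio of 3
p = posterior_risk(0.10, 3.0)
print(round(p, 3))  # -> 0.25
```

Working in odds makes the single-case logic explicit: the prior odds of 1:9 are tripled by the evidence, giving posterior odds of 1:3, i.e. a 25% risk.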
National Oceanic and Atmospheric Administration, Department of Commerce — NODC Accession 0108084 includes chemical, discrete sample, meteorological, physical, profile and underway - surface data collected from MIRAI in the Coral Sea, North...
National Oceanic and Atmospheric Administration, Department of Commerce — NCEI Accession 0108123 includes Surface underway, discrete sample and profile data collected from MIRAI in the Bering Sea, North Pacific Ocean and South Pacific...
National Oceanic and Atmospheric Administration, Department of Commerce — NODC Accession 0108081 includes chemical, discrete sample, physical, profile and underway - surface data collected from MIRAI in the Bismarck Sea, North Pacific...
National Oceanic and Atmospheric Administration, Department of Commerce — NCEI Accession 0115019 includes Surface underway, chemical, discrete sample, meteorological, physical and profile data collected from THOMAS G. THOMPSON in the...
National Oceanic and Atmospheric Administration, Department of Commerce — NODC Accession 0115002 includes chemical, discrete sample, meteorological, physical, profile and underway - surface data collected from KNORR in the North Atlantic...
National Oceanic and Atmospheric Administration, Department of Commerce — NODC Accession 0115001 includes chemical, discrete sample, physical, profile and underway - surface data collected from METEOR in the North Atlantic Ocean from...
National Oceanic and Atmospheric Administration, Department of Commerce — NODC Accession 0115173 includes chemical, discrete sample, meteorological, physical, profile and underway - surface data collected from METEOR in the South Atlantic...
National Oceanic and Atmospheric Administration, Department of Commerce — NCEI Accession 0115157 includes Surface underway, discrete sample and profile data collected from MAURICE EWING in the North Atlantic Ocean and South Atlantic Ocean...
National Oceanic and Atmospheric Administration, Department of Commerce — NCEI Accession 0157461 includes Surface underway, chemical, discrete sample, meteorological, physical and profile data collected from PELICAN in the Coastal Waters...
The probability and the management of human error
International Nuclear Information System (INIS)
Dufey, R.B.; Saull, J.W.
2004-01-01
Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident causation. The consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best-estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error-state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε), and stochastic occurrences with a finite minimum rate: λ = 5×10⁻⁵ + ((1/ε) − 5×10⁻⁵) exp(−3ε). The future failure rate is entirely determined by the experience: thus the past defines the future.
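Reading the abstract's "5.10-5" as 5×10⁻⁵, the stated learning equation can be evaluated directly; a small sketch (ε is the accumulated experience, in whatever units the paper uses):

```python
import math

# The abstract's learning equation, reading "5.10-5" as 5e-5:
#   lambda(eps) = 5e-5 + (1/eps - 5e-5) * exp(-3 * eps)
# eps is the accumulated experience; the rate decays toward a finite
# minimum of 5e-5 as experience grows.
def human_error_rate(eps):
    lam_min = 5e-5
    return lam_min + (1.0 / eps - lam_min) * math.exp(-3.0 * eps)

for eps in (0.1, 1.0, 10.0):
    print(eps, human_error_rate(eps))
```

The two regimes are visible immediately: at small ε the 1/ε inexperience term dominates, while at large ε the rate flattens onto the minimum of 5×10⁻⁵.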
Spatial probability aids visual stimulus discrimination
Directory of Open Access Journals (Sweden)
Michael Druker
2010-08-01
We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high-probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high-probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid to visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
Measurement of probability distributions for internal stresses in dislocated crystals
Energy Technology Data Exchange (ETDEWEB)
Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)
2014-11-03
Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called "restricted second moment of the probability distribution" can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.
Is probability of frequency too narrow?
International Nuclear Information System (INIS)
Martz, H.F.
1993-01-01
Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed
International Nuclear Information System (INIS)
Komiya, S.; Mizuno, M.; Narusawa, T.; Maeda, H.; Yoshikawa, M.
1974-01-01
Preparation and evaluation of a clean Au film are investigated. The aim of the present work is the development of a preparation method for obtaining a clean surface on a copper shell in the JFT-2a (DIVA) TOKAMAK toroidal vacuum chamber. Au films prepared by ion plating and vacuum evaporation have been analysed by a cylindrical mirror Auger electron analyser in combination with a quadrupole mass spectrometer during 2 keV Xe ion bombardment from a sputter ion gun over the whole range of thickness of several microns. Contaminants are found to segregate on the top surface and at the interface. To expose a clean Au surface by ion bombardment, surface layers within 1000 A had to be removed from surfaces contaminated by touching with either a bare hand or a nylon glove, or covered by a small amount of Ti. Mutual diffusion across the interfaces is also analyzed as a function of the substrate temperature. A Nb sandwich layer effectively inhibits the mutual diffusion. (auth.)
International Nuclear Information System (INIS)
Watson, G S; Watson, J A
2008-01-01
In this paper we correlate the Atomic Force Microscope probe movement with surface location while scanning in the imaging and force-versus-distance modes. Static and dynamic stick-slip processes are described on a scale of nanometres to microns on a range of samples. We demonstrate the limits and range over which the tip apex remains laterally fixed in the force-versus-distance mode, and the dependence of the static friction slope on probe parameters. Micron-scale static and dynamic friction can be used to purposefully manipulate soft surfaces to produce well-defined frictional gradients.
Directory of Open Access Journals (Sweden)
Sharf Abdusalam M.
2014-03-01
In the oil and gas industries, understanding the behaviour of a flow through a vertical annular gap, whose outer wall is stationary whilst the inner wall rotates, is an important issue in the drilling of wells. The main emphasis is placed on experimental (using an available rig) and computational (employing CFD software) investigations into the effects of the rotation speed of the inner pipe on the axial velocity profiles. The measured axial velocity profiles, in the cases of low axial flow, show that the axial velocity is influenced by the rotation speed of the inner pipe in the region of almost 33% of the annulus near the inner pipe, and influenced inversely in the rest of the annulus. The position of the maximum axial velocity is shifted from the centre towards the inner pipe as the rotation speed increases. However, in the case of higher flow, as the rotation speed increases, the axial velocity is reduced and the position of the maximum axial velocity is skewed towards the centre of the annulus. There is a reduction of the swirl velocity corresponding to the rise of the volumetric flow rate.
Directory of Open Access Journals (Sweden)
H. Dupuis
1995-10-01
Full Text Available Heat flux estimates obtained using the inertial dissipation method, and the profile method applied to radiosonde soundings, are assessed with emphasis on the parameterization of the roughness lengths for temperature and specific humidity. Results from the inertial dissipation method show a decrease of the temperature and humidity roughness lengths for increasing neutral wind speed, in agreement with previous studies. The sensible heat flux estimates were obtained using the temperature estimated from the speed of sound determined by a sonic anemometer. This method seems very attractive for estimating heat fluxes over the ocean. However allowance must be made in the inertial dissipation method for non-neutral stratification. The SOFIA/ASTEX and SEMAPHORE results show that, in unstable stratification, a term due to the transport terms in the turbulent kinetic energy budget, has to be included in order to determine the friction velocity with better accuracy. Using the profile method with radiosonde data, the roughness length values showed large scatter. A reliable estimate of the temperature roughness length could not be obtained. The humidity roughness length values were compatible with those found using the inertial dissipation method.